Friday, 4 January 2019

Running AI Scripts on the Nvidia Jetson TK1 + Movidius Neural Compute Stick

So, what now that we have built our AI Franken-computer, free from wall sockets and fully battery mobile? I guess we should run some scripts on the Neural Compute Stick.

In this video we see the DIY AI computer we built running a script which captures a live wireless camera stream and detects objects in the video quickly and smoothly (video at 30fps, latency under 0.08s; object detection runs faster than 60fps).

We should go ahead and install the Intel code examples from ncappzoo, to let us test and experiment beyond just checking that the NCS is connected. Remember that in the previous article we edited the ncsdk.conf file to allow the Compute Stick to run on ARM devices.

Install ncappzoo examples:
cd /home/ubuntu/workspace/ncsdk
git clone https://github.com/movidius/ncappzoo
cd /home/ubuntu/workspace/ncsdk/ncappzoo
make all

Some examples will fail, but these will be the ones using TensorFlow, which we are not using right now.

Once installed and compiled, we can look at how to run code examples.

If we try to run many of the examples, we are presented with error messages. This usually means we need a 'graph' file for whichever network we are using (GoogLeNet, SqueezeNet, YOLO, FaceNet, etc.); each network needs its own graph file, and these cannot be compiled on ARM platforms. So we need an Ubuntu 16.04 x86 laptop: install the full NCSDK there and run make all on the examples, which creates the graph files. Look in the NCSDK folder on the laptop, copy the graph files to a USB stick, and transfer them to the TK1 into the matching folders under ncappzoo/caffe.
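Once the graph files are in place, an NCSDK v1 Python script consumes them roughly like this. This is only a sketch, not the actual code from the ncappzoo examples: the graph filename, the helper names, and the label handling are my own, while the mvnc calls follow the NCSDK v1 Python API.

```python
import numpy as np

def top_classes(output, labels, k=5):
    """Return the k highest-probability (label, score) pairs from a network output."""
    order = np.argsort(output)[::-1][:k]
    return [(labels[i], float(output[i])) for i in order]

def run_inference(image_fp16, graph_path='graph'):
    """Run one inference on the NCS. Requires the stick plugged in, so the
    mvnc import is deferred; the helper above works without the hardware."""
    from mvnc import mvncapi as mvnc
    devices = mvnc.EnumerateDevices()
    if not devices:
        raise RuntimeError('No NCS device found')
    device = mvnc.Device(devices[0])
    device.OpenDevice()
    # The graph file copied over from the x86 laptop is loaded as a raw blob.
    with open(graph_path, 'rb') as f:
        graph = device.AllocateGraph(f.read())
    graph.LoadTensor(image_fp16, 'user object')
    output, _ = graph.GetResult()
    graph.DeallocateGraph()
    device.CloseDevice()
    return output
```

On the TK1 you would call run_inference() with a preprocessed float16 image and feed the result to top_classes() with the network's label list.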

Running the examples, we can now see them working. In the video we are running stream_infer.py, which lets us experiment with different image classifiers such as:
1)AlexNet
2)GenderNet
3)GoogleNet
4)SqueezeNet

We can also add our own image classifiers such as SSD MobileNet or YOLOv2/v3; we will cover this in a future article.

The stream_infer.py script also lets us experiment with different video sources:
1) a video file (mp4)
2) USB Webcam
3) DIY Wireless Streaming Camera
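All three sources above map onto OpenCV's VideoCapture, which accepts a file path, a camera index, or a network stream URL. A minimal sketch (the function names are mine, not from stream_infer.py, and the example stream URL is hypothetical):

```python
def capture_source(kind, value=None):
    """Translate a friendly source name into a cv2.VideoCapture argument."""
    if kind == 'file':        # e.g. 'clip.mp4'
        return value
    if kind == 'webcam':      # first USB webcam on the system
        return 0
    if kind == 'wireless':    # e.g. an MJPEG/RTSP URL served by the Pi Zero W
        return value
    raise ValueError('unknown source: %s' % kind)

def open_stream(kind, value=None):
    """Open the chosen source. cv2 import is deferred so the mapping helper
    above can be used without OpenCV installed."""
    import cv2
    return cv2.VideoCapture(capture_source(kind, value))
```

For the DIY wireless camera you would pass the Pi Zero W's stream URL, e.g. open_stream('wireless', 'http://192.168.1.50:8080/stream') (address assumed for illustration).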


I built my Wireless Streaming Camera using:
1) Raspberry Pi Zero W
2) Raspberry Pi Zero Case (from Vstone Robot Shop in Tokyo - Akihabara)
3) Wide angle camera phone lens from Poundland/Dollar Store/100 Yen shop
4) 5v Power Bank (Any will do)

The wireless streaming camera lets us walk around capturing and classifying objects within a radius of about 100m, or about 600m in open space with a Wi-Fi repeater. I can also mount it on a drone or RC airplane to fly over spaces and classify objects.

In the next article I will show how to stream the wireless camera to the AI TK1 computer.

Thanks for reading


Using Movidius Neural Compute Stick with Nvidia Jetson TK1

Here I show how to use the Movidius Neural Compute Stick with the Nvidia Jetson TK1 board.


Most of us are ready to throw the Jetson TK1 into the trash. It doesn't really do much anymore. But if we update the software to Ubuntu 16.04, it might let us use the Movidius Neural Compute Stick on its USB 3.0 port.



First up, after updating to 16.04, I tried to install the standard NCSDK:

Get Started:
mkdir -p ~/workspace
cd ~/workspace
git clone https://github.com/movidius/ncsdk.git
cd ~/workspace/ncsdk
make install

Make the Examples:
cd ~/workspace/ncsdk
make examples

Test it's Working:
cd /home/ubuntu/workspace/ncsdk/examples/apps/hello_ncs_py
python3 hello_ncs.py

Should Give:
"Hello NCS! Device opened normally.
Goodbye NCS! Device closed normally.
NCS device working."

However, it doesn't work like this on ARMv7 devices.

We need to follow the Raspberry Pi method of installing the Neural Compute Stick. This means that we cannot install:
1) Full NCSDK software
2) Tensorflow

So before making the examples, we have to edit the ncsdk.conf file. Find its location and open it in a text editor.

Original:
MAKE_PROCS=1
SETUPDIR=/opt/movidius
VERBOSE=yes
SYSTEM_INSTALL=yes
CAFFE_FLAVOR=ssd
CAFFE_USE_CUDA=no
INSTALL_TENSORFLOW=yes
INSTALL_TOOLKIT=yes

New Edited:
MAKE_PROCS=1
SETUPDIR=/opt/movidius
VERBOSE=yes
SYSTEM_INSTALL=yes
CAFFE_FLAVOR=ssd
CAFFE_USE_CUDA=no
INSTALL_TENSORFLOW=no
INSTALL_TOOLKIT=no
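If you prefer not to edit the file by hand, the two-line change above can be scripted. A small sketch (the function name is mine); run it on the text of ncsdk.conf wherever you find it:

```python
def disable_tensorflow_and_toolkit(conf_text):
    """Flip INSTALL_TENSORFLOW and INSTALL_TOOLKIT to 'no', leaving every
    other key=value line in ncsdk.conf untouched."""
    out = []
    for line in conf_text.splitlines():
        key = line.split('=', 1)[0]
        if key in ('INSTALL_TENSORFLOW', 'INSTALL_TOOLKIT'):
            line = key + '=no'
        out.append(line)
    return '\n'.join(out)
```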

Now rerun:
cd ~/workspace/ncsdk
make examples

This lets us rerun the test to see if it is working:
cd /home/ubuntu/workspace/ncsdk/examples/apps/hello_ncs_py
python3 hello_ncs.py

We should have a good result now! It is connected :)

This process should also be the same for the Raspberry Pi 2/3 using the latest Raspbian, which is based on the same Debian generation as Ubuntu 16.04.

Now, how do we run the examples?

In the next article I show how to run example code on the TK1 + Neural Compute Stick in a clever way.

Tegra Hardware Specs on 16.04

System information while running AI Python code on the TK1 + Neural Compute Stick. It runs much, much more smoothly than the Raspberry Pi or the old 14.04 TK1 setup, and the board is suddenly back in the race without spending anything.
How to beat the tech AI spending race...

DIY Ghetto AI Development Computer

Here I show how I built a cheap AI development computer using spare parts, cardboard, duct tape, and a LiPo battery.

On the back we see the TK1; on the right, the Raspberry Pi (which I can switch over to), the LiPo, and the Movidius Neural Compute Stick, all working together for fast AI video detection.

Ingredients

1) Old Nvidia TK1 board updated to Ubuntu 16.04
2) 17" laptop LCD screen with 32-pin eDP socket
3) Old LiPo battery
4) Pocket keyboard from Raspberry Pi
5) Movidius Neural Compute Stick
6) Duct tape
7) Cardboard

Mash it all together

And you get this:

DIY AI computer. Sorry for being lazy on this article, but things move fast, money is tight, and I have to keep going.


In the next article I show how to connect the Movidius Neural Compute Stick to the Jetson TK1.

Thanks for reading

Upgrading Nvidia Jetson TK1 from 14.04 to 16.04

This blog will continue with further posts along a path of hacking, modding, updates, and general tinkering.

Since I have had my Movidius Neural Compute Stick I haven't really touched my Nvidia Jetson TK1 board for a while. It has just been collecting dust.

Recently I thought: what if I could connect the TK1 to the Neural Compute Stick?

So I gave it a shot.

Hey, if the Raspberry Pi can handle the Movidius Neural Compute Stick....So can TK1.


Very quickly I discovered that the TK1 board is supposedly incompatible: the internet tells us the TK1 runs Ubuntu 14.04 and is a 32-bit board.

But I don't quit.

Here I found that some people have updated the TK1 to 16.04 >>>> LINK

They recommend starting from a clean install of 14.04 before updating to 16.04: the TK1 only has 16GB of eMMC storage, and if the update procedure runs out of space the device will brick, and you will have to connect it to a laptop over USB and reflash 14.04, which takes forever.

I just gave it a shot anyway. I deleted the CUDA examples, all large folders, the Downloads folder contents, everything that was not a system file. This left me with 5GB of eMMC space to update with.
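Before kicking off the upgrade it is worth checking how much eMMC space is actually free. A quick sketch using only the Python standard library (the 5GB threshold is just the figure from my own attempt, not an official requirement):

```python
import shutil

def free_gb(path='/'):
    """Free space at `path` in gigabytes, via the stdlib shutil.disk_usage."""
    return shutil.disk_usage(path).free / 1e9

def safe_to_upgrade(path='/', minimum_gb=5.0):
    """True if there is at least `minimum_gb` free before do-release-upgrade."""
    return free_gb(path) >= minimum_gb
```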

I followed the commands as per guide:
  1. sudo add-apt-repository main
  2. sudo add-apt-repository universe
  3. sudo add-apt-repository multiverse
  4. sudo add-apt-repository restricted
  5. sudo apt-get update && sudo apt-get upgrade
  6. sudo apt-mark hold xserver-xorg-core
  7. sudo do-release-upgrade
  8. sudo apt-get install gnome-session-flashback

I was very patient. I panicked a couple of times, convinced it had bricked, but eventually I booted into Ubuntu 16.04 Xenial on the Jetson TK1. Cool. This brought a whole bunch of improvements: I could install Chromium, which ran better than before, allowing 1080p YouTube and video streaming, Spotify, and uBlock Origin ad blocking, none of which worked on 14.04.


Perhaps the TK1 isn't going into the box on the shelf just yet. Perhaps I don't need to blow thousands on the new Nvidia Xavier board.


So this made me happy. I hope it works for you too.

Next article I will build something with it.

Wednesday, 21 November 2018

Tokyo

Hello I am in Tokyo
I have been here for a while now exploring Japan's robotics technology and trying my best to look far beyond just Yodobashi Camera 😀

I will be back soon with some nice updates.

Tuesday, 9 October 2018

AI-Driven Ground Station


Robotics ground stations need a little bit of a revamp. They are stuck in 2008, when Atmel microcontrollers were the shiz-nizzle. Today in 2018 we're seeking more advanced ideas and creations, and that old ground station you used for your drone back in 2008 just isn't cutting-edge anymore.



What users need in a modern ground system seems to be leaning towards a few trends:
1) Cross Platform - So why not write the code in Python instead of tying it to an IDE? Users today want to use their latest board computers (Raspberry Pis, Nvidia Jetsons), which all run Linux. Users program on MacBooks, so why not make it usable there too with a Python script? More and more users are also on mini PC platforms (Intel NUC, Beelink X55), which run Windows 10 or Ubuntu on a tiny 12v motherboard with everything in it.

Times are changing. It's no longer just Windows Laptops or Desktops.

UPDATE October 15th: Free Python/Ai/TKinter Lessons at Udemy.com Use Code: OCT_SPL

2) Live Digital Video - Ground stations are still struggling to adopt a good, open live digital video standard. It's also still not very user friendly nor quick to set up.

3) AI Integration - Let's face it, this is where we all want to go. We want to learn, in a practical way, how to use all the artificial intelligence technologies that are available, but no ground station out there provides this opportunity in one easy-to-start package. It just is not out there.

So why don't we make our own? So I did.



Here is the stage that I am at for my Raspberry Pi Ground Station (rpigs)

In time I see no reason why rpigs cannot host a range of new technologies: an OpenCV object tracking module, line detection and following, real-time lidar mapping, a sense-and-avoid module; just add another Neural Compute Stick for each task.

I wrote this in Python with a few add-on dependencies. It works on Ubuntu laptops, the 5v Raspberry Pi, MacBooks, Windows PCs... everything. Everything. EVERYTHING. Cross-platform: anything that runs Python can use it.

Even my Home made 12v Computer which I made from a 16" Laptop Screen and Cardboard Box.


I really love my ghetto computer. I can run any board computer with it, giving a lightweight, low-power system that can be upgraded and modified far more easily than most sealed modern laptops.

Reviving the Hacktop Computer since 2018.

So, this is why it is useful: haven't you always dreamed of identifying objects as your drone flies over a landscape? Isn't it fun to then assign an action, so that once an object has been identified we can decide what course of action to take? It's a really fun possibility.

I hope to one day have my mini brushed motor plane modified into a UAV and this is part of that idea.



rpigs (Raspberry Pi Ground Station) V1_0 incorporates digital or analogue video, telemetry data and commands, a map view, and AI. It is in development and has already begun. I hope it will one day be good enough to release to the general public for free under an open license, and that companies and developers will support me with their technology so I can continue building on it. Thanks already to Intel & Movidius for helping me with the Neural Compute Stick; I wouldn't be here today without your help, and I hope you can continue to support the project.

Thanks for reading.

Tuesday, 4 September 2018

Cheap Portable TV/PC/Gaming Screen Using T.V56.03 Board

Greetings,
I am making my own 12v TV for my hippy van, to give me Linux development out on the road, plus some movies and gaming, though I don't do much of that these days. I looked up ready-made 12v portable TV/monitors from ASUS and Gechic and figured I could make my own for about $25. I'm almost there; here's how it has gone...


I have just got this TV driver board (T.V56.03 + TSUMV56RUU-Z1 chip) and am trying it with a screen (Innolux N156HGE-EA2 rev C2).

With the board and screen connected and powered, it works OK for about a minute. Then it goes ON/OFF/ON/OFF repeatedly: on for one second, off for one second, and so on.

My question: is this a board problem? A firmware problem? Overheating? A damaged or dying chip? Not enough power from the 5A max power supply?

I am using a 12v 5A supply.

Board PDF file added>T_V56_03
Screen PDF file>https://datasheetspdf.com/pdf-file/1093520/INNOLUX/N156HGE-EA2/1
My Board Look like this>


Screen Details:

Here's a video of what is going on >


My Progress Updates trying to Eliminate & Fix:

1) It doesn't seem to be a power supply issue: I hooked it up to a 30A 12v supply and the same thing happened as in the video.
2) Found this general User Guide PDF> Users guide of V59 TV con...
3) Github Repo of Firmware & Manuals and other things HERE.

4) Success!
After reading the datasheets for both the screen and the T.V56.03 board, reading the user guide, testing the PSU, and having a cup of tea and a little think, it now appears to be working well without cutting out. It seems the board was sent to me with the LVDS voltage set to 5v; according to the screen panel datasheet it should be 3.3v, so I changed the little voltage jumper on the controller board, as in this picture>


It seems all is OK. I haven't fried the screen with the initial 5v setting (thank goodness), and the screen now powers up without cutting out (so far).

Reasons To Be Cheerful:
1) It is a 16" 1080p TV that runs on 12v Battery

2) It is a Nifty Portable Monitor for Linux/Windows/Osx again running on 12v
3) Using a USB stick filled with media files it will recognise and play Movies/Photos/Music. Sadly it refuses to play modern Rap Music or any pink haired female Music.
4) It can be used as a large FPV screen for flight
5) Plug in any SOC computer and you have Internet access
6) KODI in the Campervan
7) RetroPie gaming on the go

Thanks for the replies, and I hope this helps others.