Friday 4 January 2019

Running AI Scripts on the NVIDIA Jetson TK1 + Movidius Neural Compute Stick

So, what now that we have built our AI Franken-computer, freed it from the wall socket and gone battery mobile? I guess we should run some scripts for the Neural Compute Stick.

In this video we see the DIY AI computer we built running an AI script that captures a live stream from a wireless camera and detects objects in the video at a very fast, smooth rate (video: 30fps, latency under 0.08s; object detection runs faster than 60fps).

We should go ahead and install the Intel code examples from ncappzoo, which let us test and experiment rather than just checking that the NCS is connected. Remember that in the previous article we edited the ncsdk.conf file to allow the compute stick to run on ARM devices.

Install ncappzoo examples:
cd /home/ubuntu/workspace/ncsdk
git clone https://github.com/movidius/ncappzoo
cd /home/ubuntu/workspace/ncsdk/ncappzoo
make all

Some examples will fail to build, but those are the ones that use TensorFlow, which we are not using right now.

Once installed and compiled, we can look at how to run code examples.

If we try to run many of the examples we are presented with error messages. This usually means we are missing a 'graph' file for whichever network we are using (GoogLeNet, SqueezeNet, YOLO, FaceNet, etc.); each network needs a compiled graph file, and these are not generated on ARM platforms. Instead, we need an Ubuntu 16.04 laptop: install the full NCSDK on it and run make all on the examples there, which creates the graph files. Go ahead and look in the NCSDK folder on the laptop, copy the graph files to a USB stick, and transfer them to the TK1 into the same folders under ncappzoo/caffe.
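After copying the graphs over, a small helper like the one below can confirm they ended up where the examples expect them. This is only a sketch: the network directory names are assumptions based on the ncappzoo caffe layout and may differ on your checkout.

```python
import os

# Networks whose examples expect a compiled 'graph' file.
# These directory names are assumptions; check your own ncappzoo/caffe folder.
NETWORKS = ["GoogLeNet", "SqueezeNet", "AlexNet", "GenderNet"]

def missing_graphs(caffe_dir, networks=NETWORKS):
    """Return the networks that have no 'graph' file under caffe_dir."""
    return [net for net in networks
            if not os.path.isfile(os.path.join(caffe_dir, net, "graph"))]

if __name__ == "__main__":
    missing = missing_graphs("/home/ubuntu/workspace/ncsdk/ncappzoo/caffe")
    if missing:
        print("Missing graph files for:", ", ".join(missing))
    else:
        print("All graph files present.")
```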

Running the examples, we can now see them working. In the video we are running stream_infer.py, which lets us experiment with different image classifiers such as:
1) AlexNet
2) GenderNet
3) GoogLeNet
4) SqueezeNet
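Under the hood, classifiers like these hand back one score per class, and a stream_infer.py-style script just picks the top results to overlay on the video. Here is a minimal sketch of that post-processing step; the label list is a made-up stand-in for the real categories file, not the networks' actual classes.

```python
def top_k(probabilities, labels, k=5):
    """Pair each score with its label and return the k highest-scoring pairs."""
    ranked = sorted(zip(probabilities, labels), reverse=True)
    return [(label, score) for score, label in ranked[:k]]

# Toy example: four made-up class scores from a classifier.
labels = ["cat", "dog", "car", "bicycle"]
probs = [0.05, 0.70, 0.20, 0.05]
for label, score in top_k(probs, labels, k=2):
    print(f"{label}: {score:.2f}")   # prints "dog: 0.70" then "car: 0.20"
```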

We can also add our own image classifiers, such as SSD MobileNet or YOLOv2/v3; we will cover how to do this in a future article.

Using the stream_infer.py script also allows us to experiment with different video sources:
1) a video file (mp4)
2) USB Webcam
3) DIY Wireless Streaming Camera
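All three sources can be fed to the same OpenCV capture call; only the argument changes: a device index for a USB webcam, a filename for an mp4, or a network URL for the wireless camera. A tiny sketch of that selection logic follows; the stream URL in the usage comment is a hypothetical example, not the script's actual default.

```python
def capture_arg(source):
    """Map a user-friendly source spec to a cv2.VideoCapture argument."""
    if source == "webcam":
        return 0                      # first USB camera device
    if source.startswith(("http://", "tcp://", "rtsp://")):
        return source                 # network stream from the wireless camera
    return source                     # otherwise assume a local video file

# Usage (hypothetical stream address):
#   import cv2
#   cap = cv2.VideoCapture(capture_arg("http://192.168.1.42:8080/stream"))
print(capture_arg("webcam"))     # -> 0
print(capture_arg("clip.mp4"))   # -> clip.mp4
```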


I built my Wireless Streaming Camera using:
1) Raspberry Pi Zero W
2) Raspberry Pi Zero Case (from Vstone Robot Shop in Tokyo - Akihabara)
3) Wide angle camera phone lens from Poundland/Dollar Store/100 Yen shop
4) 5v Power Bank (Any will do)

The wireless streaming camera lets us walk around, capturing and classifying objects within a radius of about 100m, or up to 600m in open space with a wifi repeater. I can also mount it on a drone or RC airplane to fly over an area and classify objects.

In the next article I will show how to stream the wireless camera to the AI TK1 computer.

Thanks for reading

