So, to explore live video streaming to the Jetson TK1 a little further, I am now able to track objects seen from the drone camera.
This uses OpenCV, with the code written as a Python script, to track an orange fruit. The color thresholds in the code can be changed to match the object being tracked, for example a green apple.
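The basic approach looks roughly like this. This is a minimal sketch of color-based tracking with OpenCV, not my exact script: the HSV range for "orange" and the video source below are placeholder assumptions you would tune for your own camera, lighting, and stream.

```python
import cv2
import numpy as np

# Rough HSV range for an orange fruit -- these bounds are assumptions, tune for your setup.
ORANGE_LOWER = np.array([5, 120, 120])
ORANGE_UPPER = np.array([20, 255, 255])

# Placeholder source: swap in the drone's video stream URL or capture device.
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Convert to HSV and threshold on the target color.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, ORANGE_LOWER, ORANGE_UPPER)
    mask = cv2.erode(mask, None, iterations=2)
    mask = cv2.dilate(mask, None, iterations=2)

    # Find the largest blob of the target color and circle it.
    # ([-2] keeps this working across OpenCV versions with different findContours return values.)
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if contours:
        c = max(contours, key=cv2.contourArea)
        (x, y), radius = cv2.minEnclosingCircle(c)
        if radius > 10:
            cv2.circle(frame, (int(x), int(y)), int(radius), (0, 255, 255), 2)

    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Swapping the target color is just a matter of changing the two HSV bounds, which is why retargeting to something like a green apple is quick.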
Here is footage of the drone:
As time progresses I hope to add dynamic object-tracking selection options.
Things I might use this for include:
- Follow object (ground object or another vehicle in flight)
- Precision landing
- Landing onto moving object
- Payload deployment
- Camera Trigger
- Beacon Launch/Landing/Waypoint
- Drone racing gate targeting
OpenCV runs much faster on the Jetson board than a full convolutional neural network: with the CNN I can only get 7 fps, whereas with OpenCV I get 30 fps, again with about 0.1 s latency between the drone and the Linux computer.
OpenCV is kinda neat for object tracking, but I think full-blown CNNs will take over and let us be more specific about which object we want to identify and track.
Thanks for reading.