Progress Update:
I completed the swipe gesture recognition as planned in the last report. The gesture still needs to be tested in a kitchen environment with our UI to improve it further; I will do that once the UI is programmed.
I researched object-tracking algorithms, incorporating suggestions from Prof. Marios. I will be using the SIFT algorithm to track objects by their features. Since the object will be in a known location in the "first frame", we will have an easy reference image to start tracking from. In my research, I found a video by Shree K. Nayar explaining how feature tracking works, and I plan to use it as the basis for my implementation.
Before attempting Nayar's implementation, I tried some tracker libraries that already exist in OpenCV. Specifically, I used KCF (Kernelized Correlation Filter) to track a couple of objects, following the example by Khwab Kalra. I found that the KCF algorithm works great for easily identifiable, slow-moving objects such as a car on a road, but it struggled to track a mouse moving across a screen. I'm not sure why this is the case yet and have much more testing to do with various example videos. OpenCV has 8 different trackers, each with different specializations. I will test each of these trackers next week to see which works best. If they are not robust enough, I plan to use Nayar's implementation with SIFT.
Link to Car tracking using KCF: https://drive.google.com/file/d/1zUjeYSGWuIXzmaCbMIEO1zvzxUYZY6Lv/view?usp=sharing
Link to Mouse tracking using KCF: https://drive.google.com/file/d/1zUjeYSGWuIXzmaCbMIEO1zvzxUYZY6Lv/view?usp=sharing
As for the AGX, Prof. Marios's grad students have confirmed that it has been flashed and is ready to use.
Schedule Status:
On track!
Next Week Plans:
- Test and tune the 8 OpenCV trackers by next weekend; select the tracker that works best for this project.
- Implement SIFT-based object tracking if the OpenCV trackers are not robust enough.
- If the above step is complete, run the algorithm on a real-time video stream.