Steven’s Status Report for March 8

What did you personally accomplish this week on the project?

I worked on integrating the C++ API for OpenPose into our application, and did some fine-tuning for performance and accuracy. Keypoints are now available to use in our application for eye tracking and gesture control. I also did some research on gesture recognition algorithms. I think a good starting point is having the recognition based purely on the velocity of a keypoint (e.g., the left hand moving quickly to the right).
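
A minimal sketch of that velocity-based idea (the `Keypoint` struct, window size, and threshold here are placeholders, not our actual implementation or tuned values):

```cpp
#include <vector>

// Hypothetical 2D keypoint; OpenPose reports (x, y, confidence) per part.
struct Keypoint { float x; float y; };

// Detect a "swipe right" if the keypoint's average horizontal velocity
// over the recent history exceeds a threshold (pixels per frame).
// 15 px/frame is an arbitrary placeholder threshold.
bool detectSwipeRight(const std::vector<Keypoint>& history,
                      float minVelocity = 15.0f) {
    if (history.size() < 2) return false;
    float dx = history.back().x - history.front().x;
    float framesElapsed = static_cast<float>(history.size() - 1);
    return dx / framesElapsed > minVelocity;
}
```

The same structure would generalize to other gestures by thresholding different velocity components (e.g., negative dx for "swipe left").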

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Roughly on schedule. With OpenPose now integrated into the application, I think developing gesture control should be simple.

What deliverables do you hope to complete in the next week?

Complete the gesture control algorithm. Also, I have yet to compile the project on the Jetson.

Steven’s Status Report for Feb 22

What did you personally accomplish this week on the project?

I worked on the code for eye tracking and gesture recognition. I managed to build the C++ API for OpenPose and integrate it into our project, so we can use cameras to track facial keypoints and body keypoints (such as arms and hands). I have also started working on the software foundation for our project, creating the main render loop for our application.
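
As a sketch of the loop's structure (the `std::function` stages are placeholders for our real capture/inference/render subsystems, and the frame cap exists only so the sketch terminates; the real loop runs until the application quits):

```cpp
#include <functional>

// Each iteration captures camera frames, runs keypoint inference,
// and renders the UI. Returns the number of frames processed.
int runMainLoop(int frameCap,
                const std::function<void()>& captureFrame,
                const std::function<void()>& runInference,
                const std::function<void()>& render) {
    int framesRendered = 0;
    while (framesRendered < frameCap) {
        captureFrame();   // grab the latest camera image
        runInference();   // extract keypoints (e.g., via OpenPose)
        render();         // draw UI / overlays
        ++framesRendered;
    }
    return framesRendered;
}
```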

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am a little behind. I will use some slack time over spring break to work on the gesture algorithm. Facial and body keypoint tracking is essentially done, as both are provided by the C++ API.

What deliverables do you hope to complete in the next week?

(1) Get the project running on the Jetson (as opposed to my laptop) to make sure everything works with the hardware. (2) Research gesture recognition algorithms using body keypoints, looking for libraries that do this or coming up with an algorithm of my own.

Steven’s Status Report for Feb 15

What did you personally accomplish this week on the project?

I finally got the Jetson, and I spent some time flashing the board with new firmware so we can use it for our project. However, I was a little busy this week, so I mainly worked on figuring out the design details of my portion of the project. Here is the planned system diagram of how I’m going to implement gesture control + eye tracking.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am a little behind, and I expect to fall further behind due to a few midterms next week. Because OpenPose can do keypoint tracking for the eyes, facial features, and joints, I think I can work on both gesture and eye tracking at the same time to speed up development. The schedule was planned with a lot of slack time, so I don’t feel pressured.

What deliverables do you hope to complete in the next week?

Get OpenPose running on the Jetson, and start software foundation for our application.

Steven’s Status Report for Feb 8

What did you personally accomplish this week on the project?

This week I worked on finding libraries for eye tracking and gesture recognition (~4 hrs) and starting on the eye-tracking implementation. One library I found that can do both of these tasks is OpenPose, which can additionally be built on a Jetson. Because we don’t have most of our parts yet, I started developing the code locally. A lot of time (~5 hrs) was spent trying to get OpenPose to build locally (no prebuilt binaries for Unix/Linux 🙁); several dependencies (e.g., Caffe) didn’t build on an M1 Mac, so I had to switch to my x86 Linux machine. I was able to get a demo program tracking facial and body keypoints running using the CUDA backend at ~5 fps. More work will have to be done optimizing latency/fps.
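
OpenPose hands keypoints back as a flat people × parts × (x, y, score) array. A standalone sketch of that indexing, mimicking the layout without depending on the library (so the exact OpenPose types and part indices are not assumed here):

```cpp
#include <vector>

// OpenPose-style flat keypoint buffer: people × parts × (x, y, score).
// This mimics the layout of the library's keypoint array so the
// indexing math can be shown without an OpenPose dependency.
struct KeypointBuffer {
    std::vector<float> data;
    int numParts;

    float x(int person, int part) const {
        return data[(person * numParts + part) * 3 + 0];
    }
    float y(int person, int part) const {
        return data[(person * numParts + part) * 3 + 1];
    }
    float score(int person, int part) const {
        return data[(person * numParts + part) * 3 + 2];
    }
};
```

Which flat index maps to which body part depends on the pose model OpenPose is configured with, so the part indices would need to be looked up per model.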

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is a little behind because we don’t have the Jetson Nano yet. Until we get a Nano, one possible route would be to do development in something like a Docker container. However, with the current scope of the project, I think it is fine to continue development natively on Linux.

What deliverables do you hope to complete in the next week?

One goal is to build OpenPose on the Jetson and have at least the demo program running. Another goal is to figure out the Python API for OpenPose so I can extract keypoints for the eyes and implement the gesture recognition algorithm.