Sumayya’s Status Report for 2/24

Progress Update:

This week I made progress on gesture recognition using MediaPipe. I had already tested MediaPipe using the web browser demo, but this past week I wrote a Python script to run the Gesture Recognizer model with my laptop camera. The script was able to recognize all of the gestures that MediaPipe's model was originally trained on.
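As a rough sketch of how that script is structured (this assumes the MediaPipe Tasks Python API and a locally downloaded gesture_recognizer.task model bundle; exact option names may vary by MediaPipe version):

    import cv2
    import mediapipe as mp
    from mediapipe.tasks import python as mp_python
    from mediapipe.tasks.python import vision

    # Assumes the pre-trained model bundle was downloaded locally first.
    options = vision.GestureRecognizerOptions(
        base_options=mp_python.BaseOptions(model_asset_path="gesture_recognizer.task")
    )
    recognizer = vision.GestureRecognizer.create_from_options(options)

    cap = cv2.VideoCapture(0)  # laptop webcam
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        result = recognizer.recognize(
            mp.Image(image_format=mp.ImageFormat.SRGB, data=rgb)
        )
        if result.gestures:
            top = result.gestures[0][0]  # best category for the first hand
            print(f"{top.category_name}: {top.score:.2f}")
    cap.release()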


https://drive.google.com/file/d/1Xvm71s50BpO0O9d-hPm9-XWkQNBlrgQR/view?usp=share_link

Above is a link to a video demonstrating MediaPipe running on my laptop. The angle of the gesture is very important (notice how the thumbs-down gesture was hard to recognize due to poor wrist position/angle).

The following are the gestures we decided we will need throughout the program:

  • Open Palm (right hand)
  • Open Palm (left hand)
  • Swipe (left to right)
  • Swipe (right to left)

The first two gestures are already covered by the pre-trained model. For the Swipe gestures, I learned how to access the 21 hand landmarks and their properties, such as their x, y, and z coordinates. This originally proved difficult because the documentation was not easily accessible. Since a swipe is a translation along the x-axis, I plan to simply compute the difference in the x-coordinate over a set of frames to detect a Swipe, as sketched below.
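Here is a minimal sketch of that check. It is deliberately independent of how the landmarks are obtained: it only assumes a normalized wrist x-coordinate (landmark 0) per frame, and the window size and threshold are placeholder values that still need tuning against recorded swipe clips:

    from collections import deque

    WINDOW = 10        # frames to compare (placeholder)
    THRESHOLD = 0.25   # normalized x displacement that counts as a swipe (placeholder)

    x_history = deque(maxlen=WINDOW)

    def update(wrist_x):
        """Feed one frame's wrist x-coordinate (normalized 0..1).

        Returns "left-to-right", "right-to-left", or None. If the camera
        preview is mirrored, the two directions swap.
        """
        x_history.append(wrist_x)
        if len(x_history) < WINDOW:
            return None
        dx = x_history[-1] - x_history[0]
        if abs(dx) < THRESHOLD:
            return None
        x_history.clear()  # reset so one motion is not counted twice
        return "left-to-right" if dx > 0 else "right-to-left"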

(Screenshot: the x, y, z coordinates of each of the 21 landmarks, printed for each frame of the video.)

https://drive.google.com/file/d/15Q_YZcS0Vv8EEd6kOf7mQT8irsR37j3Y/view?usp=share_link

The video above shows what the Swipe gesture from right to left looks like.

Schedule Status: 

I am on track with my gesture recognition and tracking schedule. However, I am behind on flashing the AGX, as I still have not been able to get an external PC; communication with CyLab has been slow. I plan to talk to the professor next week to find a quick solution, but I am not too concerned at the moment, since much of my MediaPipe testing can be done on my laptop.

Next Week Plans:

  • Complete Swipe Gesture Recognition
  • Research algorithms for object tracking
  • Start implementing at least one algorithm for object tracking
  • Get a PC for flashing AGX and flash the AGX


Sumayya’s Status Report for 2/17

Progress Update:

This week I spent many hours attempting to flash the Xavier AGX. After trying multiple installation methods, I learned that it is extremely difficult to flash NVIDIA products from an M1 Mac, which has an ARM64 architecture rather than the required AMD64 (x86_64) architecture. I attempted to flash from both of my teammates' computers, but this also proved difficult. I reached out to Professor Marios for help and was fortunately able to acquire a spare PC.

Intel-chip MacBook unable to recognize the AGX
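As a quick sanity check before the next flashing attempt, the host architecture can be verified in a few lines (assuming the AMD64/x86_64 host requirement described above):

    import platform

    # NVIDIA's flashing tools expect an x86_64 (AMD64) host;
    # Apple Silicon machines report "arm64" and will not work.
    arch = platform.machine()
    print(f"Host architecture: {arch}")
    print("OK to flash" if arch in ("x86_64", "AMD64") else "Unsupported host")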

Additionally, I tried out OpenPose and MediaPipe. Installing OpenPose ran into similar issues on my computer, but MediaPipe was very easy to use on the web. I was able to test some gestures on MediaPipe using the online demos and found it to be fairly robust. I plan to test the same gestures on OpenPose once I have it installed on the new PC so I can compare its performance against MediaPipe.

MediaPipe Recognizes “Thumbs-Up” gesture

I am currently working on the Python script to run the gesture recognition model with my computer's camera.

Schedule Status: On track!

Next Week Plans:

  • Have a running Python script with camera streaming from my laptop
  • Have the same Python script running on the AGX with the Arducam
  • Flash the AGX using the new PC

Team Status Report for 2/10

A significant risk we are currently considering is the mount for our hardware components. Our application needs a large projector with strong light output, similar to a projector used in classrooms. The projector and the Xavier AGX will be heavy components in our product that need to be secured to a ceiling. As a result, we will put extra effort into designing and building a bracket and mount to secure the projector, AGX, and camera module. We plan to work with faculty and peers to get advice on design specifications and will use CAD software to design the bracket and mount. If the overall device cannot be mounted to the ceiling, we will create another structure that can hold the weight of the device, such as a mount on a different wall.

There have been no changes to the existing design, and no update to schedule.

Sumayya’s Status Report for 2/10

This week I researched the libraries available for gesture tracking. In particular, I weighed the pros and cons of OpenPose vs. MediaPipe. The table below summarizes the differences:

[Table: OpenPose vs. MediaPipe comparison]

At the moment, we have decided to use OpenPose since we have the necessary processing power. Regardless, I plan to complete preliminary testing using both OpenPose and MediaPipe to judge how well each library recognizes gestures.

I was able to acquire the Xavier AGX and Arducam Camera module from the inventory and plan to start working with them this week.

I also spent a couple hours working with my team on creating material for the Proposal Presentation.

For next week I will:

  • Use Arducam camera module with AGX
    • Install necessary drivers
    • Be able to get a live feed (see the sketch after this list)
  • Test OpenPose and MediaPipe for accuracy
    • Start with basic gestures in front of camera
    • Transition to tests with hand on flat surface, camera facing down
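For the live feed item above, here is a minimal sketch of what I expect the capture code to look like. The GStreamer pipeline is an assumption for a CSI camera on a Jetson; the exact width, height, and framerate will depend on the Arducam module's supported sensor modes:

    import cv2

    # Assumed GStreamer pipeline for a CSI camera on a Jetson.
    GST_PIPELINE = (
        "nvarguscamerasrc ! "
        "video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1 ! "
        "nvvidconv ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
    )

    cap = cv2.VideoCapture(GST_PIPELINE, cv2.CAP_GSTREAMER)
    if not cap.isOpened():
        raise RuntimeError("Could not open the camera stream")

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("Arducam live feed", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()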

Progress is on schedule.