Weekly Status Reports

Neeti’s Status Report for 04/18

This week I spent a lot of time working on the classifier. Throughout the week we received hundreds of images in response to our posts on various social media platforms requesting images, as well as from family and friends. We have now collected over 1200 images.

Gauri’s Status Report for 04/18

This week I worked on several things: getting frames from the RPi camera video stream to pass to the gesture classifier, and collecting more images through crowdsourcing on social media (a very interesting experience – we have a total of around 1200 real images).

Shrutika’s Status Report for 04/18

I have been playing with the microphones this week.

It turns out that macOS Catalina doesn’t support Audacity. I then tried to connect the mics to an old laptop, but its USB ports were broken, and I had some other adapter issues as well. It’s also super unclear whether the mics are directional or omnidirectional (the manual somehow says both).

I gave up on Audacity for now (I might go back to it later if I do end up needing baffles) and am currently trying to figure out how to measure amplitude in pyaudio. I was able to plot the WAV files in Matlab, and the mics seem to pick up noticeably different amplitudes just from being turned in different directions, even without adding baffles. Once I figure out how to compare volumes with pyaudio and start testing, if they’re not accurate enough, I’ll experiment more with graphing and testing baffles.
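As a starting point, the pyaudio comparison could look something like the sketch below: read one chunk from each mic and compare RMS amplitudes. The device indices, chunk size, and the `rms` helper are assumptions for illustration, not our actual code.

```python
import math
import struct

def rms(frames: bytes) -> float:
    """Root-mean-square amplitude of 16-bit little-endian PCM samples."""
    count = len(frames) // 2
    if count == 0:
        return 0.0
    samples = struct.unpack("<%dh" % count, frames[: count * 2])
    return math.sqrt(sum(s * s for s in samples) / count)

if __name__ == "__main__":
    try:
        import pyaudio  # only needed for the live capture part
    except ImportError:
        pyaudio = None
    if pyaudio is not None:
        CHUNK, RATE = 1024, 44100
        pa = pyaudio.PyAudio()
        for device_index in (0, 1):  # assumed indices of the two USB mics
            stream = pa.open(format=pyaudio.paInt16, channels=1, rate=RATE,
                             input=True, input_device_index=device_index,
                             frames_per_buffer=CHUNK)
            data = stream.read(CHUNK)
            print("mic %d RMS: %.1f" % (device_index, rms(data)))
            stream.stop_stream()
            stream.close()
        pa.terminate()
```

Whichever mic reports the larger RMS over a window would be the one facing the sound source – that’s the comparison the baffles would sharpen if it isn’t distinct enough on its own.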

Our hand gesture detector is getting there – we’ve taken tons of pictures of people’s hands this week, and on the latest check Neeti ran, we were at 90.36% accuracy!

Team Status Update for 04/11

This week was productive – as usual, everything took longer than expected, but we have been making significant progress. Neeti and Gauri spent a lot of time this week on our gesture classifier. We realized that it’s more difficult to recognize gestures with the same 

Neeti’s Status Report for 04/11

This week we spent a lot of time getting our manual mode ready for the demo! This meant that I was primarily working on the hand gesture classifier, Shrutika was working on animation, and Gauri was working on planning and integration. I spent Monday (04/06) 

Shrutika’s Status Report for 04/11

Manual mode hit some minor issues, so it took longer than expected, but we’re still making significant progress.

This week we decided to switch to using just pygame, because it was much simpler to integrate, I already knew how to use it, and Matlab was causing a lot of issues. The pygame animation doesn’t look very polished right now, but it’s functional, and I plan to improve the visuals later, once our sensors and other features are working.

We have made several changes to our gesture classifier, and it is (mostly) working; we’re now collecting a better dataset of gesture images from our friends and families. More on this in our team update.

Since I have all of the hardware, I have spent the past two days setting up the Pis on my home network and getting them working with the camera. We are able to capture video using the Pi camera, and now we just need to feed those frames to our neural net.
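For the capture side, a minimal sketch with the picamera library might look like this. The resolution, interval, and `frame_name` helper are illustrative assumptions; the real control loop will hand frames to the classifier rather than just saving them.

```python
import time

def frame_name(index: int) -> str:
    """Zero-padded filename for the nth captured frame."""
    return "frame_%04d.jpg" % index

def capture_frames(num_frames: int = 10, interval_s: float = 0.5) -> None:
    """Grab JPEG stills from the Pi camera (only works on the RPi itself)."""
    from picamera import PiCamera
    camera = PiCamera(resolution=(224, 224))  # assumed classifier input size
    time.sleep(2)  # give the sensor time to warm up
    try:
        for i in range(num_frames):
            camera.capture(frame_name(i))  # these stills go to the neural net
            time.sleep(interval_s)
    finally:
        camera.close()

if __name__ == "__main__":
    try:
        capture_frames()
    except ImportError:
        pass  # picamera is only available on the Raspberry Pi
```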

I have just started connecting the microphones – TBD on how that goes.

Gauri’s Status Report for 04/11

This week we spent time on the gesture classifier and the animation for manual mode. We met multiple times to work remotely. I worked with Neeti on debugging, training, and testing the gesture classifier with different online datasets. We discovered that the Kaggle dataset had a

04/08

  • Demo-ed current status of manual mode to Professor Sullivan and Jens
  • Integrated the animation with the gesture classifier
  • Spent time yesterday and this morning trying to make the classifier work on real images
  • Attempted to train other_classifier.py on an augmented dataset (Kaggle + real images) – did

04/06

  • Decided to use Python pygame instead of Simulink/Matlab for the animation (why we didn’t think of this before is a mystery) – much easier and simpler to integrate as well
  • Testing gesture classifier on real images
  • Goal is to have the pipeline ready by Wednesday for demo: get real image -> feed to classifier -> take output of classifier -> convert to a direction -> rotate the animation in pygame
  • This can be run on the RPi later (after Wednesday when we try to integrate with the actual control loop) for final manual mode.  Might need some additional testing/code to get images with camera on the RPi.
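The pipeline above can be sketched roughly as follows, where `classify()` and `get_frame()` stand in for the real neural net and RPi camera capture, and the 15-degree step per gesture is an arbitrary choice for illustration:

```python
def gesture_to_turn(label: str) -> int:
    """Map a classifier label to a rotation step in degrees (assumed labels)."""
    return {"thumbs_up": 15, "thumbs_down": -15}.get(label, 0)

def run_demo(classify, get_frame):
    """Minimal pygame loop: frame -> classifier -> direction -> rotation."""
    import pygame
    pygame.init()
    screen = pygame.display.set_mode((400, 400))
    arrow = pygame.Surface((40, 120), pygame.SRCALPHA)
    pygame.draw.polygon(arrow, (200, 50, 50), [(20, 0), (40, 120), (0, 120)])
    angle = 0
    clock = pygame.time.Clock()
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        # One pipeline step: real image -> classifier output -> direction.
        angle += gesture_to_turn(classify(get_frame()))
        screen.fill((255, 255, 255))
        rotated = pygame.transform.rotate(arrow, angle)
        screen.blit(rotated, rotated.get_rect(center=(200, 200)))
        pygame.display.flip()
        clock.tick(10)
    pygame.quit()
```
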
Team Status Update for 04/04

This week we made a lot of progress on our manual mode – hopefully enough to have it semi-working by the demo on Wednesday! Neeti has been working on the neural net classifier for the hand gestures (thumbs up and thumbs down). She realized earlier this week