Month: April 2020

Shrutika’s Status Report for 04/11

We ran into minor issues with manual mode, so it took longer than expected, but we are still making significant progress. This week we decided to switch to just using pygame because it was much simpler to integrate, I already knew how to use it, and Matlab was causing a …

Gauri’s Status Report for 04/11

This week we spent time on the gesture classifier and the animation for manual mode. We met multiple times to work remotely. I worked with Neeti on debugging, training, and testing the gesture classifier with different online datasets. We discovered that the Kaggle dataset had a …

04/08

  • Demoed the current status of manual mode to Professor Sullivan and Jens
  • Integrated the animation with the gesture classifier
  • Spent time yesterday and this morning trying to make the classifier work on real images
  • Attempted to train other_classifier.py on an augmented dataset (Kaggle + real images) – did not work
  • So far unable to classify real images correctly – the classifier always predicts the same class for both gestures
  • Feeding in any Kaggle image works correctly
  • Trying to build up our own dataset now by asking for images from people
    • Will use this on the existing classifier
    • If that doesn’t work, will try other tutorial (the one for a real, manufactured dataset)
  • Goals:
    • By Friday night, have the classifier working and be able to process a video stream to get frames to feed into the classifier
    • By Sunday night, have manual mode fully done and out of the way
  • Ethics assignment due Sunday, ethics discussion on Monday
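One lightweight sanity check for the "always predicts the same class" failure is to tally the classifier's outputs over a mixed batch: a healthy two-class model should not emit one label almost exclusively. A minimal sketch – the label names and the batch of predictions below are made up for illustration, not our real classifier output:

```python
from collections import Counter

def prediction_collapse(predictions, threshold=0.95):
    """Return True if a single class dominates the batch (likely collapse)."""
    counts = Counter(predictions)
    most_common_count = counts.most_common(1)[0][1]
    return most_common_count / len(predictions) >= threshold

# Hypothetical batch of classifier outputs on real images:
preds = ["thumbs_up"] * 19 + ["thumbs_down"]
print(prediction_collapse(preds))  # → True: 95% of outputs are one class
```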
04/06

  • Decided to use Python pygame instead of Simulink/Matlab for the animation (why we didn’t think of this before is a mystery) – much easier and simpler to integrate as well
  • Testing the gesture classifier on real images
  • Goal is to have the pipeline ready by Wednesday
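Since the animation is moving to pygame, a minimal sketch of the kind of loop we have in mind is below: the display reacts to the classified gesture by changing state. The gesture names and colors are placeholder assumptions, and the SDL dummy-driver line just lets the snippet run headless.

```python
import os
os.environ.setdefault("SDL_VIDEODRIVER", "dummy")  # allows headless runs

import pygame

# Placeholder mapping from classified gesture to a screen color
GESTURE_COLORS = {
    "thumbs_up": (0, 200, 0),    # green for approve
    "thumbs_down": (200, 0, 0),  # red for reject
}

def render_gesture(screen, gesture):
    """Fill the screen with the color mapped to the classified gesture."""
    screen.fill(GESTURE_COLORS.get(gesture, (40, 40, 40)))
    pygame.display.flip()

if __name__ == "__main__":
    pygame.init()
    screen = pygame.display.set_mode((320, 240))
    render_gesture(screen, "thumbs_up")  # would be fed by the classifier
    pygame.quit()
```

In the real pipeline the gesture string would come from the classifier on each frame instead of being hard-coded.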

Team Status Update for 04/04

This week we made a lot of progress on our manual mode – hopefully enough to have it semi-working by the demo on Wednesday! Neeti has been working on the neural net classifier for the hand gestures (thumbs up and thumbs down). She realized earlier this week …

Gauri’s Status Report for 04/04

This week was too busy for me personally due to midterms, assignments, and moving to another apartment on campus, so I didn’t get to spend as much time as I would have liked on the control loop. We met a couple of times this week for quick catch-up meetings. I started sketching out a rough design for it and tried to plan out how it would fit together with Shrutika’s Simulink work and Neeti’s gesture classifier. We are considering different options for how to convey the gesture received on the Pi to Simulink on the laptop. For proof of concept we could simply get the gesture and manually feed it into Simulink, but it would be nice if we could route it directly to a listening script or something similar. Also, I found out that it is possible to save the ML models to JSON files so that we don’t need to retrain every time (which saves a lot of compute and time).
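The "listening script" option could be sketched with Python's standard sockets: the laptop runs a small TCP listener, and the Pi connects and sends the classified gesture as one line of text. The host, port, and message format here are illustrative assumptions, not our actual setup.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # assumed; would be the laptop's address

def accept_one(server, result):
    """Accept one connection and record the newline-terminated gesture."""
    conn, _ = server.accept()
    with conn:
        result.append(conn.makefile().readline().strip())

def send_gesture(gesture):
    """What the Pi-side script would do after classifying a frame."""
    with socket.create_connection((HOST, PORT)) as cli:
        cli.sendall((gesture + "\n").encode())

if __name__ == "__main__":
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((HOST, PORT))
    server.listen(1)           # listening before the "Pi" connects
    received = []
    t = threading.Thread(target=accept_one, args=(server, received))
    t.start()
    send_gesture("thumbs_up")  # simulate the Pi sending its result
    t.join()
    server.close()
    print(received[0])         # → thumbs_up
```

Binding and listening before starting the sender avoids a connection-refused race; on the laptop the received string could then be forwarded into Simulink.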

Tomorrow we will be working for some time to integrate the parts for the manual mode demo on Wednesday. We still hope to have that working, given that we have made good progress this week.

Neeti’s Status Report for 04/04

This week we all worked individually on our predetermined parts rather than meeting to discuss decisions. On Monday, we met briefly to discuss our progress on our respective parts. I spent the majority of this week working on the neural net classifier for the hand …

Shrutika’s Status Report for 04/04

This week we made a lot of progress on manual mode – Neeti was working on the gesture classifier and training the models, and I worked on creating the simulation and figuring out how we can hook it up to the output from the sensors. Matlab …