Tag: Neeti’s status report

Neeti’s Status Report for 05/02

Last status report! I’m actually going to miss working on capstone 🙂 This week I worked on putting the final touches on the animation, such as making the changes we talked about last week and adding functionality for mode switches, text, arrows, and more clarity to

Neeti’s Status Report for 04/25

This week I worked on the animation that will be the output for our final project. The animation represents how the actual device would have worked if circumstances had allowed us to work with the motors, motor HATs, 3D printers, etc. Working

Neeti’s Status Report for 04/18

This week I spent a lot of time working more on the classifier.

Throughout the week we received hundreds of images from our posts requesting them on various social media platforms, as well as from family and friends. We have now collected over 1200 images and continue to get more every day. The images are of four kinds: the OK symbol for right, the L shape for left, and each of these two symbols held in front of the person’s face. I spent a lot of time collecting, labeling, and organizing these images. When I tested the classifier, I noticed that no matter how many images I used, running it on just the images of hands (no heads) gave consistently higher accuracy than running it on the full set. We therefore decided to run the classifier on, and henceforth collect, only images of hands in front of a background. The following has been the progress of the classifier:

200 images -> 0.54 accuracy

300 images -> 0.75 accuracy

500 images -> 0.80 accuracy with 50 epochs and a 90/10 train/test split

700 images -> 0.83 accuracy

1000 images -> 0.87 accuracy with binarization, a skin detection algorithm, and parameter tweaking
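The exact binarization and skin detection thresholds we used aren’t recorded here, but the idea can be sketched with a simple rule-based RGB skin classifier (the classic Peer et al. conditions — the thresholds below are illustrative assumptions, not our actual pipeline):

```python
import numpy as np

def skin_mask(rgb):
    """Binarize an RGB image (H, W, 3, uint8) into a skin/not-skin mask.

    Uses a simple rule-based RGB skin test; the thresholds are the
    commonly cited Peer et al. values, chosen here for illustration.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    spread = rgb.max(axis=-1).astype(int) - rgb.min(axis=-1).astype(int)
    mask = (
        (r > 95) & (g > 40) & (b > 20)   # bright enough in each channel
        & (spread > 15)                  # not gray/washed out
        & (np.abs(r - g) > 15) & (r > g) & (r > b)  # red-dominant
    )
    return mask.astype(np.uint8) * 255   # white = skin, black = background

# Example: one skin-toned pixel and one blue pixel
img = np.array([[[200, 120, 90], [0, 0, 255]]], dtype=np.uint8)
print(skin_mask(img))  # [[255   0]]
```

Feeding the classifier this binarized mask instead of the raw photo removes most background variation, which is consistent with the accuracy jump we saw at 1000 images.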

Neeti’s Status Report for 04/11

This week we spent a lot of time getting our manual mode ready for the demo! This meant that I was primarily working on the hand gesture classifier, Shrutika was working on animation, and Gauri was working on planning and integration. I spent Monday (04/06) 

Neeti’s Status Report for 04/04

This week we all worked individually on our predetermined parts rather than meeting to discuss decisions. On Monday, we met briefly to discuss our progress on our respective parts. I spent the majority of this week working on the neural net classifier for the hand 

Neeti’s Status Report for 03/28

This week we met on Monday (03/23) to discuss what to do about the microphones, and we finally ordered four of the inexpensive Adafruit microphones, as only one of our devices will demonstrate automatic mode. I also worked on downloading the gesture dataset.

We met on Wednesday (03/25) with Jens and Professor Sullivan to discuss our plans, progress, and Statement of Work document.

We met on Friday (03/27) to work together! I have been working on installing OpenCV and constructing the image corpus to feed into the classification model. I hope to have the classifier done by next week!

Neeti’s Status Report for 03/21

This week we met on Monday (03/16) to float ideas for how to transition our project to the new remote setup, such as using an animation instead of a physical platform and motor as the output of our control loop. We then briefly

Neeti’s Status Report for 02/29

This week I worked on making a lot of the design choices for our project, as well as the slides for the design review presentation. I spent last weekend researching the different parts available to us and the pros and cons of each of these

Neeti’s Status Report for 02/22

This week I worked on research for the design review while Gauri and Shrutika figured out how to SSH into the Pis. This was because there are only two Pis, and I did not have the USB to USB-C cable I will need later to connect my laptop to one. However, I was able to flesh out the details of how the Pis will interact with the motor. I researched different motor types — stepper, DC, geared, etc. — and the tradeoffs of each. We finally decided on stepper motors for their precision and their high-torque, low-speed capabilities. I realized we need a motor controller, as the GPIO pins cannot supply enough power to drive the motor directly, and attempting to do so would fry the pins.

I also looked into libraries we could use to interface with the motor controller through the GPIO pins and came across the RPIO.GPIO, RPi.GPIO, and ServoBlaster libraries, which let us drive the controller directly from the Pi with either software- or hardware-based timing. Ultimately, I suggested we use a motor controller HAT, as it provides very convenient libraries for motor control, sits on the Pi itself, and has the motor drivers built in.
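To make the stepper tradeoff concrete: a stepper moves in fixed angular increments, so pointing the platform reduces to converting a desired angle into a whole number of steps and then walking the coil sequence. The sketch below assumes a common 1.8°-per-step motor (200 steps per revolution) — our actual motor’s step angle may differ, and a motor HAT library would normally hide the coil sequence entirely:

```python
def degrees_to_steps(angle_deg, steps_per_rev=200):
    """Convert a rotation angle to whole motor steps.

    steps_per_rev=200 assumes a typical 1.8-degree stepper; this is an
    illustrative default, not a spec for our ordered part.
    """
    return round(angle_deg / (360.0 / steps_per_rev))

# Classic 4-phase half-step coil sequence (A, B, A', B') that a driver
# cycles through to rotate a unipolar stepper one way; reverse it to
# rotate the other way.
HALF_STEP_SEQUENCE = [
    (1, 0, 0, 0), (1, 1, 0, 0), (0, 1, 0, 0), (0, 1, 1, 0),
    (0, 0, 1, 0), (0, 0, 1, 1), (0, 0, 0, 1), (1, 0, 0, 1),
]

print(degrees_to_steps(90))  # 50 steps for a quarter turn at 1.8 deg/step
```

This per-step precision without feedback is exactly why we preferred steppers over DC motors for the platform.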

I also spent some time looking at this project, which is very similar to ours in terms of requirements:

Pan/tilt face tracking with a Raspberry Pi and OpenCV

Additionally, we discussed how we would interface the microphones and the camera with the Pi, and we were able to narrow down the specific part numbers and models for both. We also talked about some of the software requirements for the project, such as a hand/not-hand classifier for our manual mode and how we would organize the code for continuous sensor input and triggering motor rotation.
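The sensor-to-motor organization we discussed can be sketched as a simple polling loop that reads the sensor each iteration and triggers a rotation whenever the reading crosses a threshold. The `read_sensor` and `rotate_motor` callables here are stand-ins for the real Pi interfaces (microphone/camera input and HAT-driven motor), and the names and threshold are illustrative assumptions:

```python
def control_loop(read_sensor, rotate_motor, iterations, threshold=0.5):
    """Poll a sensor and trigger the motor on above-threshold readings.

    read_sensor() -> float and rotate_motor(reading) are placeholders
    for the real hardware interfaces; returns the number of triggers.
    """
    triggers = 0
    for _ in range(iterations):
        reading = read_sensor()
        if reading > threshold:    # e.g. loud enough / hand detected
            rotate_motor(reading)  # real code would map reading -> angle
            triggers += 1
    return triggers

# Usage with fake sensor data standing in for live input:
readings = iter([0.2, 0.9, 0.1, 0.7])
moves = []
control_loop(lambda: next(readings), moves.append, iterations=4)
print(moves)  # [0.9, 0.7] -- only the two above-threshold readings fire
```

Keeping sensor reading and motor actuation behind separate callables like this would also let us swap the physical motor for the animation output later without touching the loop.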

In the upcoming week, we will finalize all our design decisions and complete the design review report. I also hope to order the motor controller hat and motor and get started on Pi-motor communication!

Neeti’s Status Report for 02/15

First weekly status report! We met twice this week during mandatory lab time, as well as at one additional meeting yesterday. This week we researched and ordered various parts and were able to home in on the specific parts we intend to use for the project.