Anushka’s Status Report for 4/10

This week, we had our interim demos. I worked on updating our schedule, which can be seen in our Interim Demo post. I was also in charge of making sure that we had all the parts together for the demo.

The night before demos, I attempted to work with the Jetson Nano again. I was able to connect to the Internet, but I had a tough time downloading the Python packages we need to train the machine learning model. I thought the problem might be location-related, so I tested the connection in the UC with Edward. However, I still wasn't able to download packages, even though I was connected. I will be speaking with Professor Mukherjee next week to see if he can help.

The team decided to take a small break for Carnival, but we will get back to working on integration and the machine learning model next week. We created a state diagram of how our gesture detection algorithm should behave once it knows how many fingers are present. Once the gesture is determined, we will use metrics such as the signal's minimum or maximum to decide how much of the gesture has been performed. We have the foundations for both steps, but we have more information on the latter than the former. A rough code sketch of this state machine appears below the diagram.

State diagram of the gesture detection algorithm. Swipes correspond to the finger detection algorithm returning 1 finger, pinches to it returning 2, and none to 0 or an undetermined count.
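To make the diagram concrete, here is a minimal sketch of the state mapping in Python. The state names and the `fingers_detected` input are assumptions for illustration, not our final code.

```python
from enum import Enum, auto

class Gesture(Enum):
    NONE = auto()   # 0 fingers detected, or undetermined
    SWIPE = auto()  # 1 finger detected
    PINCH = auto()  # 2 fingers detected

def classify_gesture(fingers_detected):
    """Map the finger detection output to a gesture state:
    1 finger -> swipe, 2 fingers -> pinch, anything else -> none."""
    if fingers_detected == 1:
        return Gesture.SWIPE
    if fingers_detected == 2:
        return Gesture.PINCH
    return Gesture.NONE
```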

We are currently on schedule, but since it’s the final few weeks of building, I want to push myself to finish our project by the end of next week and begin final testing. We are hoping to finish the gesture detection algorithm by Tuesday so that we can make incremental improvements for the remainder of our Capstone time.

Edward’s Status Report for April 10

This week, we demoed our prototype to the instructor and TA. There was a small soldering mistake that I fixed before our demo, so our hardware should work reliably from now on. One issue we found in our setup is connectivity: WiFi in the lab is pretty spotty, and sometimes there are pauses while data is being streamed. We hope this will not happen much in the UC when we demo, but I am not sure it will go away. I plan to talk to Tamal about this in the coming week.

Other than the demo, we also worked on hashing out a basic plan for the gesture detection and did further integration with Joanne’s Webapp.

We did not do much coding this week due to Carnival, but next week I will help Anushka code up our new gesture detection algorithm. We plan to use a new approach: only after a new gesture has been detected n times in a row do we declare it the current gesture.
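Here is a minimal sketch of that debouncing idea, assuming the detector emits one gesture label per frame; the threshold value and class name are placeholders.

```python
class GestureDebouncer:
    """Switch gestures only after the same new gesture is seen n times in a row."""

    def __init__(self, n=5):  # n is a placeholder threshold we still need to tune
        self.n = n
        self.current = None    # the gesture we currently report
        self.candidate = None  # a new gesture we are considering
        self.count = 0         # consecutive sightings of the candidate

    def update(self, detected):
        if detected == self.current:
            # Same gesture as before; discard any pending candidate.
            self.candidate, self.count = None, 0
        elif detected == self.candidate:
            self.count += 1
            if self.count >= self.n:
                self.current = detected
                self.candidate, self.count = None, 0
        else:
            # A brand-new gesture; start counting it.
            self.candidate, self.count = detected, 1
        return self.current
```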

We also decided to ditch the Jetson Nano due to its poor WiFi performance. Tamal said this should not be an issue, but I may chat with him about it in the coming week.

Possibly tomorrow, we also plan to attach our hardware to a wristband we bought. This will be done using tape or sewing.

We are on track if we work on and refine the gesture detection for the next two weeks. This is the toughest part of this project and it needs to work well for the project to function.

Interim Demo Schedule

Here is a copy of our most recent schedule:

It can also be found here.

Major tasks that are remaining include:

  • Improving the finger and gesture detection algorithms using the machine learning models we’ve been working with over the past few weeks,
  • Providing the ability to upload 3D models from the web application,
  • Testing our sensors once they are mounted on the wristband,
  • Cutting the final version of the hologram, and
  • Collecting latency data.

Joanne’s Status Report for 4/2/2022

This week I focused on drafting the web application, adding the UI and other features we need for the project. I added a login/register feature for users and am in the midst of creating a preview of the 3D model.

I refined the code for the 3D model rotation. Previously, noise in the data caused the 3D model to spike in the wrong direction, so I filtered the noise out by ignoring the data points that caused those spikes. The 3D model now rotates more smoothly. I also tested the zoom feature using live data from the sensors. Right now it zooms at a constant pace; I am working on a fix so that the zoom instead matches the speed of the user's input gesture. This week I will focus on refining these algorithms so the translations happen smoothly even when there is a lot of noise.
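The spike rejection boils down to dropping any reading that jumps too far from the last accepted one. A minimal sketch, where the threshold is a made-up value rather than the one in our actual Webapp code:

```python
MAX_JUMP = 30.0  # max plausible change per reading; placeholder, tuned by eye

def reject_spikes(samples, max_jump=MAX_JUMP):
    """Keep a reading only if it is within max_jump of the last accepted
    reading; spiky outliers are treated as noise and skipped."""
    accepted = []
    last = None
    for value in samples:
        if last is None or abs(value - last) <= max_jump:
            accepted.append(value)
            last = value
    return accepted
```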

I also started helping out with the gesture detection part of the project. We had team meetings to discuss how we can improve the algorithm to increase finger detection accuracy. We concluded that we should use ML and are currently testing the new algorithm, as well as integrating all of our components.

Edward’s Status Report for April 2

This week, I did a lot of group work.

I worked with Anushka to set up the Jetson Nano. We had some trouble setting up WiFi, but eventually figured out how to connect to CMU-DEVICE. I plan to set up the MQTT broker on the Jetson in order to improve WiFi latency. Since the broker is closer to the wearable and on the same network, it should be much faster in theory. This weekend, I plan to test this out and see if there is any noticeable speedup in data transmission.
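One way to check for that speedup is a simple round-trip latency test. A minimal sketch, assuming paho-mqtt on the client and a broker already running on the Jetson; the hostname and topic here are placeholders.

```python
import time
import paho.mqtt.client as mqtt

BROKER_HOST = "jetson.local"     # placeholder address for the Jetson
TOPIC = "wearable/latency-test"  # placeholder topic

def on_message(client, userdata, msg):
    # The payload is the send timestamp; the difference is the round trip.
    sent = float(msg.payload)
    print(f"round trip: {(time.time() - sent) * 1000:.1f} ms")

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER_HOST, 1883)
client.subscribe(TOPIC)
client.loop_start()

for _ in range(10):
    client.publish(TOPIC, str(time.time()))
    time.sleep(0.5)

client.loop_stop()
```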

We decided on training a machine learning model to classify the number of fingers on the forearm, since fitting a curve and finding the min and max to classify swipes and pinches was too complicated. So, I worked with Anushka and Joanne to create an SVM classifier for the number of fingers. I wrote a training script, trained it on data we had collected a while ago, and got about 95% accuracy in cross-validation! It seems to perform well on live data, but it isn't perfect. SVM seems to work well for this task, so we collected some more data and are training on it to see if we can get any improvements. The next steps are to improve finger detection given that we know the approximate number of fingers, and then to improve gesture detection given that we know the approximate locations of the fingers. I am not too confident that finger detection will be as good as we want, so we may have to deal with that this week when we fully implement it. Maybe we need to apply some filtering technique or throw out bad detections.
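For reference, the training setup is roughly the following. This is a sketch rather than our actual script: the dataset file name and preprocessing are placeholders, and it assumes each sample is already flattened into one row of features with a finger-count label.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Placeholder dataset: one flattened sensor window per row, label in last column.
data = np.loadtxt("finger_data.csv", delimiter=",")
X, y = data[:, :-1], data[:, -1]

clf = SVC(kernel="rbf", C=1.0)

# 5-fold cross-validation; this is where we saw roughly 95% accuracy.
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validation accuracy: {scores.mean():.2%}")

clf.fit(X, y)  # fit on everything for live classification
```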

I also further integrated the wearable with Joanne's Webapp. The improved finger detection powered by the SVM seemed to be better: fewer swipes are classified as pinches, which means the Webapp consistently rotates the model.

I am on track for this week, but things are piling up…

Team Status Report for 4/2

This week was a better week for the team. We were able to tackle many problems regarding our capstone.

Our biggest concern was the Jetson. After reading online tutorials on using the Edimax adapter, we finally connected the Jetson to WiFi. Next week, we'll be focusing on connecting the sensors to the Jetson and the Jetson to the Unity web application so that we can accomplish our MVP and start formally testing it.

Since we have time now, we went back to our gesture algorithm and sought to improve its accuracy. We explored different machine learning models and decided to see if a Support Vector Machine (SVM) would be effective. On the data that we had already collected, the model performs well (95% accuracy on cross-validation). However, we know that data does not cover all the different locations and speeds of the gestures, so we collected more and plan to train the model on it over the next week. Training takes a lot of computational power, so we will try either a Jetson or a Shark machine. AWS was also suggested to us, and we might look into that as well.

The biggest risk now is latency. We want to make sure that everything stays fast with the addition of the Jetson. We have a Raspberry Pi as a backup, but we're hoping the Jetson works out, so we haven't made any changes to our implementation. We also went ahead and ordered the band to mount the sensors on. We plan on collecting more data and training on it next week, once we find an effective way to train the model.

We should have a working(ish) demo for the interim demo on Monday. We seem to be on track, but we will need to further improve our detection algorithms and our approach to have a final, usable product.

Anushka’s Status Report for 4/2

This week, we worked more on the gesture recognition algorithm. We figured it would be best to go back to basics: first pin down a more concrete finger detection algorithm, then develop the gesture algorithm on top of that.

We have currently abandoned the finger detection algorithm in favor of tracking either the mins or the maxes in the signal, determining the relationship between them, and giving a metric to Unity to perform a translation. However, this metric is highly inaccurate. Edward suggested using SVMs for finger detection. The difference between pinches and swipes is the number of fingers present, so we can use our existing data to train the model to tell us which sensor readings represent one, two, or no fingers.
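For context, the min/max metric we are moving away from looks roughly like the sketch below; the windowing and the exact metric handed to Unity are illustrative assumptions, not our real code.

```python
import numpy as np
from scipy.signal import find_peaks

def gesture_metric(window):
    """Find the local maxima and minima in a window of sensor readings
    and use their spread as a crude 'how much gesture' metric."""
    window = np.asarray(window)
    maxima, _ = find_peaks(window)    # indices of local maxima
    minima, _ = find_peaks(-window)   # indices of local minima
    if len(maxima) == 0 or len(minima) == 0:
        return 0.0  # no clear peaks, nothing to report to Unity
    return float(window[maxima].max() - window[minima].min())
```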

I added some new, more comprehensive data to the existing data set. I also added some noise so that the model would know what to classify when there are a lot of distractions.
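The noise augmentation amounts to appending noised copies of the existing samples. A minimal sketch, assuming each row of X is one training sample; the noise level is a placeholder.

```python
import numpy as np

def augment_with_noise(X, sigma=0.05, copies=1, seed=0):
    """Append Gaussian-noised copies of the samples so the model also
    sees distracted, cluttered readings during training."""
    rng = np.random.default_rng(seed)
    noisy = [X + rng.normal(0.0, sigma, X.shape) for _ in range(copies)]
    return np.vstack([X, *noisy])
```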

Afterwards, I trained the model on different subsets of the new data combined with the old data. The reason is that training on the new data takes a long time: it took over 4 hours and a lot of power to train on just 200 lines of new data.

Next week, I'm going to train the model on the Jetson with all the data. Jetsons are known for their machine learning capabilities, so using one may make our computation go faster. We managed to add WiFi to the Jetson this week, so it'll be easy to download our existing code from GitHub. I am concerned about training the model with the new data: with only the old data we have 95% accuracy, but hopefully with the new data we'll be prepared for wilder circumstances.

I think I’m personally back on schedule. I want to revisit the hologram soon so that we can complete integration next week. I’m excited to hear feedback from interim demo day since we have time for improvement and will likely add or improve parts of our project after Monday.