Team Status Report for 4/16/2022

This week, we attached the board and the sensors to a wristband we bought. We collected new data with the board on our arm and retrained our number-of-finger detection model with this new data. The sensors need to be elevated and slanted upward so that they don't mistake the skin on the arm for a finger.

We are having a bit of trouble detecting swipes and pinches reliably, so we have been discussing removing pinches entirely from our implementation. Our SVM model for classifying the number of fingers confuses swipes with pinches, which causes a lot of problems for gesture detection. We created a demo of a "swipes-only" system, and it seems to work much better, since it only has to detect whether there is "none" or "one finger." Anushka and Edward (mostly Anushka) have been trying out various new techniques to detect fingers and/or gestures, but we are having trouble finding an optimal one. Early next week, we will discuss our future plans for finger detection.

Anushka and Joanne have cut new acrylic sheets for the hologram and will work on creating an enclosure this week.

We are a bit behind schedule this week as we work on refining our "swipes-only" system. We may create two modes for our system: one that uses the swipes-only model so that we are guaranteed smooth performance, and one that has both swipes and pinches, which isn't as smooth but offers more functionality. We are also considering changing the gesture for zooming while still using our two-finger model, but we have to decide by early next week whether and how we should implement this.

We've decided to scrap the Jetson, since the latency of communication between the sensors and the Jetson was much higher than the latency between the sensors and our laptops. We were looking forward to using the Jetson for machine learning, but the network issues we've been having, along with the fact that we can train the models on our own laptops, led us to this conclusion. We have learned a lot from using it, and we hope that in the future we can track down the underlying issue.

Joanne’s Status Report for 4/16/2022

This week we worked together a lot on testing our integrated product and fixing individual bugs from each of our parts. We mounted our board onto the wristband we bought and have been testing our algorithm on new data that reflects user input taken from the arm. Right now we see a lot of noise in the data that causes the model to move erratically. I am working on smoothing out the noise on my end, while Edward and Anushka work on filtering the data on theirs.

I have created a new rotation algorithm that makes the model rotate at a much smoother pace. When tested on ideal data (data with no noise), it moves at a smooth, consistent rate relative to how fast the user swiped. Previously, I only had a rough rotation algorithm where the model moved based on the finger distances given to me. Now I take into account the timestamps of when the data was taken, so I can approximate the user's swipe speed. This change currently applies only to rotations limited to the X axis.
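To give a rough idea of the approach (the function names and scale factor below are illustrative, not our exact code), the swipe speed comes from dividing the change in finger position by the change in timestamp, and the render loop then rotates the model by an angle proportional to that speed each frame:

```python
# Illustrative sketch only: estimate swipe speed from two consecutive sensor
# readings, then turn it into a per-frame rotation so the model turns faster
# when the user swipes faster.
def swipe_speed(prev_x, prev_t_ms, cur_x, cur_t_ms):
    """Finger speed in sensor units per second."""
    dt = (cur_t_ms - prev_t_ms) / 1000.0   # seconds between samples
    if dt <= 0:
        return 0.0
    return (cur_x - prev_x) / dt

def frame_rotation(speed, frame_dt, degrees_per_unit=90.0):
    """Angle (degrees) to rotate the model this frame, proportional to speed."""
    return speed * degrees_per_unit * frame_dt
```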

Due to some problems in gesture detection on the sensor side, we are now planning to get rid of pinches, since the pinching and swiping motions confuse our ML model. To add functionality back, we are thinking of implementing rotation in all degrees of freedom, and I have added that in. However, the finger locations do not translate intuitively to the model rotation (it rotates in the right direction but not by the right angles), so I am working on making the rotation look more like the 3D rotation we would expect from the user's swipe. We have been talking as a group about what other functionalities we can include with what we have now and in the time frame left. Some ideas are detecting taps or creating a new zoom scheme that does not involve the traditional pinching motion.
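As a sketch of what the full-rotation mapping might look like (the gains are placeholders, and tuning them is exactly the problem described above), a swipe displacement would be split into a yaw and a pitch angle:

```python
# Illustrative sketch: map a 2-D swipe displacement to yaw/pitch angles.
# gain_x and gain_y are placeholder tuning knobs, not values from our code.
def swipe_to_rotation(dx, dy, gain_x=1.0, gain_y=1.0):
    """Return (yaw, pitch) in radians for a swipe displacement (dx, dy)."""
    yaw = gain_x * dx      # horizontal motion spins the model left/right
    pitch = gain_y * dy    # vertical motion tilts it up/down
    return yaw, pitch
```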

I am also working on new ways of getting rid of the unexpected rotation spikes caused by noise. I graphed the data points and decided to try averaging the data points in each swipe and using that average as the finger location, to reduce the effect of noise in the data. I will test that implementation this week.
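A minimal sketch of the averaging idea, assuming the samples for one swipe are collected as (x, y) pairs (the names are illustrative):

```python
# Rough sketch: collapse each swipe's samples into a single mean position
# so isolated noisy readings have less influence on the rotation.
def average_finger_position(samples):
    """samples: list of (x, y) finger locations recorded during one swipe."""
    if not samples:
        return None
    xs = [x for x, _ in samples]
    ys = [y for _, y in samples]
    return sum(xs) / len(xs), sum(ys) / len(ys)
```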

Anushka and I have also recut the hologram plexiglass to fit the dimensions of the display we are going to use for the presentation. We are planning to create the enclosure (for light blocking) this week.

Edward’s Status Report for April 16

This week, I continued working on the finger and gesture detection and further integrated the wearable with the Webapp. Our data is extremely noisy, so I have been working on smoothing out the noise.

Originally, the Photon was unable to send the timestamp at which it collected all 10 sensor values, since the RTC on the Photon does not have millisecond accuracy. However, I found a way to get an approximate time of sensor data collection by summing up elapsed time in milliseconds. This made the finger detection data sent to the Webapp appear smoother.
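The firmware itself runs on the Photon, so the following is only a Python illustration of the bookkeeping (names are made up): take one wall-clock reference at startup and keep adding the elapsed milliseconds from each loop iteration to stamp the readings:

```python
import time

# Illustration of the idea: the RTC only gives second-level accuracy, so we
# pin one wall-clock reference once and add accumulated elapsed milliseconds
# to approximate when each batch of sensor values was collected.
class ApproxClock:
    def __init__(self):
        self.base_ms = int(time.time() * 1000)  # one-time reference
        self.elapsed_ms = 0                     # accumulated since reference

    def stamp(self, loop_elapsed_ms):
        """Add this loop iteration's elapsed time and return the timestamp."""
        self.elapsed_ms += loop_elapsed_ms
        return self.base_ms + self.elapsed_ms
```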

I am not super confident in our SVM implementation of finger detection, so I want to remove pinches entirely in order to reduce noise. Two fingers that are close together often get confused with one finger, and vice versa; because of this, many swipes get classified as pinches, which throws everything off. So, I worked on an implementation without pinches, and it seems to work much better. As a group, we are deciding whether or not this is the direction to go. If we continue with only detecting swipes, we will need full 3D rotation rather than rotation only about the y-axis.
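A minimal sketch of the swipes-only variant (label names are made up for illustration): drop the two-finger examples so the classifier only has to separate "none" from "one finger":

```python
# Sketch of the "swipes-only" idea: collapse the training set to just
# "none" vs. "one" so pinches never enter the model at all.
def to_swipes_only(samples, labels):
    kept_samples, kept_labels = [], []
    for x, y in zip(samples, labels):
        if y in ("none", "one"):        # drop two-finger (pinch) examples
            kept_samples.append(x)
            kept_labels.append(y)
    return kept_samples, kept_labels
```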

This week, we also attached the board to a wristband we ordered and have been working off of that. The sensors need to be elevated slightly, and I am not sure how annoying that will be to a user. The elevation also means users can't just swipe on their skin casually; they need to deliberately place their finger so that the sensors can catch it.

This week, I will work on trying to filter and reduce noise in our swipes-only implementation, which I think we may stick with for the final project. This will be a tough effort.

I am a bit behind schedule as the project demo comes closer. Detected finger locations are a bit shaky, and I'm not sure if I can figure out a way to clean them up. I will work with Joanne to come up with algorithms to reduce noise, but I'm not too confident we can make a really good one. Removing pinches limits the functionality of our device, and I'm not sure it's even "impressive" with just swipes. We will need to either explore more gestures or broaden our use case to make up for this.

Team Status Report for 4/10/2022

This week was demo week, so we worked as a group on integrating our individual setups and testing our prototype. Before the demo we had a small hardware problem that turned out to be a soldering mistake; thankfully, Edward fixed it before our demo. We also worked on creating a better plan for our gesture detection algorithm. The new finger detection algorithm using the ML approach provided better accuracy than our previous approach, so we are hoping that the new gesture detection algorithm we planned out will also provide better results.

We realized that our data was lagging between the hardware and the Webapp due to spotty WiFi in the lab. Professor Tamal mentioned during the demo that he could help us out with that, so we are planning to talk to him soon. Other than that, we also planned out a list of things to add to the web application, such as battery life, battery status, and finger location. We are also planning to implement a new strategy for translating the data we get from gesture detection into smoother model transformations. Because of the large amount of data and noise, the transformations need to account for that, so we have planned out a new approach and plan to implement it this week.

We are planning to mount our board onto the wristband we ordered soon (probably tomorrow). We got feedback from our demo session that we should see as soon as possible how live data taken from our arm translates to movement in our model, so we are planning to test that as a group early in the week.

We think we are on track and are currently refining the individual parts of our project.

Joanne’s Status Report for 4/10/2022

This week was demo week, so we focused on integrating the parts of our project. We conducted more testing on how the live sensor data translates to our model transformations (rotate and zoom). We tried to mimic as many gestures from different angles as we could and observed whether or not they performed the correct functionality on the model. We found there was slight lag between the user input and the model transformations because the data was being sent over spotty WiFi in the lab. Professor Tamal mentioned that he could help us out with the connectivity issue, so we are planning to talk to him about that this week as well.

Other than demo work, I also started taking in more info from the hardware to display things such as battery life and on/off status. Once I get the information via MQTT from the hardware, it updates itself on the webapp. This week I am also planning to create a visual layout of where the finger is in relation to our sensor layout.
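Roughly, the webapp side looks like the sketch below (the topic names and broker address are placeholders, not our actual configuration): subscribe to the status topics over MQTT and update the display state whenever a message arrives:

```python
import paho.mqtt.client as mqtt

# Hedged sketch of the status-update flow; topics and broker are made up.
status = {"battery": None, "power": None}

def on_message(client, userdata, msg):
    # e.g. topic "wearable/battery" -> key "battery"
    status[msg.topic.split("/")[-1]] = msg.payload.decode()

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.com", 1883)
client.subscribe("wearable/battery")
client.subscribe("wearable/power")
client.loop_start()   # listen in the background while the webapp runs
```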

I thought of a new algorithm to help translate the data into smoother transformations. I could not turn the new approach into code yet due to Carnival week, but I hope it will help the model move more smoothly even in the presence of noise, and at a rate more consistent with how fast the user is swiping or pinching.

I believe we are on track. As a group this week, we are also planning to mount our hardware onto a wristband so that we can see what actual data taken from our arm looks like.


Anushka’s Status Report for 4/10

This week, we had our interim demos. I worked on updating our schedule, which can be seen in our Interim Demo post. I was also in charge of making sure that we had all the parts together for the demo.

The night before demos, I attempted to work with the Jetson Nano again. I was able to connect to the Internet, but I had a tough time downloading the Python packages needed to train the machine learning model. I thought it might be related to location, so I tested the connection in the UC with Edward. However, I still wasn't able to download packages, even though I was connected. I will be speaking with Professor Mukherjee next week to see if he is able to help.

The team decided to take a small break for Carnival, but we will get back to working on integration and the machine learning model next week. We created a state diagram of how our gesture detection algorithm should work after detecting how many fingers are present. Once the gesture is determined, we will use metrics such as minimum or maximum to decide how much of the gesture is being performed. We have the foundations for both, but we have more information on the latter than the former.

State Diagram of Gesture Detection Algorithm. Swipes refer to the finger detection algorithm returning 1 finger, pinches returning 2, and none returning 0 or undetermined
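As a rough illustration of what the diagram describes (the function names and the max-minus-min extent metric are illustrative choices, not settled code), the finger count picks the gesture, and the spread of finger positions during the gesture approximates how much of it was performed:

```python
# Illustrative sketch of the planned gesture labeling and extent metric.
def label_gesture(finger_count):
    """Map detected finger count to a gesture label."""
    return {1: "swipe", 2: "pinch"}.get(finger_count, "none")

def gesture_extent(positions):
    """positions: finger locations recorded while the gesture was active."""
    if not positions:
        return 0.0
    return max(positions) - min(positions)   # larger spread = bigger gesture
```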

We are currently on schedule, but since it’s the final few weeks of building, I want to push myself to finish our project by the end of next week and begin final testing. We are hoping to finish the gesture detection algorithm by Tuesday so that we can make incremental improvements for the remainder of our Capstone time.

Edward’s Status Report for April 10

This week, we demoed our prototype to the instructor and TA. There was a small soldering mistake that I fixed before our demo, and our hardware should work reliably from now on. An issue we found in our setup is connectivity: WiFi in the lab is pretty spotty, and sometimes there are pauses while data is being streamed. We hope this will not happen much in the UC when we demo, but I am not sure it will go away. I plan to talk to Tamal about this in the coming week.

Other than the demo, we also worked on hashing out a basic plan for the gesture detection and did further integration with Joanne’s Webapp.

We did not do much coding this week due to Carnival, but next week I will help Anushka code up our new gesture detection algorithm. We plan to take a new approach where, if a new gesture is detected n times in a row, we declare that the current gesture is that new gesture.
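A small sketch of that debouncing rule (the class name and default n are illustrative):

```python
# Only switch the reported gesture after the classifier outputs the same
# new label n times in a row; otherwise keep the current gesture.
class GestureDebouncer:
    def __init__(self, n=5):
        self.n = n
        self.current = "none"
        self.candidate = None
        self.count = 0

    def update(self, detected):
        """Feed one per-frame classification; return the debounced gesture."""
        if detected == self.current:
            self.candidate, self.count = None, 0
        elif detected == self.candidate:
            self.count += 1
            if self.count >= self.n:
                self.current = detected
                self.candidate, self.count = None, 0
        else:
            self.candidate, self.count = detected, 1
        return self.current
```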

We also decided to ditch the Jetson Nano due to poor WiFi performance, but Tamal said this should not be an issue, so I may chat with him about this in the coming week.

Possibly tomorrow, we also plan to attach our hardware to a wristband we bought. This will be done using tape or sewing.

We are on track if we work on and refine the gesture detection for the next two weeks. This is the toughest part of this project and it needs to work well for the project to function.

Interim Demo Schedule

Here is a copy of our most recent schedule:

It can also be found here.

Major tasks that are remaining include:

  • Improving the finger and gesture detection algorithms using the machine learning models we’ve been working with over the past few weeks,
  • Providing the ability to upload 3D models from the web application,
  • Testing our sensors once mounted on the wristband,
  • Cutting the final version of the hologram, and
  • Collecting latency data.

Joanne’s Status Report for 4/2/2022

This week I focused on drafting the web application by adding the UI and other features we need for the project. I added a login/register feature for users and am in the midst of creating a preview of the 3D model.

I refined the code for the 3D model rotation. Previously, our problem was that noise in the data caused the 3D model to spike in the wrong direction. I filtered out that noise by ignoring the data points that caused the spikes, which made the 3D model rotate more smoothly. I also tested the zoom-in feature using live data from the sensors. Right now the zoom happens at a constant pace; I am working on a fix so that it zooms at a pace that matches the speed of the user's gesture. This week I will focus on refining these algorithms so the transformations happen more smoothly even when there is a lot of noise.
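A rough sketch of the spike filter, assuming the readings arrive as an ordered sequence of finger positions (the threshold value is an arbitrary placeholder):

```python
# Ignore a reading if it jumps implausibly far from the previous accepted
# one; the max_jump threshold is a tuning knob, not a measured value.
def reject_spikes(points, max_jump=30.0):
    """points: ordered finger positions; returns the filtered sequence."""
    filtered = []
    for p in points:
        if filtered and abs(p - filtered[-1]) > max_jump:
            continue                  # treat as noise and drop it
        filtered.append(p)
    return filtered
```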

I also started helping out with the gesture detection part of the project. We had team meetings to discuss how we can improve the algorithm to increase accuracy in finger detection. We concluded that we should use ML, and we are in the midst of testing the new algorithm and integrating all of our components.

Edward’s Status Report for April 2

This week, I did a lot of group work.

I worked with Anushka to set up the Jetson Nano. We had some trouble setting up WiFi, but eventually figured out how to connect to CMU-DEVICE. I plan to set up the MQTT broker on the Jetson in order to improve latency: since the broker would be closer to the wearable and on the same network, it should be much faster in theory. This weekend, I plan to test this out and see if there is any noticeable speedup in data transmission.

We decided on training a machine learning model to classify the number of fingers on the forearm, since fitting a curve and finding the min and max to classify swipes and pinches was too complicated. So, I worked with Anushka and Joanne to create an SVM classifier for the number of fingers. I wrote a training script, trained it on data we had collected a while ago, and got about 95% accuracy in cross-validation! It seems to perform well on live data, but it isn't perfect. SVM seems to work well for this task, so we collected some more data and are retraining to see if we can get any improvements. The next steps are to improve finger detection given that we know the approximate number of fingers, and then to improve gesture detection given that we know the approximate locations of the fingers. I am not too confident that finger detection will be as good as we want, so we may have to deal with that this week when we fully implement it. Maybe we need to apply some filtering technique or throw out bad detections.
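The training script is roughly in the spirit of the following sketch (the file name, feature layout, and kernel choice are assumptions for illustration, not our exact script), using scikit-learn's SVC with 5-fold cross-validation:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Assumed format: each row is 10 sensor readings followed by a finger-count label.
data = np.loadtxt("finger_data.csv", delimiter=",")
X, y = data[:, :-1], data[:, -1]

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, cv=5)          # 5-fold cross-validation
print("CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))

model.fit(X, y)                                       # final model on all the data
```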

I also further integrated the wearable with Joanne's Webapp. The improved finger detection powered by the SVM seems better: fewer swipes are classified as pinches, which means the Webapp rotates the model more consistently.

I am on track for this week, but things are piling up…