Anushka’s Status Report for 4/30

This week, I mainly focused on different hologram designs. We have three options right now:

  1. The angles in the pyramid are 53-53-127-127, and the dimensions are for a hologram that fits an iPad.
  2. The angles in the pyramid are 53-53-127-127, and the dimensions are for a hologram that fits a monitor.
  3. The angles in the pyramid are 60-60-120-120, and the dimensions are for a hologram that fits an iPad.

So far, we have tried all three, and the last one works the best, but only in total darkness. Next week, we’re going to test new angles before demo day, then pick the one with the best visual effect. I’ve also been working on the final sewing of the device onto the wristband. We adjusted the height of the sensors since our hand was interfering with the input. I have also been working on the poster for our final demo.

This is the final official status report of the semester. We have all the parts completed sufficiently, but we know there are always areas for improvement. We’re going to make minor changes throughout the week before demo day, but otherwise, we have a completed product! I am most concerned about how the demo will go overall. We have tested our product amongst ourselves, but we are aiming to test it on several participants before demo day and the final paper. We are excited to show everyone our progress and what we’ve learned!

Anushka’s Status Report for 4/23

This week was a tough week as I got COVID and had to participate in everything remotely. Since I was incapacitated, I mainly worked on the final presentation.

The main part of the presentation I’m working on is the tradeoffs. We made a lot of algorithmic tradeoffs, so I had to revisit all the old models: the pre-ML model, the SVM model with the old arm data, and the time series classification models. Other things I wanted to mention are the usage of the Jetson and the different hologram angles. I designed new hologram sides with slots and holes so that we don’t have to glue the sides together and the sides are more secure.

Since we have to finish the presentation tomorrow, I am still on track to finish this week’s goals. In the next week, I hope to be able to return in person, work on the hologram display, and reattach Tony to the exercise band.

Anushka’s Status Report for 4/16

This week was a grind week for the team. We tried many different algorithms for determining what gesture is occurring. There are two things I tried this week:

  1. We queue 5 data points of how many fingers are currently being detected. From there, we determine what gesture is occurring. If at least 4 of the readings agree on the number of fingers, then the gesture is classified as such (0 for noise/rest, 1 for swipes, and 2 for pinches). If the number of fingers goes from 1 to 2, then the gesture is a pinch out, and if the reverse occurs, then the gesture is a pinch in. I set up a few cases that took care of the noise, but ultimately, this still caused a lot of errors. The frequency of data collection is too low; a gesture can span only 2-10 data points, so the number of fingers detected before or after the gesture can skew the window of 5 and yield an inaccurate result. A sketch of this windowing approach is shown after this list.

  2. I tried using a time series classification library called sktime: https://sktime.org/. This library is really cool because I can give the model a set of inputs over time across different classifications, and the model can predict what class a time series belongs to. Splitting the new arm data into training and testing sets, I was able to create a model with 92% accuracy, and this model is able to distinguish between pinch in, pinch out, rest, swipe left, and swipe right. However, this model needs 24 data points in advance, and as discussed above, there aren’t that many data points associated with a gesture, so we would have to predict the gesture a considerable amount of time before it is fully performed. A rough sketch of this flow is also shown below.
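As a minimal sketch of the windowing idea in item 1, assuming finger counts arrive one reading at a time; the label strings and thresholds mirror the description above, but the code itself is hypothetical:

```python
from collections import Counter, deque

WINDOW = 5     # readings kept in the queue
THRESHOLD = 4  # agreeing readings needed to commit to a class

window = deque(maxlen=WINDOW)

def classify(finger_count):
    """Feed one finger-count reading (0, 1, or 2); return a label or None."""
    window.append(finger_count)
    if len(window) < WINDOW:
        return None
    fingers, occurrences = Counter(window).most_common(1)[0]
    if occurrences >= THRESHOLD:
        return {0: "noise/rest", 1: "swipe", 2: "pinch"}[fingers]
    # A 1 -> 2 transition across the window reads as a pinch out; 2 -> 1 as a pinch in.
    if window[0] == 1 and window[-1] == 2:
        return "pinch out"
    if window[0] == 2 and window[-1] == 1:
        return "pinch in"
    return None  # too noisy to decide
```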
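And a minimal sketch of the sktime flow in item 2, assuming the gesture windows are 24 samples long and stored as a 3D numpy array (windows × sensor channels × samples); RocketClassifier is just one sktime classifier that accepts this shape, not necessarily the one we used:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sktime.classification.kernel_based import RocketClassifier

LABELS = ["pinch in", "pinch out", "rest", "swipe left", "swipe right"]

# Placeholder data: 100 recorded windows, 4 sensor channels, 24 samples each.
X = np.random.rand(100, 4, 24)
y = np.random.choice(LABELS, size=100)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)

clf = RocketClassifier()
clf.fit(X_train, y_train)
print("accuracy:", np.mean(clf.predict(X_test) == y_test))
```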

As a team, we’ve decided to work on different implementations until Sunday, then make a final decision on what algorithm we should select. It comes down to a tradeoff between accuracy and latency, both important metrics for our device.

I also helped Joanne build a bigger hologram pyramid this weekend. We decided to hold off on the casing until we determine the final height of the pyramid, which we’ve also decided to explore a bit more.

Currently, I am a bit behind schedule, but whatever tasks remain in development and testing have to be accomplished this week, as our final presentation is due next week. Everything essentially has to be done, so I’m in a bit of a panic, but I have faith in Tony, Holly, and our team.

Anushka’s Status Report for 4/10

This week, we had our interim demos. I worked on updating our schedule, which can be seen in our Interim Demo post. I was also in charge of making sure that we had all the parts together for the demo.

The night before demos, I attempted to work with the Jetson Nano again. I was able to connect to the Internet, but I was having a tough time downloading Python packages in order to train the machine learning model. I thought it might have been related to location, so I tested the connection in the UC with Edward. However, I still wasn’t able to download packages, even though I was connected. I will be speaking with Professor Mukherjee next week to see if he is able to help.

The team decided to take a small break for Carnival, but we will get back to working on integration and the machine learning model next week. We created a state diagram of how our gesture detection algorithm should work after detecting how many fingers are present. Once the gesture is determined, we will use metrics such as the minimum or maximum to decide how much of the gesture has been performed. We have the foundations for both, but we have more information on the latter than the former.

State Diagram of the Gesture Detection Algorithm. Swipes refer to the finger detection algorithm returning 1 finger, pinches to it returning 2, and none to it returning 0 or undetermined
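As a rough illustration of the diagram above, here is a hypothetical transition table; the state names and transitions are assumptions based on the caption:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()      # 0 fingers or undetermined
    SWIPING = auto()   # finger detector returned 1
    PINCHING = auto()  # finger detector returned 2

# Assumed transitions: (current state, fingers detected) -> next state.
TRANSITIONS = {
    (State.IDLE, 1): State.SWIPING,
    (State.IDLE, 2): State.PINCHING,
    (State.SWIPING, 2): State.PINCHING,  # 1 -> 2 fingers
    (State.PINCHING, 1): State.SWIPING,  # 2 -> 1 fingers
}

def step(state, fingers):
    """Advance the state machine on one finger-count reading."""
    if fingers == 0:
        return State.IDLE
    return TRANSITIONS.get((state, fingers), state)
```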

We are currently on schedule, but since it’s the final few weeks of building, I want to push myself to finish our project by the end of next week and begin final testing. We are hoping to finish the gesture detection algorithm by Tuesday so that we can make incremental improvements for the remainder of our Capstone time.

Anushka’s Status Report for 4/2

This week, we worked more on the gesture recognition algorithm. We figured it would be best to go back to the basics and figure out a more concrete finger detection algorithm, then develop a gesture algorithm on top of that.

Currently, we have abandoned the finger detection algorithm in favor of tracking either the mins or maxes in the signal, determining the relationship between them, and giving a metric to Unity to perform a translation. However, this metric is highly inaccurate. Edward suggested using SVMs for finger detection. The difference between pinches and swipes is the number of fingers present, so we can use existing data to train the model so that it can tell us which sensor readings represent one, two, or no fingers.

I added some new, more comprehensive data to the existing data set. I also added some noise so that the model would know what to classify when there are too many distractions.

Afterwards, I trained the model using different subsets of the new data combined with the old data. The reason was that training on the new data took a lot of time and power: over 4 hours for 200 lines of new data.
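A minimal sketch of this kind of SVM training with scikit-learn; the file names, feature layout, and kernel choice are assumptions, not our exact setup:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Each row is one frame of distance readings from the sensor array;
# each label is the number of fingers present (0, 1, or 2).
X = np.loadtxt("sensor_frames.csv", delimiter=",")  # hypothetical file
y = np.loadtxt("finger_labels.csv", delimiter=",")  # hypothetical file

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

clf = SVC(kernel="rbf")  # RBF kernel is an assumption
clf.fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
```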

Next week, I’m going to train the model on the Jetson with all the data. Jetsons are known for their machine learning capabilities, so using one may make our computation go faster. We managed to add WiFi to the Jetson this week, so it’ll be easy to download our existing code from GitHub. I am concerned about training the model with the new data: with only the old data, we have 95% accuracy, but hopefully with the new data, we’ll be prepared for wilder circumstances.

I think I’m personally back on schedule. I want to revisit the hologram soon so that we can complete integration next week. I’m excited to hear feedback from interim demo day since we have time for improvement and will likely add or improve parts of our project after Monday.

Anushka’s Status Report for 3/26

This week was a cool week. I learned a lot from the ethics lesson on Wednesday. After the discussion with the other teams, I learned that the biggest concern for most engineering projects is data collection and data usage, which is something that may be lacking in our project explanation. I will keep this in mind for our final presentation.

I spent a lot of time improving the gesture recognition algorithm. From a single data collection, we are most likely not able to identify what gesture is being performed. I improved the algorithm by looking at the frequency of the gestures guessed over n data collections. The accuracy improved for every gesture except zoom out, which makes sense because the beginning of that gesture looks like a swipe.

We collected more data so that we can see if the algorithm fits different variations of the gesture. We noticed that there is high variability in the accuracy of our algorithm based on the speed and location with which we move our fingers. I decided to look into varying the n data collections and the order of the polynomial that the data is currently fitted to, to accommodate these two discoveries. I am working with the data in Excel and am planning to look at statistics to determine which combination yields the highest accuracy. A sketch of this sweep appears after the figure below.

Screenshot of pinch-in data for varying n against the order of the polynomial
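As a hypothetical sketch of that sweep (the real analysis was done in Excel), this tries window sizes n against polynomial orders and scores each fit by how well it predicts the reading just after the window; the file name and scoring are assumptions:

```python
import numpy as np

# One hypothetical series of sensor readings for a pinch-in gesture.
readings = np.loadtxt("pinch_in_readings.csv", delimiter=",")

best = None
for n in range(4, 11):                     # window sizes to try
    for order in range(1, min(n - 1, 5)):  # polynomial orders to try
        xs = np.arange(n)
        # Fit each window of n readings, then score by the error of the
        # fitted polynomial on the held-out point right after the window.
        errors = [
            abs(np.polyval(np.polyfit(xs, readings[i : i + n], order), n)
                - readings[i + n])
            for i in range(len(readings) - n)
        ]
        score = np.mean(errors)
        if best is None or score < best[0]:
            best = (score, n, order)

print("best (error, n, order):", best)
```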

Although this was a slow week, I’m going to be working hard to improve the algorithm before interim demo day. Even though our project is in the process of being integrated, this is a critical part of achieving our design metrics. I’m planning on meeting with my team outside of class on Monday and Tuesday so that we can work together to improve the algorithms.

Apart from algorithm updates, I need to talk to a professor about the Jetson. I’ve started playing with the Raspberry Pi, which is easier to work with since I have prior experience. I will spend Wednesday working on this part. Next week will surely be busy, but these deliverables are critical to the success of the interim demo.

Anushka’s Status Report for 3/19

This week was a productive week for our capstone project. Over Spring Break, I began building the gesture recognition algorithms. That week, we generated some sample data from a few gestures, so I observed the general behavior of the data points. I noticed that on an x-y plane, if the y-axis is where the sensors are lined up and the x-axis is the distance that the sensors are measuring, then a finger creates a sideways U-shape, as seen in the picture below. If two fingers are present, then the U-shapes intersect to create a sideways W-shape.

Figure 1: Example of Swiping Data

I began looking at which properties of this graph stayed consistent. With both shapes, I wanted to focus on the local minima of the curves as a method of finger identification. However, the local minima would sometimes be inaccurately calculated, or the values would be arranged in an increasing or decreasing manner such that no local minimum was detected. This was especially the case when two fingers were present. However, Edward pointed out that even though a local minimum is hard to detect with two fingers present, a local maximum is always present where the two Us intersect. An example is shown in the image below.

Figure 2: Example of Pinching Data

After observing a few examples of pinching data, I reasoned that the velocity of that maximum could also serve as an indicator of what direction the user is pinching and how fast their fingers are moving. If no local maxima are present, then we can guess that the gesture is a swipe, calculate the local minimum’s velocity, and determine what direction the finger is moving and how fast the user is swiping. We have yet to code the specifics of each gesture, but we are confident about this method of identification and will most likely move forward with it.
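A minimal sketch of that minima/maxima reasoning using scipy’s find_peaks; the sample frame and the decision thresholds are illustrative assumptions:

```python
import numpy as np
from scipy.signal import find_peaks

# One hypothetical frame of distance readings across the sensor line.
frame = np.array([30.0, 22.0, 15.0, 21.0, 18.0, 14.0, 20.0, 29.0])

valleys, _ = find_peaks(-frame)  # local minima: the fingers' closest points
peaks, _ = find_peaks(frame)     # a local maximum appears where two Us cross

if len(valleys) >= 2 and len(peaks) >= 1:
    print("two fingers (pinch); Us intersect near sensor", peaks[0])
elif len(valleys) == 1:
    print("one finger (swipe), centered at sensor", valleys[0])
else:
    print("no fingers detected")
```

Tracking how the indices in valleys or peaks move between frames would give the velocity and direction described above.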

I also spoke with Professor Savvides about machine learning methods that could serve as viable solutions. He suggested using Markov methods, but those were something I was unfamiliar with. Professor Sullivan also recommended curve fitting, which is why you see a few polynomials fitted to the graphs above. We are going to look into that method over the next week, but at least we have a starting point, as opposed to before.

Because of the work put into gesture recognition, I would say we are back on track. I put the Jetson Nano aside for a week because I couldn’t get it to work on a monitor again, so that will definitely be my first task. I might reach out to a professor if I’m still struggling with the Jetson by Monday because that will be my main deliverable next week. I see us bringing all the parts together very soon, and it’s something I’m definitely looking forward to.

Anushka’s Status Report for 3/5

This week was definitely a busy week. Our goal was to make as much progress before Spring Break, as we aren’t able to work on the hardware side of our project over break. I began this week by going over what was needed for the design review document. I made notes on what I wanted to cover in the presentation, which we ultimately incorporated into the final paper.

I began working on the Jetson Nano again. I had a feeling that something was wrong with how I wrote the SD card, so I requested a new one from the professors on Wednesday to try. Once I rewrote it, we tried connecting the Nano to the monitor, but with no success. Edward suggested I try using minicom as an interface to the Nano, and we were both able to successfully install and run the package, finally giving us an SSH-like way of accessing the Nano.

I added details to the design report, including the gesture recognition algorithm and the hologram pyramid design choices. There were more things I wanted to do with the report, especially on the gesture recognition side; after we get our feedback, I plan on incorporating them into the paper along with my changes. This is mostly for us, so that we understand the reasoning behind our choices and can eliminate options if they don’t work.

I feel like this is the first week I feel behind. With the sensors coming in on Monday, most of us leaving on Friday, and the paper also due on Friday, there was not a lot of time to test the sensors and gauge whether our current algorithm works. The team talked to the professor about ideas on gesture recognition, and machine learning was suggested to a) help identify zooming actions, and b) help with user calibration in the beginning. I’m not too familiar with models that can classify a stream of data, so I plan on talking to Professor Savvides about potential options. I would say the gesture algorithm is the biggest risk at this point: if it doesn’t work, we have to choose a machine learning model and train it on a huge dataset that we would have to produce ourselves. I think this will be my main focus over the next two weeks so that we can test as soon as we come back from Spring Break.

Anushka’s Status Report for 2/26

This week was heavily design focused. I was responsible for giving the Design Review presentation, so I spent most of last weekend researching and putting the presentation together. Doing this presentation was very helpful because it showed me which parts of the project we have planned more thoroughly than others. For example, the “middleware” part of the project, i.e. the Jetson and the communication, is where we have the most questions about the specs and how/if it’ll fit into the final scope of our project. By contrast, we have the most information on the sensors, which we have already ordered. I have a clearer understanding of the direction of our capstone and what areas I need to explore more.

Speaking of areas I need to learn more about, I continued tinkering with the Jetson this week. It’s still behaving very spottily, and I suspect it has something to do with how I downloaded the image onto the SD card. As mentioned before, this is the weakest part of our project, so I need to spend more time on it next week. My goal before Spring Break is to get it set up, collect data from the sensors, and send information to the web application. It’s a lot, but if I make this my sole focus, it’ll be good for the project as a whole after we come back.

We ordered individual sensors to test out while we wait for the PCB to be manufactured. We began testing them and decided to create a visual interface for the data. The idea I had was that the yellow dots represent the sensors and the blue dot represents the object being detected.

I originally wanted to create this web application in Django, so I told Edward to code the MQTT communication in Python. However, since the CSS of the “objects” needs to change with every reading, and a Django page would only update when the server refreshed, we decided to move the web application entirely to the client side. I asked Edward to change the code into JavaScript, and then we were able to get an almost instantaneous rendering of the sensor data. A demo is attached at the link below.

https://drive.google.com/file/d/1w0-5nBngHThfDe-A_Iem-NNt1k3NcuCe/view?usp=sharing
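For reference, a minimal sketch of the original Python-side MQTT subscriber, using the paho-mqtt v1 API; the broker address and topic name are assumptions:

```python
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    # Each message carries one frame of sensor distances to render.
    print("sensor frame:", msg.payload.decode())

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost", 1883)      # hypothetical local broker
client.subscribe("sensors/distances")  # hypothetical topic
client.loop_forever()
```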

For the next week, I will focus on the Jetson and finishing the design review paper. If I have some time after these two, I will fine-tune our visualization so that it can be used as a dashboard to measure the latency and accuracy of the sensors. However, this may become a Spring Break or post-Spring Break task.

As mentioned before, the Jetson is the weakest part and the greatest risk. Although we have a contingency plan if the Jetson doesn’t work, I don’t see a reason why we have to remove it at the moment. I will reach out to my teammates and the professors on Monday so that we can get over this hump. We’re still pretty much on schedule, since we are getting a lot of work done before the actual sensors get shipped.

Anushka’s Status Report for 2/19

This week, my focus was on a myriad of tasks rather than a specific dimension. This was prompted by our earlier foray into the hologram pyramids; even though we aren’t scheduled to work on them until much later, I wanted to look into the web application interface and the Jetson communication.

From a quick look into the holograms last week and discussions with the professor and the TA, we are considering making the hologram pyramid bigger so that there is more surface area. My idea is to cut square pieces of acrylic, decrease the angles of the trapezoid, and test.

Looking into the Pepper’s Ghost illusion, its original use was in magic shows. The original projections are similar to the one in the diagram shown below. That projection is horizontal, and there is only one pane of glass. I may brainstorm ideas on how to execute this, since most people have access to projectors and could easily replicate it instead of needing a big screen and a pyramid.


Image from Wikipedia: https://en.wikipedia.org/wiki/Pepper%27s_ghost#/media/File:Peppers_ghost_low_angle.jpg

I also began looking into the Jetson. Setting it up is similar to setting up a Raspberry Pi, but I was having some trouble formatting the SD card. I also began researching different communication protocols, including MQTT. I want to follow this tutorial once I set up the Jetson to see if we can use it in our final project: https://www.youtube.com/watch?v=WPmzoYwXj00&ab_channel=AshutoshMohanty.

Image of Jetson

We have a design presentation due next week, and I volunteered to present this time. I want to make our presentation more pitch-focused, so I began storyboarding how I imagine the user will use the device. Some screenshots are below. For the presentation, I also worked on a few slides on the wrist watch CAD file and the gesture algorithm, which can be seen under the Design Review tab above.

Screenshots of the user experience

Since we are moving around different parts of the schedule, I don’t believe I am behind. If anything, starting on these parts means we are ahead, but I have to keep working on them to hold this position. Over the next week, I want to do more work on the Jetson, since it will be the most relevant once the sensors are completed. Since we ordered some sensors to test with, we can start on communication with the Jetson at the end of next week, so I’ll make that my deadline.