Somya’s Status Report for 4/27/2024

This past week I gave the final presentation, so a good portion of my time was spent working on the slides and preparing for the talk itself. In addition, I finished the haptic unit tests and made my edits to Ricky's Bluetooth code. Tomorrow, once Ricky updates his code with my edits, we will verify that the haptic pulses are synchronized with real-time sign detection. We decided to implement the haptic feedback this way because it wouldn't have made sense for me to download all the ML packages just to test a feature that is logically separate from the ML detection, and this decision saved us time. One issue I had to debug was that when sending a byte signal from the computer, the byte would be sent over and over again (as verified by debug statements in the Arduino code), so the pulse would happen repeatedly instead of just once. To fix this, I had to send an ACK back from the Arduino to signal to the Python code to stop sending.
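For reference, the fix follows a simple send-then-wait-for-ACK pattern. Below is a minimal sketch of the Python side, assuming a pyserial-style connection; the byte values, port name, and function name are placeholders rather than our exact protocol.

```python
# Hypothetical sketch of the send-then-wait-for-ACK pattern (not our exact script).
# Assumes a serial-style link via pyserial; the real code may use a BLE library instead.
import serial
import time

PULSE_SIGNAL = b'\x01'   # example signal byte (placeholder value)
ACK = b'\x06'            # example ACK byte returned by the Arduino

def send_pulse(port: serial.Serial, timeout_s: float = 1.0) -> bool:
    """Send one pulse command and wait for the Arduino's ACK before returning."""
    port.write(PULSE_SIGNAL)
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if port.in_waiting and port.read(1) == ACK:
            return True          # Arduino confirmed receipt; stop re-sending
        time.sleep(0.01)
    return False                 # no ACK; the caller can decide whether to retry

if __name__ == "__main__":
    with serial.Serial("/dev/ttyACM0", 9600, timeout=0.1) as port:  # port name is an assumption
        print("pulse acknowledged:", send_pulse(port))
```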

 

My progress is on schedule. 

 

This week, I hope to hash out the script/layout of the final video as well as complete my action items for the final report. We also plan on meeting tomorrow to do a final full system test now that all parts of our final product deliverable are in place.



Somya’s Status Report for 4/20/2024

This week I got the haptic feedback to work. I am now able to send a signal via Bluetooth from a Python script to the Arduino at the start and end of calibration, as well as when a signed word/letter has been spoken out loud. The Arduino switches on the type of signal it receives and produces an appropriate haptic feedback pulse. My next steps are to measure latency, experiment with which type of pulse is most user-friendly, and see how far apart the sender and receiver can be while the mechanism still works in a timely manner.
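As an illustration of the signal types, a sketch of the sender-side mapping is below; the byte values and names are placeholders, not our finalized protocol.

```python
# Illustrative mapping of events to single-byte codes (values are placeholders, not
# our actual protocol). The Arduino side switches on the received byte and plays a
# different vibration pattern for each event.
CALIBRATION_START = b'\x10'
CALIBRATION_END   = b'\x11'
WORD_SPOKEN       = b'\x12'

def notify(port, event_code: bytes) -> None:
    """Write one event byte to the already-open Bluetooth/serial connection."""
    port.write(event_code)

# e.g. notify(port, CALIBRATION_START) at the start of calibration,
#      notify(port, WORD_SPOKEN) after the detected sign is spoken aloud.
```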

My progress is on schedule. 

This week I am giving the final presentation, so my main goal is focusing on that. In addition, our poster and final paper deadlines are quickly approaching, so I plan to dedicate a significant amount of time to those. I also want to help with the overall look of what we will showcase on demo day; one of our focus points from early on was to make the gloves as user-friendly and unobtrusive as possible, which is why we're designing a case to hold the battery and PCB and experimenting with different gloves.



Somya’s Status Report for 4/6/2024

This week I finished collecting data for all BSL letters. We found that USB collection was much faster than Bluetooth, so this process was proving to be a time sink until we made the switch halfway through. In addition, I looked into the haptic feedback circuit and will place an order for the linear vibration motor on Tuesday. The circuitry doesn't look too complicated; the main thing I'm worried about is defining what haptic feedback even means in the context of our product. Ideally, we would generate different feedback based on how confident the model is in the word the user signed: one type of pulse above 90% confidence, another below 30%, and a third for the uncertain range in between. I'm not sure if this is feasible, so I plan to bring it up at our next meeting.
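A minimal sketch of that confidence-to-pulse mapping is below; the thresholds come from the paragraph above, but the function name and pattern names are hypothetical.

```python
# Hypothetical mapping from model confidence to a haptic pulse type, mirroring the
# thresholds discussed above (>90% / <30% / in between). The pattern names are
# placeholders, not a finalized design.
def pulse_for_confidence(confidence: float) -> str:
    """Pick a pulse pattern name based on how confident the model was."""
    if confidence > 0.90:
        return "single_short"     # high confidence: brief confirmation buzz
    elif confidence < 0.30:
        return "triple_long"      # low confidence: distinct "please re-sign" pattern
    else:
        return "double_medium"    # uncertain range: prompt the user to check the output

print(pulse_for_confidence(0.95))  # -> "single_short"
```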

 

My progress is on schedule. 

 

This upcoming week, in addition to implementing haptic feedback, I want to look into how we can augment our data as we transition to BSL. After this initial round of data collection, we are finding that many of the signs have very similar degrees of bending, and since we have decided not to add any additional types of sensors (e.g., touch sensors), this will likely lead to lower accuracy than ideal. Addressing this might involve going through the individual CSV files for letters that are very similar, quantifying just how similar or different they are, and manipulating the data post-collection in some way to make them more distinct. Once Ricky trains the model over the weekend, I'll have a better idea of how to accomplish this more specifically.
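One simple way to quantify that similarity is sketched below; it assumes each CSV has one column per flex sensor in the same order, and the file names are hypothetical.

```python
# Rough sketch of quantifying how similar two letters' sensor data are, assuming
# both CSVs share the same per-sensor columns (file paths are hypothetical).
import numpy as np
import pandas as pd

def mean_sensor_distance(csv_a: str, csv_b: str) -> float:
    """Euclidean distance between the per-sensor mean readings of two letters."""
    a = pd.read_csv(csv_a).mean(numeric_only=True)
    b = pd.read_csv(csv_b).mean(numeric_only=True)
    return float(np.linalg.norm(a.values - b.values))

# Letters with a small distance are candidates for augmentation or re-collection.
# e.g. mean_sensor_distance("letter_m.csv", "letter_n.csv")
```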



Somya’s Status Report for 3/30/2024

This past week, I made some changes to the glove and helped debug some issues we were having with the Bluetooth connection. One thing we're noticing in our sensor data is that we get some discrepancies depending on who is using the glove. This is to be expected, as we all have different hand sizes as well as slight variations in the way we make each sign. I'm trying to come up with ways to make the data more consistent besides post-processing cleanup. In our weekly meeting, we discussed adding a calibration phase as well as normalization of the data, which should definitely help, but I still think securing the sensors at more points than they are attached at now will also make a difference. I had a few stacked midterms this past week, so while my progress is still on schedule, I didn't make as much progress as I would have liked. This upcoming week, however, I should be able to dedicate a lot more time to capstone, especially with the interim demo around the corner.
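A minimal sketch of the calibration-plus-normalization idea from our meeting is below: record each user's per-sensor min/max during a calibration phase, then rescale live readings to [0, 1]. The variable names and shapes are illustrative, not our final implementation.

```python
# Sketch of per-user calibration and min-max normalization of flex sensor readings.
import numpy as np

def calibrate(calibration_samples: np.ndarray):
    """calibration_samples: shape (num_samples, num_sensors) collected per user."""
    return calibration_samples.min(axis=0), calibration_samples.max(axis=0)

def normalize(reading: np.ndarray, lo: np.ndarray, hi: np.ndarray) -> np.ndarray:
    """Map a raw sensor reading into [0, 1] using that user's calibration range."""
    span = np.where(hi > lo, hi - lo, 1)   # avoid division by zero for a flat channel
    return np.clip((reading - lo) / span, 0.0, 1.0)
```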

 

More specifically, this upcoming week I would like to add the haptic feedback code to our data collection scripts. Our current plan for the MVP is to have the LED on the Arduino blink when the speaker (either the computer's or the external one) outputs the signed letter/word and, more importantly, to indicate whether it was output correctly. I think we should color code the output based on the success of the transmission: red for a transmission that didn't go through, yellow for a plausible result the user might want to re-sign, and green for a successful transmission. I also want to order some vibrating motors, because for our final prototype we want this type of feedback so the user doesn't have to constantly look down at their wrist. Finally, I want to bring up changing or adding to what position we deem to be "at rest". Right now, we just have the user holding up their unflexed hand as at rest, and the model is pretty good at recognizing this state, but this isn't really practical: people's at-rest position is typically with their hands at their side or folded in their lap, or moving around but not actually signing anything. The model sort of falls apart with this broader notion of at rest, and I think adding it to our training data will make our device more robust.



Somya’s Status Report for 3/23/2024

I finished fabricating the second glove, as well as my work on debugging the transmission of Bluetooth data from the Arduino to the laptop that runs the Python script used to collect the data and compress it into a CSV file. In addition, I brainstormed various ways we can remove the laptop as a required component of our final demo.
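For context, a simplified sketch of the laptop-side collection loop is below: it reads comma-separated flex readings from the Arduino and appends them to a CSV with a label column. It assumes a pyserial-style connection; the port name, labels, and line format are assumptions, not our exact script.

```python
# Simplified data-collection loop (illustrative, not the actual script).
import csv
import serial

def collect(port_name: str, out_path: str, label: str, num_samples: int = 200) -> None:
    with serial.Serial(port_name, 9600, timeout=1) as port, \
         open(out_path, "a", newline="") as f:
        writer = csv.writer(f)
        for _ in range(num_samples):
            line = port.readline().decode(errors="ignore").strip()
            if not line:
                continue                                 # skip timeouts / empty reads
            writer.writerow(line.split(",") + [label])   # sensor values + the sign label

# e.g. collect("/dev/ttyACM0", "letter_a.csv", label="a")
```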

My progress is on schedule, but we are slightly behind on testing the double glove because we are waiting for the second Arduino BLE compute unit to arrive. Once it does arrive, however, we should have all the moving parts in place to begin testing the integration of data from both gloves immediately. In the meantime, we are collectively working on other features like the speaker and haptic feedback, as well as cleaning up noisy data to improve the ML model.

This next week, I hope to finish the circuit with both the speaker and the haptic feedback, and to fully finish collecting data for the BSL double-handed alphabet, so I can see what issues the synchronization of the two input streams brings and start debugging them.


Somya’s Status Report for 3/16/2024

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

This week I attached all the sensors onto the glove and worked on implementing and testing the circuit. After performing unit tests and discussing an expanded range in our weekly meeting, we determined that using op-amps would be the way to go, and with them we managed to expand the voltage output range from 0.8V to 2V.
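For context, in a standard non-inverting op-amp stage the gain is set by the feedback resistors; the resistor values below are illustrative, not necessarily the ones we used, but a gain of 2.5 is what maps a 0.8V swing to 2V:

$$G = 1 + \frac{R_f}{R_g} = 1 + \frac{15\,\mathrm{k}\Omega}{10\,\mathrm{k}\Omega} = 2.5, \qquad 0.8\,\mathrm{V} \times 2.5 = 2\,\mathrm{V}$$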

 

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

My progress is on schedule. 

 

What deliverables do you hope to complete in the next week? 

 

Now that we are in the data collection phase, I want to do some more research into how we can clean up the data with noise reduction algorithms. In addition, I would like to do some more research on the Bluetooth data transmission so we can work on that in parallel with the data collection, as well as the fabrication of glove 2.
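One simple noise-reduction option to evaluate is a moving-average filter over each sensor channel; a minimal sketch is below, with the window size as a tunable placeholder rather than a decided value.

```python
# Moving-average smoothing over one flex sensor's stream of readings (illustrative).
import numpy as np

def moving_average(samples: np.ndarray, window: int = 5) -> np.ndarray:
    """Smooth a 1-D stream of readings from one flex sensor."""
    kernel = np.ones(window) / window
    return np.convolve(samples, kernel, mode="valid")

# A median filter or an exponential moving average are alternatives worth comparing
# on the same recordings before committing to one for the pipeline.
```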

Somya’s Status Report for 3/9/2024

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

I finished the circuit diagram for the sensor wiring (with two sensors displayed instead of all five, for readability); this schematic is one that I will keep adding to and modifying as our design needs change. 

In addition, I did the mathematical calculations to determine our options for the resistor we want to use in series with each flex sensor. This is quite important to get right because, after looking at Gesture Glove's challenges and the advice from our TA, one of the biggest challenges we anticipate is flex sensor sensitivity. To address this, we need as wide an output range of values as possible, so that if someone signs an 'a' versus a 'b' the voltage outputs aren't something like "1.200V" and "1.215V". As such, we decided that our ideal voltage range would be 1V-4V. This creates the below inequality:
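(The inequality itself was included as an image. As a rough reconstruction, assuming the fixed resistor R sits on the output side of the divider and a 5V supply, both of which are my assumptions here, it would look something like the following, where R_flex sweeps from the sensor's flat resistance up to its fully bent resistance.)

$$1\,\mathrm{V} \;\le\; V_{out} = 5\,\mathrm{V}\cdot\frac{R}{R + R_{\mathrm{flex}}} \;\le\; 4\,\mathrm{V}$$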

The tricky thing about having a variable resistor (i.e., the flex sensor) is that the value satisfying the two equations formed from the above inequality is negative. So, the best thing you can do in practice is form a range of resistor values and play around with multiple resistors within that range to see which ones produce the widest output range. Through my calculations, I found this range to be . As such, I tested five of the most common resistor values within this range: 2.7kΩ, 3.3kΩ, 4.7kΩ, 10kΩ, and 47kΩ. Of course we will test these in the actual breadboard circuit, but I also manually tested them and found that the range reaches a maximum of ~0.8V with ~10kΩ of resistance, which isn't great. So it's looking like we will need to use an op-amp. I looked at Spectra Symbol's data sheet, which suggested the LM358 or LM324 op-amps. Below is what the circuit diagram for that would look like: . Lastly, I finished all my assigned sections for the Design Report that was due last Friday.

 

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

My progress is on schedule. 

 

What deliverables do you hope to complete in the next week? 

Now that all of our parts have arrived, I hope to accomplish two tasks in particular. First, I want to do some flex sensor unit testing by building a simple voltage divider circuit and seeing which of the five selected pull-down resistors gives the widest range of V_out. I want to do these unit tests first so that if further sensitivity is required by way of an op-amp, I know that before building the entire circuit to accommodate the five flex sensors. Second, after I've determined the pull-down resistor through educated trial and error, I would like to have the five-flex-sensor circuit built by Wednesday. That way we can get to checking whether the Arduino code for reading out from the sensors works, and possibly even start the data collection process this week. All our moving parts in preparation for the model training are really close to being done, so hopefully the time we've spent preparing everything pays off and data collection goes smoothly. I also want to bring up making a PCB with the team at Monday's meeting. 
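As a back-of-the-envelope companion to that unit test, the sketch below computes the output swing each candidate pull-down resistor would give. The 5V supply and the flat/bent flex-sensor resistances are assumptions for illustration only, not measured values.

```python
# Estimate the V_out swing of a voltage divider for each candidate pull-down resistor.
V_SUPPLY = 5.0                     # assumed supply voltage
R_FLAT, R_BENT = 10_000, 20_000    # placeholder flex-sensor resistances (ohms)

def v_out(r_pulldown: float, r_flex: float) -> float:
    """Divider output with the flex sensor on the high side and r_pulldown on the low side."""
    return V_SUPPLY * r_pulldown / (r_pulldown + r_flex)

for r in (2_700, 3_300, 4_700, 10_000, 47_000):
    swing = abs(v_out(r, R_FLAT) - v_out(r, R_BENT))
    print(f"{r/1000:>4.1f} kΩ pull-down -> output swing ≈ {swing:.2f} V")
```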



Somya’s Status Report for 2/24/2024

I did some more research on how we will integrate the sensors onto the glove, as well as which specific resistor values to try first in the voltage divider circuit with the flex sensors. The midpoint resistance of the flex sensors is 10,000 ohms, so depending on the voltage of the power supply I will select a range of divider resistors accordingly. In addition, I looked into the tradeoffs that choosing these sensors implies, since I want to touch on that in our Design Report, along with the testing and verification protocols for the data readings we will obtain from the sensors, i.e., what voltage outputs to reject or keep. 

My progress is on schedule.  

By Monday, I should have a circuit schematic of the sensor integration pinout. In addition, I hope to get most of my assigned parts of the Design Report done.



Somya’s Status Report for 2/17/2024

In addition to working on the Design Review slides and updating the Gantt chart, I was able to order the flex sensors, so hopefully we can start actually building the glove this upcoming week. Initially I wanted to use Spectra Symbol's SpectraFlex sensors, but after calculating the cost for 10-12 sensors, I decided this would put too big of a dent in our budget. As such, we decided to go with Spectra Symbol's original flex sensors (active sensor length 95mm, to span the length of an average-sized finger). The tradeoff demonstrated by this choice is between cost and performance/aesthetics. In addition, I did some research on the word choices we want to demonstrate for our MVP. One of the comments we got from our initial proposal presentation was that the decision to have 26 double-handed words seemed arbitrary. To address this, I found a 2017 paper that identified the top 200 most commonly used words in the ASL vocabulary. The authors selected 56 ASL words from five word categories (pronoun, noun, verb, adjective, and adverb); 29 words from this subset are double-handed and 27 are one-handed. So, we might take inspiration from this paper and base our MVP vocabulary on signed word frequency. 

In terms of progress, my progress is on schedule. 

In terms of this week's deliverables, I expect all our parts to arrive shortly, so in preparation I want to flesh out a circuit diagram for the pinout of the flex sensors with the computing unit and the necessary resistors/op-amps. I want to do some research into which op-amps I should select for the best voltage output from the sensors (the op-amp acts as an impedance buffer). I also want to start thinking about the Arduino code we will use to aggregate the data from the sensors and send it to the computer, as well as how we will test sensor output, e.g., determining a noise threshold to accept or reject voltage outputs based on how noisy they are. 
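One simple form such an accept/reject rule could take is sketched below (written on the Python side for illustration, though it could equally live in the Arduino code); the threshold and history length are placeholders to be tuned against real recordings.

```python
# Reject readings that jump implausibly far from a short rolling baseline (illustrative).
from collections import deque

def make_filter(threshold_volts: float = 0.5, history: int = 10):
    recent = deque(maxlen=history)
    def accept(reading: float) -> bool:
        baseline = sum(recent) / len(recent) if recent else reading
        ok = abs(reading - baseline) <= threshold_volts
        if ok:
            recent.append(reading)   # only accepted samples update the baseline
        return ok
    return accept

accept = make_filter()
print([accept(v) for v in (1.2, 1.25, 3.9, 1.3)])  # the 3.9 V spike is rejected
```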



Somya’s Status Report for 2/10/2024

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

This week I mainly researched the sensors we plan on using for the gloves. There were a number of factors to consider, including cost, durability, and sensitivity. I compiled a list of sensors that various research groups and previous capstone projects have used for similar gesture recognition applications, and then pitched several options in one of our team meetings. We have decided to start with the traditional one-flex-sensor-per-finger approach, namely Spectra Symbol's SpectraFlex sensors, an improved design that is lightweight, thinner, and aims to minimize drift (i.e., an unexpected change in sensor output even though the input has returned to its initial state). 

 

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

My progress is on schedule. 

 

What deliverables do you hope to complete in the next week? 

I plan on helping complete the slides for the upcoming Design Review proposal, as well as doing more research into how we plan to address sensitivity issues with the flex sensors, since this was mentioned in the feedback we received on our initial proposal. In addition, I would like to have our sensors ordered by the end of the week, and to start thinking about the pinout of the sensors to the Arduino.