Sophia’s Status Report for 12/4

For the last two weeks, I worked on making further repairs to the glove and adding touch sensors. The flex sensor on the ring finger of the glove was not responding, so I switched it out with a new one and stitched it down tightly. I borrowed a sewing kit from IDeATE and sewed it down with the full-thickness embroidery thread.

I also created a new protoboard to make room for pins for touch sensors.

For touch sensors, I decided to use some conductive tape.

Unfortunately, it seems there are issues with the new protoboard, and some more of the flex sensors are not working. I will need to replace the thumb and fourth-finger flex sensors next and debug my work on the protoboard.

Team Status Report for 11/20

This week we implemented the audio portion of our project. Now when a sign is made, the letter is spoken by the computer, and the audio output meets the rate requirement we set at the beginning of the semester. We also implemented the suggestion we got during interim demos about creating another model to differentiate between letters that are commonly mistaken for others (R, U, V, W and M, N, S, T, E). However, this did not noticeably improve the functionality. We will continue analyzing the data from those easily confused letters to see if we can improve the accuracy. Adding contact sensors may be the best solution.
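The routing logic for that second-stage idea is simple; here is a minimal sketch with stand-in models (the function names and confusion groups shown are just the ones from our feedback, not our actual classifier code):

```python
# Groups of letters the general model tends to confuse with one another.
CONFUSION_GROUPS = [set("RUVW"), set("MNSTE")]

def classify(sample, general_model, specialist_models):
    """Run the general model first; if its prediction falls inside a
    confusion group, defer to the specialist trained on just that group."""
    letter = general_model(sample)
    for group, specialist in zip(CONFUSION_GROUPS, specialist_models):
        if letter in group:
            return specialist(sample)
    return letter
```

The specialists only ever see samples the general model already placed in their group, so they can focus on the fine differences between similar handshapes.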

The Arduino Nano BLE arrived and we are working on integrating Bluetooth into our system so the glove can work wirelessly. There have been some issues uploading sketches to the Arduino Nano BLE, which we hope to resolve next week.

The reading for the flex sensor on the ring finger of the glove also stopped varying (it gives roughly 256 degrees no matter how much you bend it), so we replaced the flex sensor. There is also some calibration we should do in the Arduino script to improve the conversion from voltage to degrees. Now that we have repaired the glove, we will need to recollect data.

We are on schedule since the audio integration is now complete. In the final weeks, we will continue refining the system by changing the hyper-parameters of our model.  We will also create plots showing performance to add to our final presentation, paper, and video.

Sophia’s Status Report for 11/20

This week I repaired the glove and worked on connecting with the Arduino Nano BLE over Bluetooth.

The flex sensor on the ring finger of the glove stopped reading values. There was no variation in reading when the sensor was bent, so I replaced that flex sensor. We also noticed there was some calibration in the Arduino script we could have completed to make the transformation from voltage to angle of bending more accurate. Since we had to repair the glove, we will do the calibration before collecting the next set of data.
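A two-point linear calibration is one way to do that voltage-to-angle conversion: record the raw reading with the finger flat and again at a known bend angle, then interpolate. A sketch (the specific raw readings below are made up for illustration):

```python
def make_calibration(raw_flat, raw_bent, angle_bent=90.0):
    """Build a raw-reading -> bend-angle converter from two reference
    poses: finger flat (0 degrees) and finger bent to a known angle."""
    scale = angle_bent / (raw_bent - raw_flat)
    def to_angle(raw):
        return (raw - raw_flat) * scale
    return to_angle

# Example: calibrate one finger, then convert a live reading.
to_angle = make_calibration(raw_flat=310, raw_bent=520)
```

Doing this per finger after each repair would absorb sensor-to-sensor variation, which is why we want to calibrate before the next data collection.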

The Arduino Nano 33 Sense with BLE arrived. I tried for a considerable amount of time to upload a basic sketch to it, but for some reason, it won’t upload. I’ll continue trying to work with the new Nano next week. I also looked deeper into how to connect the Arduino Nano 33 BLE to the PC over Bluetooth. It turns out BLE (Bluetooth Low Energy) is not compatible with classic Bluetooth and is easiest to interface with from a mobile device. There are smartphone apps whose sole purpose is reading from Arduino Nano BLEs or other BLE devices. It is a bit more difficult to connect and read its serial output from a PC, but not impossible. One of the drawbacks of connecting this way is that the transmission rate is slower. Once we get the transmission working we can test just how much slower the data streams are and whether that will affect our ability to reach our requirements.
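Whichever transport we end up with, the PC side will need to decode each BLE notification payload into sensor values. Assuming we pack each channel on the Nano as a little-endian signed 16-bit integer (a convention we would define ourselves in the sketch, not something BLE dictates), decoding with Python's standard struct module could look like:

```python
import struct

def decode_packet(payload: bytes):
    """Unpack a BLE notification into a list of signed 16-bit sensor
    values (little-endian). The packing format here is our own assumed
    convention and would have to match the Arduino sketch."""
    count = len(payload) // 2
    return list(struct.unpack("<%dh" % count, payload[:count * 2]))
```

A fixed binary format like this also helps with the slower BLE transmission rate, since it sends far fewer bytes per sample than the comma-separated text lines we use over USB serial.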

Team’s Status Report for 11/13

This week we had our interim demo presentations. For this presentation, we showed our glove classifying the ASL alphabet in real time and speaking the letter aloud. There were some issues with latency, as the speech output of a letter would continue even when a different letter was being signed. There were also some issues differentiating between similar signs. We received a lot of great feedback for fixing these problems. To fix the latency we will try to improve the handshaking protocol. To improve classification, we can build a model around the letters that are easily confused and use this more specific model any time the general model classifies a letter that falls into these groups of similar signs.
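Improving the handshaking protocol is our plan for the latency, but a complementary guard on the Python side would be to only trigger audio when the stable prediction actually changes, so a held sign doesn't keep queuing repeated speech. A hedged sketch (the class name and wiring are my own, not code we have written yet):

```python
class SpeechGate:
    """Invoke speak_fn only when the predicted letter changes, so a
    sign held across many prediction frames is spoken once, not queued
    repeatedly. speak_fn is whatever plays the audio for one letter."""
    def __init__(self, speak_fn):
        self.speak_fn = speak_fn
        self.last = None

    def update(self, letter):
        if letter != self.last:
            self.last = letter
            self.speak_fn(letter)
```

This doesn't fix the underlying serial backlog, but it keeps stale audio from piling up behind it.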

The professors also recommended we improve our product to work wirelessly so that it is more usable/portable. As per their feedback, we have ordered a Bluetooth-enabled Arduino Nano to replace our current Nano.

Additionally, we are working on collecting data from a wide variety of people. Each team member is taking turns bringing the glove home to collect data from roommates. We will also bring our glove to Daniel Seita, an ASL user at CMU, to train our model.

Finally, we will begin working on the final presentation and report.

Sophia’s Status Report for 11/13

This week we had our interim demos. From the feedback, we have decided to try to make our glove wireless with our remaining time to make it more usable. I ordered a new Arduino Nano with BLE and also some batteries as power sources. Next week, I will need to modify the glove so that the battery can be secured on it and correctly connected to the Arduino Nano. I will also need to work on pairing the Arduino Nano with the computer running the script. I also plan to add touch sensors to help differentiate between the letters e, m, n, s, and t. This could require us to collect a new set of data; however, after discussing with the others, it seems we could also add the data from the touch sensors as conditionals in the script.
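The "conditionals in the script" idea could look something like the sketch below. The touch-pad patterns mapped to each letter are hypothetical placeholders until we test the real hardware; only the ambiguous group itself comes from our observations:

```python
# Letters the flex/IMU model confuses; touch sensors should break the tie.
AMBIGUOUS = set("EMNST")

# Hypothetical lookup: tuple of touch-pad contact states -> letter.
# The real pad placement and patterns are TBD once the pads are sewn on.
EXAMPLE_PATTERNS = {
    (True, False): "T",
    (False, True): "S",
}

def apply_touch_override(predicted, touch_states, patterns=EXAMPLE_PATTERNS):
    """If the model's prediction is ambiguous, let the touch sensors
    decide; otherwise keep the model's prediction unchanged."""
    if predicted in AMBIGUOUS:
        return patterns.get(tuple(touch_states), predicted)
    return predicted
```

The appeal of this approach is that the trained model and its dataset stay untouched; the touch sensors act as a post-processing filter.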

We are on schedule. We adjusted our old schedule to reflect our new plans: the main changes are extending data collection and making modifications based on feedback.

Rachel’s Status Report for 11/13

This week, I mostly worked on data collection and adapting the scripts to incorporate audio. I found two other modules (gtts and playsound) that can create and play audio files relatively quickly (without noticeable delay from a user perspective), so we will be using them instead of pyttsx3, which had a really long delay. I added handshaking signals between the Arduino and Python programs, which slowed the prediction and output rate to about 0.27 gestures per second, significantly below our target of 2 gestures per second. In changing the Arduino script back, I noticed that I was sending newline characters, which the script ignored; the bandwidth of those lines could have been better used by sending actual data. After fixing that, our glove can make about 17 predictions per second. I am currently working on incorporating the audio properly, so that there isn’t a lag between the streamed-in data and the outputted audio; for reasons unknown to me at the moment, the handshaking signals I was passing around before are not working. Since the changes we plan to make in the next couple of weeks do not involve changes to what the data looks like, I also had my housemates collect data for us to train on.
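To show the request/response handshake and the stray-newline bug together, here is a sketch of the PC-side read loop with a stand-in for pyserial's Serial object (the request byte and class names are my own placeholders, not our actual protocol):

```python
class FakeSerial:
    """Minimal stand-in for serial.Serial so the read logic can be
    exercised without the glove attached."""
    def __init__(self, lines):
        self.lines = list(lines)
        self.requests = []

    def write(self, data):
        self.requests.append(data)

    def readline(self):
        return self.lines.pop(0) if self.lines else b""

def read_sample(ser):
    """Request one sensor sample and parse it, skipping any stray blank
    lines like the extra newlines the Arduino was accidentally sending."""
    ser.write(b"R")  # request byte; the actual handshake byte is arbitrary
    line = ser.readline().decode().strip()
    while not line:
        line = ser.readline().decode().strip()
    return [int(v) for v in line.split(",")]
```

Skipping blanks keeps the parser robust, but the real throughput win came from not sending the empty lines in the first place.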

This week, I plan on fully integrating the audio and getting more people to collect data. I will also begin to work on the final presentation slides as well as the final report. I would say we are on track, since all that remains is collecting data and training our final model (we are near the end!). We have also ordered a Bluetooth Arduino Nano, which we will have to switch out for our current Arduino; this will require some changes in the scripts we have been using, but it shouldn’t become a blocker for us.

Sophia’s Status Report 11/6

This week I arranged meetings with Daniel, an ASL user at CMU, and with Danielle, an ASL interpreter. From Daniel, we learned that in fingerspelling, double letters are usually signed with a single fluid movement rather than by signing the letter twice. This is something we might want to keep in mind when trying to parse letters into words. Next week we plan to meet with him in person and collect data from him to train our model.

Danielle had a lot of feedback poking holes in the use case we defined for our product. She pointed out that the ASL alphabet is used just for proper nouns, so you can’t really communicate in ASL with the alphabet alone. Furthermore, ASL has five different aspects: handshape, palm orientation, location, movement, and non-manual signals (body language and facial expression, the equivalent of tone of voice). The glove can’t really pick up the non-manual signals, so at best the glove can translate, but not interpret. She explained to us that a lot of ASL grammar is actually carried by non-manual signals. She also pointed out that ASL users don’t usually like to have things obstructing their hand movement; she shared that she doesn’t like to sign, or feels she signs awkwardly, when she wears a ring on her finger. With this input, we should look into a way to measure how much resistance the glove adds to hand movement.

We are on schedule, but we had to readjust the schedule since adjustments to our product required us to recollect training data.

Sophia’s Status Report for 10/30

This week I reached out to the office of disabilities at CMU to contact actual ASL users. There is a post-doc at CMU who uses ASL and has agreed to talk with us and provide insight into our project. There is also an ASL interpreter interested in helping out with our project.

In addition to reaching out to ASL users, I wrote up a data collection procedure that we can give to people to follow to collect data more autonomously.

After our discussion with Byron, we identified the next steps we need to take to improve our prototype. Right now we are going to focus on improving the classification accuracy. I moved the IMU to the back of the glove in a makeshift way. There are a lot of wires, so it’s quite messy, but it will allow us to see if the change in IMU position helps differentiate between signs.


This upcoming week I need to restitch some of the flex sensors: when a pose is made, they don’t always bend in a consistent way because of the limited stitches holding them down. The embroidery thread has finally arrived, so I will stitch over each of the flex sensors this upcoming week.

I think we are on schedule, but I’m not confident we will be able to parse letters into words. However, this was not a functionality we originally defined in our requirements.

Sophia’s Status Report for 10/23

This week I focused on maintenance for the glove. Some of the connections between the flex sensors and the board hosting the Arduino Nano came loose due to a poor crimping job, so I redid the connections. I also ordered embroidery thread to better secure the flex sensors, since the stitches currently holding them in place are coming loose.

I also finalized the Gerber, BOM, and CPL (pick-and-place) files required to order the connective PCB. I want to order the PCB from JLCPCB, but I want to hold off on ordering a little longer. The connective PCB design I plan to order is the same size as the perf board, so there is not much added advantage to having this PCB. I did some research on possibly making the PCB smaller, and in the next week I plan to make a second design and then order the connective and new designs in one batch to save on shipping costs.

In addition to creating a new PCB design, I plan to reach out to the office of disabilities to contact some fluent ASL users and interpreters who we can test our device on.

I think we are on schedule, but I’m not sure how smoothly replacing the perf board with the PCB will go if we decide to use it.


Sophia’s Status Report for 10/9

This week I focused on building the glove prototype since all of our parts and sensors came in. I had started at the end of last week and wanted to finish by the end of the weekend, but unfortunately it took longer than expected.


I soldered the connections on a protoboard. I placed all the components so that the Arduino’s USB port holds the USB cord parallel with the arm. I also soldered 90-degree male pin headers so that the flex sensors can be disconnected from the Arduino, and the Arduino and IMU breakout sit on female headers so they can be easily removed later as well.

My sewing skills are not very good. It was difficult to sew the flex sensors onto the glove in the perfect position so that they remained aligned along the finger when the fingers bent. Some of the stitches will probably have to be redone in the future.

I also wrote the Arduino sketch to read and output all the values from the sensors at the same time.
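On the PC side, each line the sketch prints gets parsed into named channels. A sketch of that parsing, assuming a comma-separated order of five flex readings followed by the IMU axes (the exact field order shown is my own assumed convention, and must match the Arduino sketch):

```python
# Assumed channel order for one line of Arduino output.
FIELDS = ["thumb", "index", "middle", "ring", "pinky",
          "ax", "ay", "az", "gx", "gy", "gz"]

def parse_line(line):
    """Split one comma-separated sample into a dict keyed by channel,
    rejecting malformed lines so a partial read can't corrupt the data."""
    values = [float(v) for v in line.strip().split(",")]
    if len(values) != len(FIELDS):
        raise ValueError("expected %d values, got %d"
                         % (len(FIELDS), len(values)))
    return dict(zip(FIELDS, values))
```

Validating the field count matters because serial reads can catch a line mid-transmission, and a truncated sample would otherwise silently shift every channel.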

Since we built a working prototype on protoboard, we’re not sure if we even need to order a PCB. However, we could make the hardware a lot smaller if we do design one.