This week I reached out to the disability services office at CMU to contact actual ASL users. A post-doc at CMU who uses ASL has agreed to talk with us and provide insight into our project. An ASL interpreter is also interested in helping out with our project.
In addition to reaching out to ASL users, I wrote up a data collection procedure that participants can follow to collect data more autonomously.
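To make the collected data easy to aggregate later, it helps to fix a file format up front. Below is a minimal sketch of what a logging script for one collection session might look like; the column names, sensor count, and placeholder values are assumptions for illustration, not our final schema.

```python
import csv
import time

# Assumed layout: one row per sample, labeled with the sign being performed.
# Five flex-sensor readings (one per finger) plus three IMU acceleration axes.
FIELDS = ["timestamp", "sign", "flex1", "flex2", "flex3", "flex4", "flex5",
          "accel_x", "accel_y", "accel_z"]

def log_sample(writer, sign, flex, accel):
    """Append one labeled sensor reading to the open CSV."""
    writer.writerow([time.time(), sign, *flex, *accel])

with open("session.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(FIELDS)
    # Placeholder numbers stand in for real glove readings.
    log_sample(w, "A", [512, 430, 401, 398, 455], [0.01, -0.02, 0.98])
```

Keeping the label in every row means a volunteer can record many signs in one sitting and the file still splits cleanly by sign afterward.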
After our discussion with Byron, we identified the next steps we need to take to improve our prototype. Our current focus is improving classification accuracy. I moved the IMU to the back of the glove in a makeshift way. There are a lot of wires, so it's quite messy, but it will let us see whether the change in IMU position helps differentiate between signs.
This upcoming week I need to restitch some of the flex sensors, because the limited stitching means they don't always bend consistently when a pose is made. The embroidery thread has finally arrived, so I will stitch over each of the flex sensors this week.
I think we are on schedule, but I'm not confident we will be able to parse letters into words. However, word parsing was not a functionality we originally defined in our requirements.