Team’s Status Report for 11/6

This week, we implemented our new data collection process (collecting a window of three snapshots per data point) and collected data from each of our team members. We also integrated real-time testing with the glove so the software system can record the accuracy of each letter tested. We found that letters with movement (J and Z) perform much better, and the classifier recognizes them more frequently. The classifier also recognizes dissimilar letters well but gets confused on similar letters (e.g., M and N); to discern between them, we will need to collect more data. Overall, using a window of three snapshots has improved accuracy compared to our older model. The system also recognizes about 3-4 gestures per second with the new data collection process, a rate more suitable for actual usage since the average rate of signing is about 2 signs per second.
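The windowed approach above can be sketched roughly as follows. This is a minimal illustration, not our actual implementation: the class and function names are hypothetical, and the trained model is stood in for by a generic `predict_fn`.

```python
from collections import deque

WINDOW_SIZE = 3  # three consecutive glove snapshots per classifier input


def make_window_features(snapshots):
    """Flatten a window of sensor snapshots into one feature vector."""
    flat = []
    for snap in snapshots:
        flat.extend(snap)
    return flat


class WindowedClassifier:
    """Buffers incoming glove snapshots; classifies once a full window is available.

    `predict_fn` is a placeholder for a trained model's prediction function.
    """

    def __init__(self, predict_fn, window_size=WINDOW_SIZE):
        self.predict_fn = predict_fn
        # deque with maxlen gives a sliding window: old snapshots fall off
        self.buffer = deque(maxlen=window_size)

    def add_snapshot(self, snapshot):
        self.buffer.append(snapshot)
        if len(self.buffer) == self.buffer.maxlen:
            return self.predict_fn(make_window_features(list(self.buffer)))
        return None  # not enough snapshots yet
```

Because the buffer slides rather than resets, a prediction can be produced on every new snapshot after the first two, which is consistent with getting several classifications per second from a continuous sensor stream.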

We also met with an ASL user and an interpreter. Both gave useful feedback pertaining to the ethics, use cases, and capabilities of the glove for our project.

In terms of schedule, we are on track. Next week we will meet with the ASL users we have contacted to collect data from them; ideally, we will recruit people with different hand sizes. We will also start refining our classifier to better distinguish similar letters and implementing the audio output.

 
