Sophia’s Status Report 11/6

This week I arranged meetings with Daniel, an ASL user at CMU, and with Danielle, an ASL interpreter. From Daniel, we learned that in fingerspelling, double letters are usually signed with a single fluid movement rather than by signing the letter twice. This is something we should keep in mind when parsing the recognized letters into words, as sketched below. Next week we plan to meet with him in person and collect data from him to train our model.
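To illustrate the implication for our parsing step, here is a minimal sketch (the token format and the `expand_letters` helper are assumptions for illustration, not our actual pipeline): if the recognizer emits a single "double-letter" token for that one fluid movement, the word-building stage would need to expand it back into two letters.

```python
def expand_letters(tokens):
    """Expand recognized fingerspelling tokens into a plain string of letters.

    `tokens` is a hypothetical list like ["B", "A", ("double", "L")], where a
    tuple marks a double-letter gesture that was signed as one fluid movement.
    """
    letters = []
    for tok in tokens:
        if isinstance(tok, tuple) and tok[0] == "double":
            letters.extend([tok[1], tok[1]])  # one gesture expands to two letters
        else:
            letters.append(tok)
    return "".join(letters)


if __name__ == "__main__":
    # "BALL": the double L would be signed once with a single motion,
    # so the recognizer would emit one double-letter token for it.
    print(expand_letters(["B", "A", ("double", "L")]))  # -> BALL
```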

Danielle had a lot of feedback poking holes in the use case we defined for our product. She pointed out that the ASL alphabet is used mainly for proper nouns, so you can't really communicate in ASL with just the alphabet. Furthermore, ASL has five different parameters: handshape, palm orientation, location, movement, and non-manual signals (body language and facial expressions, which play a role similar to tone of voice). The glove can't pick up the non-manual signals, so at best it can translate, but not interpret. She explained that a lot of ASL grammar is actually carried by non-manual signals. She also pointed out that ASL users usually don't like having things obstruct their hand movement; she herself dislikes signing, or feels she signs awkwardly, when she wears a ring. With this input, we should look into a way to measure how much the glove resists natural hand movement.

We are on schedule, but we had to revise the schedule since changes to our product required us to recollect training data.
