This week I was able to finish all of the HTML templates for the web application. Currently, only a couple of the URLs are wired up so that I can navigate between pages and check the templates; specifically, only the alphabet page and the letter A’s learn/test mode are linked with URLs. I have also integrated real-time video feedback into the web page and let the user download whatever video clip they record of themselves. The website starts capturing video once the user presses the “Start Recording” button, and for now the user needs to press the “Stop Recording” button once they finish the sign for the video to be saved. Here is the link to a PDF showing the HTML templates that we currently have. Apart from that, this week I have also been helping Aishwarya test the machine learning models and figure out where they were going wrong. As for my testing database, I have added 10 more images for each of the signs I have been in charge of over the past few weeks, for a total of 30 images for each of the signs N to Z and 5 to 9.
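For reference, here is a minimal sketch of how the in-browser recording and download flow could work, using the standard MediaRecorder API. The element IDs (`preview`, `startBtn`, `stopBtn`) and file name are hypothetical placeholders, not necessarily what is in the actual templates:

```typescript
// Sketch: live camera preview plus record/download, assuming a
// <video id="preview">, <button id="startBtn">, and <button id="stopBtn">
// exist in the template (hypothetical IDs).
let recorder: MediaRecorder | null = null;
let chunks: Blob[] = [];

async function startRecording(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });

  // Real-time feedback: mirror the camera stream into the page.
  const preview = document.getElementById("preview") as HTMLVideoElement;
  preview.srcObject = stream;
  await preview.play();

  chunks = [];
  recorder = new MediaRecorder(stream);
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => {
    // Bundle the recorded chunks and trigger a download of the clip.
    const blob = new Blob(chunks, { type: "video/webm" });
    const link = document.createElement("a");
    link.href = URL.createObjectURL(blob);
    link.download = "sign_recording.webm"; // hypothetical file name
    link.click();
  };
  recorder.start();
}

function stopRecording(): void {
  // For now the user stops the recording manually.
  recorder?.stop();
}

document.getElementById("startBtn")?.addEventListener("click", startRecording);
document.getElementById("stopBtn")?.addEventListener("click", stopRecording);
```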
Currently, my progress is on schedule since I was able to catch up during spring break. My goal for next week is to link the remaining pages (numbers, conversation, and learning) to URLs. I also hope to have the program automatically stop recording 5 seconds after the “Start Recording” button is pressed. Apart from that, I plan to add 10 images for each of the new signs I have been assigned, i.e. all of the conversational and learning dynamic signs.
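As a rough sketch of the planned auto-stop behavior, building on the hypothetical recorder from the earlier snippet, the idea is simply to schedule the stop call when recording starts:

```typescript
// Sketch of the planned auto-stop: stop the recorder 5 seconds after
// "Start Recording" is pressed, so the user no longer has to press
// "Stop Recording" manually.
const RECORDING_DURATION_MS = 5000;

function startTimedRecording(recorder: MediaRecorder): void {
  recorder.start();
  window.setTimeout(() => {
    if (recorder.state === "recording") {
      recorder.stop(); // triggers the same onstop/download path as before
    }
  }, RECORDING_DURATION_MS);
}
```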