This week, our team got feedback about a lack of complexity, specifically on my end, as the webapp was not complex enough on its own. As a result, we reshuffled some responsibilities and assigned new ones. I finished what I planned to do; here is a summary:
I completely developed the pygame display that will be shown on our glasses, including all the instructions (go forward, go back, show ingredients, elaborate steps). This information will be displayed to the user at all times as a HUD, so I took extra care to provide text wrapping, intuitive controls (e.g., locking step progression while the ingredients are up to reduce input confusion, and making sure "back" returns to the beginning of the previous step and not to the wrapped text), and easy-to-edit variables for line count and textbox sizes in case things change when the program is rendered on the actual display. I also added a progress bar at the bottom of the glasses that shows how many steps have been completed. I used arbitrary keyboard inputs as placeholders, which Diya then attached to the gesture recognition signals, so the display output is now fully hooked up to gesture recognition.
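To illustrate, here is a minimal sketch of how the text wrapping and progress bar logic work. The constants and function names here are illustrative placeholders, not the actual identifiers in our codebase:

```python
import pygame

# Illustrative constants; in our code these live in easy-to-edit variables
# so they can be retuned once the program runs on the actual display.
SCREEN_W, SCREEN_H = 320, 240   # assumed glasses resolution
BOX_WIDTH = 280                 # textbox width in pixels
MAX_LINES = 4                   # lines shown per instruction box

def wrap_text(text, font, max_width=BOX_WIDTH):
    """Greedily break text into lines that fit within max_width pixels."""
    lines, current = [], ""
    for word in text.split():
        candidate = (current + " " + word).strip()
        if font.size(candidate)[0] <= max_width:
            current = candidate
        else:
            lines.append(current)
            current = word
    if current:
        lines.append(current)
    return lines

def draw_progress_bar(screen, steps_done, total_steps, bar_h=8):
    """Draw a bar along the bottom edge showing how many steps are complete."""
    filled = int(SCREEN_W * steps_done / total_steps)
    pygame.draw.rect(screen, (60, 60, 60), (0, SCREEN_H - bar_h, SCREEN_W, bar_h))
    pygame.draw.rect(screen, (0, 200, 0), (0, SCREEN_H - bar_h, filled, bar_h))
```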
Tomorrow, I need to add the functionality of recipe experience levels (exp gained by finishing recipes) and display them on the profiles on the webapp. Diya is currently experiencing some git issues that have temporarily slowed progress, since the changes on her branch are not merging with the others (something to do with virtual environment issues), but we are resolving that, and then I will implement this functionality.
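For reference, a rough sketch of how the exp system might look; the thresholds, award formula, and names below are all hypothetical until we settle the webapp's data model:

```python
# Hypothetical exp model; thresholds and the award formula are
# placeholders until the webapp schema is finalized.
LEVEL_THRESHOLDS = [0, 100, 250, 500, 1000]  # exp required for each level

def exp_for_recipe(num_steps, difficulty=1.0):
    """Award exp proportional to recipe length and difficulty."""
    return int(10 * num_steps * difficulty)

def level_from_exp(exp):
    """Return the highest level whose threshold has been reached."""
    level = 0
    for i, threshold in enumerate(LEVEL_THRESHOLDS):
        if exp >= threshold:
            level = i
    return level
```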
We also discussed what we can implement after the demo, particularly what exactly we want to do with the networking feature.
I had the idea that, to better integrate the glasses with the webapp's networking features, we should add a feature that lets the user pick one other person to cook with before starting the recipe; once it starts, each user should be able to see the other's progress on the recipe live on the glasses. This will require some lower-level work with WiFi connections and networking, and will also require some work on the Arduino and I2C, so I'm excited to work on this after the interim demo.
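As a very rough sketch of the idea (and nothing more, since we have not designed the protocol yet), each pair of glasses could periodically send its step progress to the chosen partner; the address, port, and message format below are made up purely for illustration:

```python
import json
import socket

# Placeholder partner address; real pairing/discovery is future work,
# as is the Arduino/I2C side of the connection.
PARTNER_ADDR = ("192.168.1.42", 5005)

def broadcast_progress(steps_done, total_steps, sock=None):
    """Send this user's live recipe progress to the cooking partner over UDP."""
    sock = sock or socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    payload = json.dumps({"step": steps_done, "total": total_steps})
    sock.sendto(payload.encode(), PARTNER_ADDR)
```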
EDIT: Upon discussion today (Sunday, March 30th), we have decided that an analytics feature may be more useful and more focused on the user requirements and use case. More on this in the next report.
In addition, if the glasses display is able to handle the current HUD, Diya and I thought it would also be cool to display the recipe selection on the glasses. I will be in charge of this if it happens, which will be another task for after the interim demo.