Brian Lane’s Status Report for 9/20

Last week I further refined requirements and use cases for the project through meetings with team members and course staff.

My initial idea of augmented-reality document handling was pivoted, to better align with my groupmates' interests, into a system for interacting with a Windows desktop environment through hand movements and gestures.

Further, I did cursory research into potential sensors to accomplish this goal, including IMUs, infrared and ultrasonic sensors, and computer vision. When computer vision emerged as the most promising option, I began researching hand gesture datasets. This research was then added to my team's project proposal.

This week I will begin setting up a development environment for initial experiments, start designing the gesture recognition model, and begin adapting the aforementioned gesture dataset to our project.
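
As a rough sketch of what one of those initial experiments might look like, assuming I go with OpenCV for the webcam feed and MediaPipe's hand-landmark model (neither choice is final), the core loop would resemble something like this:

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames as BGR
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        results = hands.process(rgb)
        if results.multi_hand_landmarks:
            # 21 (x, y, z) landmarks per detected hand, normalized to [0, 1];
            # landmark 8 is the index fingertip, a natural candidate for cursor control
            tip = results.multi_hand_landmarks[0].landmark[8]
            print(f"index fingertip: ({tip.x:.2f}, {tip.y:.2f})")
cap.release()
```

The landmark coordinates produced by a loop like this could then be fed into whatever gesture recognition model we settle on, or compared against samples from the gesture dataset during adaptation.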
