Andrew Huang’s Status Report for 9/26

For this week, we worked on our proposal presentation and reviewed the feedback we received from the other team members as well as the professors. There is not too much to update on; we have each started our assigned tasks as outlined in our proposal schedule. I am personally on track with my tasks. I reviewed the MediaPipe image processing library and have a working demo of the hand tracking software. What remains is to familiarize myself with the library to the point where I can fully interact with it and port the code I have written into another application. Nothing else significant to report.
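As a rough illustration, below is a minimal sketch of the kind of hand tracking demo described above, assuming the Python MediaPipe Hands solution and an OpenCV webcam capture; the camera index, confidence threshold, and landmark drawing are placeholder choices rather than final design decisions.

# Minimal MediaPipe Hands tracking sketch (assumes mediapipe and opencv-python are installed).
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam; index is a placeholder
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB frames, while OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand_landmarks in results.multi_hand_landmarks:
                mp_drawing.draw_landmarks(frame, hand_landmarks, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("Hand Tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()

The landmark coordinates in results.multi_hand_landmarks are what we would eventually pass along to the other components when porting this into the full application.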

Brian Lane’s Status Report for 9/25

I spent this week researching model designs for gesture recognition, as well as doing further research into datasets, though I found none that would improve upon the data provided by the dataset referenced in our project proposal: https://data.mendeley.com/datasets/ndrczc35bt/1

For the model architecture, it seems the most common and effective approach would be a deep convolutional neural network (Deep CNN), as convolutional networks are highly effective at image classification tasks.
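As a concrete but non-final sketch, something like the following Keras model is the kind of Deep CNN under consideration; the input size, layer counts, and number of gesture classes are placeholders that would be settled during the upcoming hyperparameter experiments.

# Rough Deep CNN sketch for gesture classification (TensorFlow/Keras assumed).
# Input shape and NUM_CLASSES are placeholders, not final design choices.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10  # hypothetical number of gestures

model = models.Sequential([
    layers.Input(shape=(64, 64, 1)),          # e.g. 64x64 grayscale hand images
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])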

This places me well on schedule, as this week and next are slated for model design, and I have set myself up for some experimentation with model hyperparameters this week.

Further, I spent time watching other teams’ design proposals and providing criticism and feedback.

Team Status Report for 9/25

This week, our team gave our proposal presentation and reviewed the feedback later in the week. We started on our individual responsibilities outlined in the schedule: testing the functionality of the hand detection library, exploring datasets for gesture detection, and looking through documentation for the cursor integration. It seems like the hand detection library should work very well for us and give us all the information we need to encode hand position as well as, potentially, gestures. Currently, the biggest risk is still the integration we foresee between all of our components. There have been no changes to the design and no changes to the schedule as of yet, and we are still comfortably on schedule.

Alan Song’s Status Report for 9/25

This week, my focus for our project was mostly on reflecting on instructor feedback for our presentation and reading documentation for potential libraries we would use to implement the software side of our project. In class, I listened to other teams’ presentations and took note of the good points and questions that were raised. I think we have a lot to consider in building our project toward fulfilling more specific requirements. Other teams already seemed to have a good idea of what they would add in their design review, which gave us things to consider as well. I also looked into the mouse library in Python and the win32api library (see https://www.thepythoncode.com/article/control-mouse-python and https://codetorial.net/en/pywin32/mouse_control.html). Our project is currently on schedule, and in the next week we are going to finish up our design review and make final selections for every component of our project.
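As a small sanity check of these libraries, the snippet below sketches how cursor movement and clicks could be driven from Python on Windows with pywin32, following the linked tutorials; the coordinates and click sequence are purely illustrative, and the cross-platform mouse library offers similar one-liners.

# Cursor control sketch with pywin32 (win32api/win32con), per the linked tutorial.
import win32api
import win32con

# Move the cursor to an absolute screen position (coordinates are illustrative).
win32api.SetCursorPos((400, 300))

# Simulate a left click at the current cursor position.
win32api.mouse_event(win32con.MOUSEEVENTF_LEFTDOWN, 0, 0, 0, 0)
win32api.mouse_event(win32con.MOUSEEVENTF_LEFTUP, 0, 0, 0, 0)

# The mouse library covers similar ground cross-platform, e.g.:
#   import mouse
#   mouse.move(400, 300, absolute=True)
#   mouse.click("left")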

Brian Lane’s Status Report for 9/20

Last week I further refined requirements and use cases for the project through meetings with team members and course staff.

My initial idea of augmented reality document handling was pivoted, to be more in line with my groupmates’ interests, toward a system for interacting with a Windows desktop environment through user hand movements and gestures.

Further, I did cursory research into potential sensors to accomplish this goal, including IMUs, infrared or ultrasonic sensors, and computer vision. When computer vision emerged as the most promising option, I began research into hand gesture datasets. This research was then added to my team’s project proposal.

This week I will begin setting up a development environment for initial experiments, as well as start designing the gesture recognition model and adapting the aforementioned gesture dataset for our project.

Team Status Report for 9/18

This week, our team finalized our project idea, the Virtual Whiteboard. After submitting the abstract, we met with Professor Mukherjee and TA Tao Jin to discuss our project further. We talked about narrowing down the use case, including more detailed requirements, and the technology we could use to support our design. We discussed the possibilities of using IMUs, IR sensors, and CV. After that, our team started working on our Proposal slides for the upcoming presentation. We looked into further details on the specific equipment and datasets to use for our CV approach. The biggest risk in our project right now is making sure that the approach we chose (CV) ends up working. To ensure things go smoothly, we are continuing to plan out the development of our project through the proposal. The project is currently on schedule.

Andrew Huang’s Status Report for 9/18

This week, we as a team worked on streamlining the design ideas for our project by consulting with Professor Mukherjee and TA Tao Jin. We were able to narrow down our usage environments and had a long discussion about the types of external hardware and libraries that would be most useful for our design direction. Our consensus is to take a mainly CV-focused approach, so I started looking into CV hand detection implementations and datasets online, and the plan is to get a baseline hand tracking demo working with the built-in computer monitor sometime next week. We also plan on working on and completing the proposal presentation, and we are currently on schedule.

Alan Song’s Status Report for 9/18

This week, I mainly worked on refining the idea for our Virtual Whiteboard that we presented in our abstract. We met as a team with Professor Mukherjee and TA Tao Jin to discuss our idea. Based on the feedback, I continued to develop and narrow down our use case to essentially controlling the mouse in a desktop environment through hand gestures and motion. Our system would be able to move the mouse, left click, right click, and scroll just as a normal peripheral mouse would. Along with this, I also developed more justifications for our requirements. Additionally, I began to look into datasets that we could use with a CV approach for implementing our system. I will also continue to work on the Proposal slides with my team before the deadline. Our project is currently on schedule, and we plan on finishing the Proposal slides and giving the presentation next week, as well as finishing the Peer Review.