Alan’s Status Report for 12/4

This week and last week, I started system and component testing of our project and worked on and presented the Final Presentation for our group. For our metrics, I ran a number of trials of our click detection and jitter distance tests to contribute results to our group average. I also contributed code to increase the smoothness of mouse movement (by averaging positional inputs) and improve the practical detection of gestures (by adjusting the number of detections needed to execute a mouse function). In the coming week, I will continue to gather metrics for the other mouse functions and work on the final deliverables. I will keep tuning the parameters in our code to get the best measurements possible so that we can surpass our requirements. I will also recruit new users, such as my roommates, to participate in user testing. I am on schedule and should be able to meet the final deadlines and contribute toward a successful final demo.
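To make the two tweaks above concrete, here is a minimal sketch of how averaging positional inputs and requiring repeated detections might look; the window sizes, thresholds, and function names are illustrative assumptions rather than our exact code.

```python
from collections import deque

# Illustrative sketch only: WINDOW, REQUIRED_HITS, and these helpers
# are assumed names/values, not the exact parameters in our code.
WINDOW = 5          # how many recent positions to average for smoothing
REQUIRED_HITS = 3   # consecutive detections needed before acting

recent_positions = deque(maxlen=WINDOW)
consecutive_hits = 0

def smoothed_position(raw_x, raw_y):
    """Average the last few raw hand positions to damp jitter."""
    recent_positions.append((raw_x, raw_y))
    xs, ys = zip(*recent_positions)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def gesture_confirmed(detected):
    """Only trigger a mouse function after several detections in a row."""
    global consecutive_hits
    consecutive_hits = consecutive_hits + 1 if detected else 0
    return consecutive_hits >= REQUIRED_HITS
```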

Alan’s Status Report for 11/20

This week, I finished implementing quality-of-life changes for our current mouse functions as well as some test code for the remaining mouse functions such as right clicking and scrolling. I added a method of differentiating between clicking and holding by sampling gestures over time and deciding which action to take based on whether the click gesture was detected multiple times within a half-second window. I also experimented with other mouse functions such as right clicking (which works the same as left clicking) and scrolling (which uses mouse.wheel instead of mouse.move). mouse.wheel is interesting in that the scrolling sensitivity is determined by how often the function is called, so the change in the hand's y position is used to determine how often to call it rather than directly assigning a scroll "distance". Since we have a working system, the team and I will focus for now on gathering trade-offs and metrics to prepare for the final presentation after Thanksgiving break. I am on schedule and will turn my focus toward finishing the implementation of the other mouse functions once the gesture recognition model is more accurate, as well as noting trade-offs and collecting metrics for the final presentation.
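A rough sketch of both ideas follows, under one plausible reading of the sampling scheme; the sampling rate, thresholds, and helper names are assumptions for illustration, not our exact implementation.

```python
import time
import mouse

SAMPLE_WINDOW = 0.5  # assumed: seconds of gesture samples before deciding

def classify_click_or_hold(read_gesture, samples=10):
    """Sample the gesture detector over ~0.5 s; a gesture that persists
    across most samples is treated as a hold (drag), a brief one as a
    click. read_gesture is an assumed callback returning True when the
    click gesture is currently detected."""
    hits = 0
    for _ in range(samples):
        if read_gesture():
            hits += 1
        time.sleep(SAMPLE_WINDOW / samples)
    return "hold" if hits >= samples * 0.8 else "click"

def scroll_from_dy(dy, step=10):
    """mouse.wheel scrolls one notch per call, so a larger change in the
    hand's y position maps to more calls, not a larger distance."""
    for _ in range(abs(int(dy)) // step):
        mouse.wheel(1 if dy > 0 else -1)
```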

Alan’s Status Report for 11/13

This week, most of my work was done in preparation for the Interim Demo. We successfully demonstrated the mouse movement as planned, and we even made progress integrating the gesture recognition model to allow for left clicking and dragging. While Brian continues to refine the model, I will work on implementing the other mouse operations such as right clicking and scrolling. Right now I am making quality-of-life changes to the mouse functions. One example concerns clicking versus dragging. Due to the nature of the mouse module functions, if the gesture for holding the mouse is input continuously, it is difficult to differentiate between a user who wants to drag something around for a short while and a user who simply wants to click. I am implementing code that differentiates between these by sampling gesture inputs over a short period of time. This will be paired with a sort of "cooldown timer" on mouse clicking, which prevents the module from sending spurious repeated click calls as it continually detects the click gesture, and instead allows the user to click about once every half second. These changes are being made to ensure a smoother user experience. I am currently on track with the schedule.
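For reference, the cooldown idea can be sketched in a few lines; the half-second interval and the names below are assumptions, not final values:

```python
import time
import mouse

COOLDOWN = 0.5    # assumed minimum time between clicks, in seconds
_last_click = 0.0

def click_with_cooldown():
    """Fire a click only if enough time has passed since the last one,
    so a continuously detected click gesture triggers a single click
    instead of a rapid stream of them."""
    global _last_click
    now = time.time()
    if now - _last_click >= COOLDOWN:
        mouse.click('left')
        _last_click = now
```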

Alan’s Status Report for 11/6

This week, I continued to develop the mouse movement module and worked on a calibration module to prepare for the Interim Demo. The mouse movement module was updated to allow different tracking sensitivities for users at different distances from our input camera. Additionally, if the computer vision fails to detect the hand at any point, the cursor now stays in place, and movement resumes relative to wherever the hand is next detected instead of jumping to the new location. This update video shows the new mouse movement at a larger distance. As seen in the video, even from across my room, around 8-10 feet from the camera, the cursor can still precisely navigate over the minimize, resize, and exit buttons in VSCode. The cursor also stays in place whenever the hand is lost and resumes relative movement once it is re-detected.

Even with the poor webcam quality, which causes some missed hand detections, the cursor motion is still reasonably smooth and does not jump around in an unwanted manner.
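The hold-in-place behavior can be sketched roughly as follows, assuming a detection loop that yields the hand position or None when the hand is lost (the names and sensitivity value are illustrative):

```python
import mouse

SENSITIVITY = 2.0   # assumed hand-to-cursor movement scaling
prev_hand = None    # last (x, y) hand position; None while undetected

def update_cursor(hand_pos):
    """Move the cursor relative to hand motion. If detection drops and
    later resumes somewhere else, the new position becomes a fresh
    reference point, so the cursor never jumps."""
    global prev_hand
    if hand_pos is None:        # hand lost: leave the cursor in place
        prev_hand = None
        return
    if prev_hand is not None:   # move only once we have a reference
        dx = (hand_pos[0] - prev_hand[0]) * SENSITIVITY
        dy = (hand_pos[1] - prev_hand[1]) * SENSITIVITY
        mouse.move(dx, dy, absolute=False)
    prev_hand = hand_pos
```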

For the demo, I will also have a calibration module ready that automatically adjusts the sensitivity for each user based on their maximum range of motion within the camera's field of view. Currently, I am on schedule and should be ready to demonstrate everything we have planned for the Interim Demo.
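One way such a calibration step might work, sketched under the assumption that the user sweeps their hand across their comfortable range while positions are recorded (the screen resolution and names are placeholders):

```python
SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution

def calibrate(recorded_positions):
    """Derive per-axis sensitivity so the user's observed range of
    motion maps onto the full screen."""
    xs = [p[0] for p in recorded_positions]
    ys = [p[1] for p in recorded_positions]
    range_x = max(xs) - min(xs) or 1  # avoid division by zero
    range_y = max(ys) - min(ys) or 1
    return SCREEN_W / range_x, SCREEN_H / range_y
```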

Alan’s Status Report for 10/30

This week, I first created an OS Interface document to help plan for the OS Interface implementation while waiting on progress with other modules.

Here is the OS Interface document.

Andrew finished enough pose estimation code for me to start developing the mouse movement portion of the OS Interface with actual camera input data. I started using the pose estimation data with my mouse movement test code to work on mouse movement.

One problem that I found, and am currently working on fixing, is that if the pose estimation stops detecting the hand and then detects it again at a different location, the cursor moves drastically. Instead, the cursor should stay in the same location and resume relative movement once the hand is detected again, rather than traveling from the previous location to the location of the new detection. I hope to fix this, and possibly improve the smoothness of cursor movement, before this coming Monday. I am on pace with our new schedule and hope to have a functioning mouse module, with movement tracking, hand detection, and the calibration feature, ready for the Interim Demo.

Alan’s Status Report for 10/23

Status Report for 10/16:

Last week, I mainly worked on writing the Design Review Report. I wrote the abstract and sections I, II, III, VI, and VII. I also wrote the OS Interface portions of sections IV and V. I formatted the report, added in my teammates' contributions, and then submitted it.

Status Report for 10/23:

This week, I met with Tamal early in the week to discuss the midsemester quantitative feedback that I received. We talked about the Design Review Report, where I learned that the biggest issue was the lack of figures, which is something we will add to the final report. Additionally, we talked about the future of the project and how I could adjust my own personal contributions. We agreed that the OS Interface portion of the project was not too big a challenge by itself, and that I should instead contribute more to the other two parts of the project (hand detection and gesture recognition). I was also tasked with creating an OS Interface document illustrating the finite state machine as well as the interface for each functional module in the OS Interface, including the calibration step and the running system state. I foresee some scheduling changes, most likely included in next week's status report, due to the adjustment of my responsibilities.

Alan’s Status Report for 10/9

This week, since I was ahead of my personal schedule and deadlines for the Design Review Report were approaching, most of my time went toward writing parts of the report. I learned a lot from the other groups' presentations and from the instructor feedback, and I have been incorporating suggestions and clarifying points about our project in the Design Review Report. In the next week, I will finish writing my portion of the report and then start working with Andrew or Brian to get their components of the project up and running before continuing with my own OS Interface portion.

Alan Song’s Status Report for 10/2

This week, I spent most of my time experimenting with different libraries and APIs for controlling the mouse cursor, as well as working on our Design Review slides. I tested the pywin32, pyautogui, and mouse modules for Python. I found that while pywin32 and pyautogui also provide the necessary functionality for moving and clicking the mouse, the mouse module was the easiest to work with and produced code that was easy to understand thanks to its clear function naming. Once I settled on the mouse library, I made some test programs for simple actions such as moving the cursor to absolute and relative positions based on input numeric data (which will eventually be position data received from the hand detection algorithm). The clicking and scrolling functionality was much simpler to get working and will require input from the gesture detection algorithm, transformed into simple numerical inputs, to trigger the different mouse actions. Additionally, I worked with the team on the Design Review slides, mostly fleshing out my own section about the mouse module but also participating in discussions about the overall design. I am still on schedule with my own work, but after this week's meeting with the professor and TA, I think we can modify our schedule to give harder tasks more time and easier tasks less, instead of assigning every task a one-week length. In the next week, I will likely assist Brian or Andrew with starting development on one of their components, since I am a bit ahead of my own schedule.
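The test programs were along these lines (an illustrative snippet, not the exact files; the coordinates are arbitrary):

```python
import mouse

# Simple smoke tests for the mouse module; the numeric inputs stand in
# for the position data that will eventually come from hand detection.
mouse.move(500, 300, absolute=True, duration=0.2)   # jump to a screen position
mouse.move(50, -20, absolute=False, duration=0.1)   # nudge relative to current spot
mouse.click('left')                                 # single left click
mouse.wheel(-1)                                     # scroll down one notch
```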

Alan Song’s Status Report for 9/25

This week, my focus for our project was mostly on reflecting on instructor feedback for our presentation and reading documentation for potential libraries we would use to implement the software side of our project. In class I listened to other presentations and took note of the good points and questions that were brought up. I think we have a lot to consider in building our project toward fulfilling more specific requirements. Other teams already seemed to have a good idea of what they would add in their design review, which gave us things to consider as well. I also looked into the mouse library for Python and the win32api library (https://www.thepythoncode.com/article/control-mouse-python and https://codetorial.net/en/pywin32/mouse_control.html). Our project is currently on schedule, and in the next week we are going to finish our design review and make final selections for every component of our project.

Alan Song’s Status Report for 9/18

This week, I mainly worked on refining the idea for the Virtual Whiteboard that we presented in our abstract. We met as a team with Professor Mukherjee and TA Tao Jin to discuss the idea. Based on their feedback, I narrowed our use case down to controlling the mouse in a desktop environment through hand gestures and motion: our system should be able to move the mouse, left click, right click, and scroll just like a normal peripheral mouse. Along with this, I developed stronger justifications for our requirements. Additionally, I began to look into datasets that we could use with a CV approach for implementing our system. I will also continue to work on the Proposal slides with my team before the deadline. Our project is currently on schedule, and we plan on finishing the Proposal slides, giving the presentation, and completing the Peer Review next week.