Recent Posts

Team Status Report 4/24/21

This week we worked on pen location tracking and the GUI. For the pen location tracking, we received a new IMU and tested different software approaches to get more accurate readings, as we were having issues with accuracy and orientation. We attempted …

Jenny Han’s Status Report 4/24/21

This week I worked on the GUI. I finished the calibration page, the main page, and the add-gesture page. I am currently working on letting users add gestures automatically through the GUI, and I plan to finish that feature by the end of …

Jade Wang’s Status Report for 4/24/2021

This week, I worked on automating the hand gesture recognizer so that users can add new gestures and map them to user macros. The program now takes in the new gesture readings and immediately trains the model on them. It then restarts automatically, so the only input the user provides is the gesture snapshots (entered via keyboard presses), a label for the new gesture, and the user macro to map it to. Jenny made progress on the GUI this week as well, so I was able to link the gesture recognizer with the GUI such that the user can start the program from the GUI.
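A rough sketch of the retrain-and-restart flow is below. The model type, file names, and helper functions here are illustrative assumptions, not our exact code:

```python
# Sketch of automated retraining: append the new snapshots, retrain a
# classifier from scratch, then re-exec the process to pick up the model.
import os
import sys
import pickle

import numpy as np
from sklearn.svm import SVC

DATA_FILE = "gesture_snapshots.npz"   # hypothetical storage location
MODEL_FILE = "gesture_model.pkl"

def retrain(new_samples, new_label):
    """Store the new gesture snapshots and retrain on everything."""
    if os.path.exists(DATA_FILE):
        stored = np.load(DATA_FILE)
        X = np.vstack([stored["X"], new_samples])
        y = np.concatenate([stored["y"], [new_label] * len(new_samples)])
    else:
        X = np.asarray(new_samples)
        y = np.asarray([new_label] * len(new_samples))
    np.savez(DATA_FILE, X=X, y=y)

    model = SVC(kernel="rbf")          # assumed model; the post doesn't say
    model.fit(X, y)
    with open(MODEL_FILE, "wb") as f:
        pickle.dump(model, f)

def restart():
    """Re-exec the current script so the recognizer reloads the new model."""
    os.execv(sys.executable, [sys.executable] + sys.argv)
```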

Next week, I will work with Jenny to finish linking the GUI and the recognizer so that the user can control the recognizer entirely through the GUI and keyboard inputs. Below are screenshots of our GUI. Pressing the ‘start’ button immediately starts the recognizer in training mode so the user can enter hand gesture snapshots via keyboard commands.
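A minimal sketch of how the ‘start’ button can launch the recognizer as a separate process (the script name and flag below are placeholders, not our real command line):

```python
# Launch the recognizer from a Tkinter button without blocking the GUI.
import subprocess
import sys
import tkinter as tk

def start_recognizer():
    # Run the recognizer in its own process so the GUI stays responsive.
    subprocess.Popen([sys.executable, "recognizer.py", "--train"])

root = tk.Tk()
root.title("Gesture Recognizer")
tk.Button(root, text="Start", command=start_recognizer).pack(padx=20, pady=20)
root.mainloop()
```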

Jenny Han’s Status Report for 4/17/2021

This week I worked on the GUI for the calibration and gesture recognition pages using Tkinter. I also researched SciKit Kinematics further to understand how to integrate the new IMU with the library’s functions. Next week I will finish the GUI and link the …
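The core computation a library like SciKit Kinematics handles for us is integrating gyroscope rates into an orientation estimate. A minimal numpy sketch of that step, assuming a (w, x, y, z) quaternion convention and a made-up sample period:

```python
# Integrate body-frame gyroscope rates (rad/s) into a unit quaternion.
import numpy as np

def integrate_gyro(q, omega, dt):
    """Advance unit quaternion q by angular rate omega over time step dt."""
    wx, wy, wz = omega
    # Quaternion kinematics: q_dot = 0.5 * Omega(omega) @ q
    Omega = 0.5 * np.array([
        [0.0, -wx, -wy, -wz],
        [wx,  0.0,  wz, -wy],
        [wy, -wz,  0.0,  wx],
        [wz,  wy, -wx,  0.0],
    ])
    q = q + Omega @ q * dt
    return q / np.linalg.norm(q)   # renormalize to stay a unit quaternion

q = np.array([1.0, 0.0, 0.0, 0.0])            # identity orientation (w, x, y, z)
q = integrate_gyro(q, omega=(0.0, 0.0, 0.1), dt=0.01)  # assumed 100 Hz sampling
```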

Interim Demo

Attached are the Interim Demo presentation and our new Gantt chart: Interim Demo

Bradley’s Status Report for 4/10/21

This week, I worked more on the pen tracking algorithm as well as its integration with mouse movement. After observing that the Kalman filter was quite slow and did not produce noticeably more accurate results, I removed it for now. I also observed that PyAutoGUI takes around 0.1–0.13 seconds to move the cursor, which is much too slow and does not keep up with the rate at which the Arduino sends data. I replaced PyAutoGUI with pywin32, which sped up cursor movement by about 10x. A current flaw in our code is that it does not yet account for gyroscopic rotation, so tracking only works when the pen is held in one orientation. To solve this, we decided we needed a more fully featured IMU; we settled on the BNO055, ordered it, and are waiting for it to arrive. I also researched algorithms that use the gyroscope readings to keep track of orientation.
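A rough sketch of how such a timing comparison can be set up (illustrative, not our exact harness; win32api.SetCursorPos is Windows-only, and PyAutoGUI’s per-call overhead includes its default safety pause):

```python
# Compare per-call cursor move latency: PyAutoGUI vs the raw Win32 API.
import time

import pyautogui
import win32api  # from the pywin32 package; Windows only

N = 100

start = time.perf_counter()
for _ in range(N):
    pyautogui.moveTo(500, 500)
print("pyautogui:", (time.perf_counter() - start) / N, "s per move")

start = time.perf_counter()
for _ in range(N):
    win32api.SetCursorPos((500, 500))
print("pywin32:  ", (time.perf_counter() - start) / N, "s per move")
```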

Jade’s Status Report for 4/10/2021

This week, I worked on both the pen tracking algorithm and the hand gesture recognition algorithm. For the pen tracking algorithm, I tried out a Kalman filter on the acceleration readings from the MPU6050. I tried various parameters, but it seemed like the Kalman filter …
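For reference, a minimal one-dimensional Kalman filter of the kind experimented with here, applied per axis with a constant-value model. The process and measurement noise values are the parameters being tuned; the numbers below are illustrative guesses:

```python
# 1-D Kalman filter over a stream of scalar accelerometer readings.
def kalman_1d(measurements, q=1e-3, r=0.05):
    """Filter scalar readings; q is process noise, r is measurement noise."""
    x, p = 0.0, 1.0          # state estimate and its variance
    out = []
    for z in measurements:
        p += q               # predict: variance grows by process noise
        k = p / (p + r)      # Kalman gain
        x += k * (z - x)     # update: pull estimate toward the measurement
        p *= (1.0 - k)       # shrink variance after the update
        out.append(x)
    return out

print(kalman_1d([0.02, -0.01, 0.03, 0.00, 0.05]))
```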

Team Status Report for 4/10/2021

This week we worked on the pen tracking software, the gesture recognition, the GUI for the project, and linking gestures with macros. For the pen tracking software, we updated the double integration so it integrates over a window of past values. This wasn’t …

Jenny Han’s Status Report for 4/10/2021

This week I worked on the pen tracking algorithm. I found an error in our implementation and updated it to integrate over the last 10 data points collected by the IMU. This still had some inaccuracy, so we decided to get a new IMU (it arrived today) and will work with it more next week.
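A sketch of the windowed double integration described above. The 10-sample window matches the report; the sample period is an assumed value:

```python
# Double-integrate acceleration over a sliding window of IMU samples.
from collections import deque

import numpy as np

WINDOW = 10   # integrate over the last 10 IMU samples, per the report
DT = 0.01     # assumed sample period in seconds

accel_window = deque(maxlen=WINDOW)

def step(new_accel):
    """Add a reading and return displacement over the current window."""
    accel_window.append(new_accel)
    a = np.asarray(accel_window)
    if len(a) < 2:
        return 0.0
    # First integration: acceleration -> velocity (cumulative trapezoids)
    v = np.concatenate([[0.0], np.cumsum((a[1:] + a[:-1]) * DT / 2.0)])
    # Second integration: velocity -> displacement over the window
    return float(np.sum((v[1:] + v[:-1]) * DT / 2.0))
```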

Second, I worked on finding an OpenCV-based solution for the pen tracking. Given the unreliability of the pen’s accelerometer data, combining it with a computer vision element would make the overall implementation more reliable, since there would be more inputs. However, I had some trouble finding a reliable open-source option; I looked into both MediaPipe and OpenCV. If I cannot find a reliable object tracking approach, we could point a laser at the wall instead of tracking the pen itself for a similar effect, as sketched below.
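As an illustration of the laser-dot idea, one simple OpenCV approach is HSV color thresholding; the color bounds below are placeholder values for a bright red dot and would need tuning:

```python
# Track a bright laser dot by HSV thresholding and centroid extraction.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
lower = np.array([0, 120, 200])     # assumed HSV range for a bright red dot
upper = np.array([10, 255, 255])

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)
    m = cv2.moments(mask)
    if m["m00"] > 0:                # centroid of the thresholded region
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        print("dot at", int(cx), int(cy))
    cv2.imshow("mask", mask)
    if cv2.waitKey(1) == 27:        # Esc to quit
        break
cap.release()
```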

Finally, I worked on the GUI for the project; I am using Tkinter for the calibration, about, home, and help pages.
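A minimal sketch of the multi-page Tkinter layout; the page names follow the post, while the widget details and window title are illustrative:

```python
# Multi-page Tkinter GUI: one frame per page, raised on demand.
import tkinter as tk

PAGES = ["Home", "Calibration", "About", "Help"]

root = tk.Tk()
root.title("Pen Tracker")  # placeholder title

container = tk.Frame(root)
container.pack(side="top", fill="both", expand=True)

frames = {}
for name in PAGES:
    frame = tk.Frame(container)
    frame.grid(row=0, column=0, sticky="nsew")   # stack all pages
    tk.Label(frame, text=f"{name} page").pack(pady=40)
    frames[name] = frame

def show(name):
    frames[name].tkraise()   # bring the requested page to the front

nav = tk.Frame(root)
nav.pack(side="bottom")
for name in PAGES:
    tk.Button(nav, text=name, command=lambda n=name: show(n)).pack(side="left")

show("Home")
root.mainloop()
```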

Bradley Zhou’s Status Report for 4/3/21

This week, I worked more on the pen tracking algorithm. I implemented a multithreaded producer/consumer queue model for the software: the producer is the Arduino sending accelerometer/gyroscope readings, and the consumer takes those readings and decides where to move the cursor. We ran into the expected problem …
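A sketch of that producer/consumer model; the serial port name, baud rate, and line format are assumptions:

```python
# Producer reads IMU lines from the Arduino; consumer drains the queue.
import queue
import threading

import serial  # pyserial

readings = queue.Queue()

def producer():
    """Read 'ax,ay,az,gx,gy,gz' lines from the Arduino and enqueue them."""
    ser = serial.Serial("COM3", 115200)   # assumed port and baud rate
    while True:
        line = ser.readline().decode(errors="ignore").strip()
        try:
            readings.put([float(v) for v in line.split(",")])
        except ValueError:
            pass   # skip malformed or partial lines

def consumer():
    """Take readings off the queue and decide where to move the cursor."""
    while True:
        sample = readings.get()
        # ... update the position estimate and move the cursor here ...
        readings.task_done()

threading.Thread(target=producer, daemon=True).start()
threading.Thread(target=consumer, daemon=True).start()
```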