Recent Posts

Bradley’s Status Report for 5/8/21

This week, I spent time improving the pen tracking algorithm by taking acceleration data into consideration. I used the same methodology as our previous approach (application of a rotation matrix + double integration) but limited it to a much smaller time window. The acceleration informs …
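A minimal sketch of what this windowed double integration might look like (function names and the gravity convention are illustrative assumptions, not our exact code; it presumes NumPy/SciPy and a per-sample orientation quaternion from the IMU):

```python
import numpy as np
from scipy.spatial.transform import Rotation

GRAVITY = np.array([0.0, 0.0, 9.81])  # m/s^2, world frame (sign is an assumption)

def displacement_over_window(accels, quats, dt):
    """Double-integrate a short window of accelerometer samples.

    accels: (N, 3) raw accelerations in the sensor frame (m/s^2)
    quats:  (N, 4) orientation quaternions (x, y, z, w), one per sample
    dt:     sample period in seconds
    Returns the estimated displacement vector over the window.
    """
    # Rotate each sample into the world frame, then remove gravity.
    world = Rotation.from_quat(quats).apply(accels) - GRAVITY
    # First integration: acceleration -> velocity (assumes v0 = 0,
    # which is why keeping the window short matters).
    vel = np.cumsum(world, axis=0) * dt
    # Second integration: velocity -> displacement.
    return np.sum(vel, axis=0) * dt
```

Keeping the window short limits how far the integration can drift before the estimate is re-anchored by the other inputs.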

Jade’s Status Report for 5/8/2021

This week, I worked with Bradley and Jenny to refine our pen tracking algorithm such that it incorporated the accelerometer data. We decided to use a weighted approach since the accelerometer data exaggerates any errors in distance calculation. I also worked with Jenny to add …

Team Status Report for 5/8/2021

This week we improved our pen tracking algorithm by incorporating acceleration data into the pipeline. Professor Kim suggested using a weighted algorithm, and by tuning the weights of the acceleration estimate vs. the OpenCV/gyroscope outputs we were able to fold the acceleration data into our algorithm.
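In essence, the fusion is a weighted average of two position estimates. A toy sketch (the function name and default weight are illustrative, not our actual tuning):

```python
def fuse_positions(p_accel, p_cv, w_accel=0.2):
    """Blend the acceleration-derived (x, y) position estimate with the
    OpenCV/gyroscope estimate. A small accelerometer weight keeps its
    exaggerated errors from dominating the fused result."""
    w_cv = 1.0 - w_accel
    return (w_accel * p_accel[0] + w_cv * p_cv[0],
            w_accel * p_accel[1] + w_cv * p_cv[1])
```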

We also finished the rest of our GUI and completed the calibration code.

In terms of our GUI, we added a delete button so that the user can remove the custom hand gestures they have added. Additionally, we completed the code for our calibration mode and linked it with the GUI.
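For illustration only (our GUI toolkit and data structures may differ), a delete button of this kind could be wired up in Tkinter roughly like this:

```python
import tkinter as tk

def delete_selected(listbox, gestures):
    """Remove the selected custom gesture from the list and the mapping."""
    selection = listbox.curselection()
    if not selection:
        return
    name = listbox.get(selection[0])
    gestures.pop(name, None)      # drop the gesture -> macro mapping
    listbox.delete(selection[0])  # refresh the displayed list

root = tk.Tk()
gestures = {"thumbs_up": "copy", "fist": "paste"}  # placeholder mappings
box = tk.Listbox(root)
for name in gestures:
    box.insert(tk.END, name)
box.pack()
tk.Button(root, text="Delete",
          command=lambda: delete_selected(box, gestures)).pack()
root.mainloop()
```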

We also spent a fair amount of time soldering our parts onto a PCB so that the circuit would no longer have to live on a breadboard. However, we were not able to finish this task due to a sudden COVID case in our vicinity.

Finally, we worked on our logistical assignments to prepare for the upcoming week as well — the poster, our final report, and our demo video.

Jenny Han’s Status Report for 5/8/2021

This week we refined our pen tracking algorithm, finalized the GUI, and completed the calibration code. For the pen tracking algorithm, we were able to combine acceleration data, orientation data, and hand tracking from OpenCV to get a reasonable mouse position. After meeting with our …

Final Presentation

Jade Wang’s Status Report for 5/1/2021

This week, I finished fully automating the process of adding a new gesture with the GUI. The GUI page now refreshes automatically to display the newly added hand gesture and the user macro that it maps to.

Additionally, we decided to alter our approach for the pen input because the pen sensor readings were not accurate enough, despite several attempts at refining the data with different integration methods, matrices, filters, etc. We decided on an OpenCV approach in which we track the hand via camera to get a more precise position of the hand holding the pen. Thus, this week I used OpenCV to detect the hand and compute its center coordinates. In this way, we can use those coordinates along with the gyroscope/orientation data to get an even more precise measurement of where the tip of the pen is.
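One plausible way to get that hand center with OpenCV (a sketch, not our exact pipeline; the skin-tone HSV range is an assumption that would need tuning for real lighting):

```python
import cv2

def hand_center(frame):
    """Return the (x, y) center of the largest skin-colored blob in a
    BGR frame, or None if no hand-sized contour is found."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Rough skin-tone range in HSV; highly lighting-dependent.
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    # Centroid of the contour, in pixel coordinates.
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
```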

Next week, we will try to incorporate accelerometer data into the pen + OpenCV data to hopefully get even more precise coordinates. Please look at Jenny’s status report for a demo of the hand tracking and gyroscope data mapping to mouse movement.

Team Status Report for 5/1/21

This week, we worked on a new approach for tracking the position of the cursor/pen. Instead of relying purely on accelerometer readings, we decided to use OpenCV to track the user’s hand position and combine it with orientation data from the pen to refine the …

Bradley Zhou’s Status Report for 5/1/21

This week I focused more on the orientation of the pen to better estimate the position of the cursor. I used the orientation to indicate which direction the “tip” of the pen would be facing relative to the user’s hand, which presumably would …
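One simple way to realize this idea is to offset the tracked hand center along the pen’s heading. A hypothetical sketch (the pixel length and the yaw-only projection are assumptions for illustration):

```python
import math

PEN_LENGTH_PX = 120  # assumed screen-space distance from hand to pen tip

def pen_tip(hand_center, yaw_deg):
    """Offset the OpenCV hand center toward the pen tip using the
    IMU yaw (heading) angle. Purely illustrative geometry."""
    x, y = hand_center
    yaw = math.radians(yaw_deg)
    # Screen y grows downward, hence the minus sign.
    return (x + PEN_LENGTH_PX * math.cos(yaw),
            y - PEN_LENGTH_PX * math.sin(yaw))
```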

Jenny Han’s Status Report 5/1/2021

This week I worked on finishing the automation of hand gestures in the GUI, pen tracking with OpenCV, fine-tuning the GUI, and improving the accuracy of the pen. We ended up changing our plan for pen tracking to include OpenCV hand tracking, which improved the accuracy of our mouse tracking algorithm.

Here are a few demos:

Demo for pen clicking: https://drive.google.com/file/d/1qb6ZgsfIvp1ET2bqQAoe1I6CgzHnUK-n/view?usp=sharing

Demo for hand tracking: https://drive.google.com/file/d/1LxP3QMM8vi1FtG5JySy1Php_KzaJSKU7/view?usp=sharing

Next week I will work on finishing the calibration page (for mouse position and orientation), the final presentation, and improving the accuracy of the pen with a weighted input (OpenCV hand tracking, IMU orientation data, and IMU acceleration data).

Bradley Zhou’s Status Report for 4/24/21

This week, I spent time toying with the new IMU we received, the BNO055. The benefit of this IMU is that it features a magnetometer and more accurate orientation readings. We had originally planned on using this new IMU alongside a software library, scikit-kinematics, to …
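Reading the BNO055’s fused outputs is straightforward with, for example, Adafruit’s CircuitPython driver (an illustration of what the sensor exposes, not necessarily how our code reads it):

```python
import board
import adafruit_bno055

i2c = board.I2C()
sensor = adafruit_bno055.BNO055_I2C(i2c)

# Readings may be None until the sensor finishes calibrating.
heading, roll, pitch = sensor.euler      # fused orientation, degrees
quat = sensor.quaternion                 # same orientation as a quaternion
ax, ay, az = sensor.linear_acceleration  # gravity-compensated, m/s^2
mx, my, mz = sensor.magnetic             # magnetometer, microteslas
print(heading, roll, pitch)
```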