Jade Wang’s Status Report for 5/1/2021

This week, I finished fully automating the process of adding a new gesture with the GUI. The GUI page now refreshes automatically to display the newly added hand gesture and the user macro that it maps to.

Additionally, we decided to change our approach for the pen input because the pen sensor readings were not accurate enough, despite several attempts at refining the data with different integration methods, matrices, filters, etc. We settled on an OpenCV approach in which we track the hand with a camera to get a more precise position of the hand holding the pen. This week, I used OpenCV to detect the hand in each frame and compute its center coordinates. These coordinates can then be combined with the gyroscope/orientation data to get an even more precise estimate of where the tip of the pen is.
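
Below is a minimal sketch of the kind of hand-center detection described above, assuming skin-color segmentation in HSV space; the color bounds, camera index, and function name `hand_center` are placeholder assumptions, not our exact pipeline.

```python
import cv2
import numpy as np

# Assumed HSV skin-color bounds; these would need tuning for our camera and lighting.
LOWER_SKIN = np.array([0, 30, 60], dtype=np.uint8)
UPPER_SKIN = np.array([20, 150, 255], dtype=np.uint8)

def hand_center(frame):
    """Return (cx, cy) of the largest skin-colored contour, or None if no hand is found."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
    # Remove small speckles before looking for the hand contour.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    # Centroid of the contour serves as the hand's center coordinates.
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)  # assumed camera index
while True:
    ok, frame = cap.read()
    if not ok:
        break
    center = hand_center(frame)
    if center is not None:
        cv2.circle(frame, center, 8, (0, 255, 0), -1)  # mark the detected center
    cv2.imshow("hand tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

The center coordinates returned here are what would be fused with the gyroscope/orientation readings to estimate the pen-tip position.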

Next week, we will try to incorporate accelerometer data alongside the pen and OpenCV data to get even more precise coordinates. Please see Jenny’s status report for a demo of the hand tracking and of the gyroscope data being mapped to mouse movement.


