This week I worked on two goals: creating a tap & tap-loading animation in OpenCV, and making the finger-detection API friendlier for Connor's UI.

The first goal has gone smoothly, although there is some jitter in the animation caused by small fluctuations in the recognized fingertip position. I will work on reducing this next week and have a couple of ideas: anchoring the animation to a set point captured at the start and ignoring small shifts, or taking a running weighted average of the fingertip position to damp jitter while keeping the animation reasonably accurate (a sketch of the averaging idea is included below).

For the second goal, I spent a lot of time reading about OOP in Python, as I have not explicitly used Python to create and interact with user-defined classes before. I have also been researching how the webcam feed and OpenCV overlays can be integrated into a PyGame interface to inform my decisions about the API (see the second sketch below). Mainly, I broke what was previously a single main-function loop, used to demonstrate and visualize the finger-detection performance, into a few separate functions that each represent an individual "transaction" with the system (request/response for a tap, initialization/calibration). This coming week I will continue learning about OOP patterns in Python and refactor the code so the module can be created and accessed as an object from an orchestrator back-end (a rough sketch of that interface is at the end).
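For the running-weighted-average idea, an exponential moving average over the fingertip position is probably the simplest starting point. The sketch below assumes the detector reports one (x, y) fingertip per frame; the class name and parameters are placeholders, not part of the existing code.

```python
# Minimal sketch of the running-weighted-average idea, assuming the detector
# returns an (x, y) fingertip position each frame. Names are hypothetical.

class FingertipSmoother:
    """Exponential moving average to damp frame-to-frame jitter."""

    def __init__(self, alpha=0.3):
        # alpha closer to 1.0 tracks the raw position more tightly (less lag,
        # more residual jitter); closer to 0.0 smooths more aggressively.
        self.alpha = alpha
        self._smoothed = None

    def update(self, raw_position):
        x, y = raw_position
        if self._smoothed is None:
            self._smoothed = (x, y)
        else:
            sx, sy = self._smoothed
            self._smoothed = (
                self.alpha * x + (1 - self.alpha) * sx,
                self.alpha * y + (1 - self.alpha) * sy,
            )
        return self._smoothed
```

The alpha value would need tuning against the actual animation: a higher value keeps the loading indicator responsive but lets more jitter through, while a lower value trades accuracy for stability.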
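For the webcam feed + OpenCV overlay inside PyGame, the usual approach is to convert each BGR frame to RGB and wrap it in a PyGame surface. The sketch below is a minimal standalone loop under those assumptions; the window size and camera index are arbitrary, and in the real UI the frame would come from the detection module (with overlays already drawn) rather than directly from cv2.VideoCapture.

```python
# Minimal sketch: show OpenCV webcam frames in a PyGame window.
import cv2
import numpy as np
import pygame

pygame.init()
cap = cv2.VideoCapture(0)                 # camera index is a placeholder
screen = pygame.display.set_mode((640, 480))

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    ok, frame = cap.read()
    if not ok:
        continue

    # OpenCV frames are BGR with shape (rows, cols, 3); PyGame surfaces expect
    # RGB ordered (width, height), so convert colours and swap the axes.
    frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    frame = cv2.resize(frame, (640, 480))
    surface = pygame.surfarray.make_surface(np.transpose(frame, (1, 0, 2)))

    screen.blit(surface, (0, 0))
    pygame.display.flip()

cap.release()
pygame.quit()
```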
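Finally, a rough sketch of what the transaction-style object API might look like after the refactor. The class name, method names, and return values are hypothetical placeholders for the interface Connor's back-end would call, not the finished design.

```python
# Hypothetical shape of the transaction-style API (names are placeholders).

class FingerDetector:
    """Wraps the finger-detection code so the back-end can own one instance."""

    def __init__(self, camera_index=0):
        self.camera_index = camera_index
        self._calibrated = False

    def calibrate(self):
        # Initialization/calibration transaction: sample the scene and tune
        # thresholds before any taps are requested.
        self._calibrated = True

    def request_tap(self, timeout=5.0):
        # Request/response transaction: wait until a tap is detected or the
        # timeout expires, then return the tap position to the caller.
        if not self._calibrated:
            raise RuntimeError("calibrate() must be called first")
        # ... detection loop would go here ...
        return None  # e.g. (x, y) of the detected tap, or None on timeout

    def release(self):
        # Free the webcam and any OpenCV windows when the UI shuts down.
        pass
```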