I spent most of this past week creating a test that mimics a simplified version of our project. In MATLAB, I made a short video of a small red dot moving in a roughly square path over four colored rings (a still frame of the video is shown below, as I am not sure how to upload a GIF here).
This roughly emulates the behavior of the tip of a drumstick (which we plan to paint red or another bright color) moving over the drum rings. It is not exact; the main goal was just to make the dot move around so that I could later figure out how to detect it with CV. I also made the proportions approximately match the real drum rings we will be using.
Then, in VSCode, I wrote a short program using HoughCircles and other NumPy and OpenCV functions to read in and process the video, then output one where the red dot is detected in every frame. The detection is indicated by drawing a small neon-blue dot over the targeted red one. You can also pause the video by pressing the spacebar to step through and analyze a given frame, or press ‘q’ to close the output window.
Since the main task for this past week was to work on the computer vision code to detect rings, I would say that I am on track.
In the next week, I would like to measure how long it takes to detect the red dot in each frame, which will give us a better idea of the latency range we can expect when processing the live video feed from the camera in the real-world implementation. I also want to get started on the sliding window that will house a preset number of the most recent frames from the live feed. Eventually, locating the drumstick tip in each of these frames will help determine which drum sound to play when an accelerometer spike (from a hit-like motion with the drumstick) is detected.
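A minimal sketch of both of those next steps together, using a stand-in for the actual detection call and a hypothetical `WINDOW_SIZE`. The real version would wrap the HoughCircles detection in the timing code and append each live frame's result to the window.

```python
import time
from collections import deque

WINDOW_SIZE = 10  # hypothetical number of recent frames to keep

# deque with maxlen drops the oldest entry automatically on append
recent = deque(maxlen=WINDOW_SIZE)

def process(frame_id):
    """Stand-in for per-frame detection; returns (frame_id, latency_ms)."""
    t0 = time.perf_counter()
    # ... the real detect_red_dot(frame) call would go here ...
    latency_ms = (time.perf_counter() - t0) * 1000.0
    return frame_id, latency_ms

# Simulate 25 incoming frames; only the newest WINDOW_SIZE survive
for i in range(25):
    recent.append(process(i))

print(len(recent), recent[0][0], recent[-1][0])  # → 10 15 24
```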