Taj’s Weekly Reports

9/20

This week my team presented our proposal on Monday. Outside of preparing for that, I spent time learning how theremins function, since prior to a week ago I had never heard of or seen the instrument. Specifically, I looked at how they use electromagnetic fields and antenna capacitance to generate sound. In parallel, I also worked on understanding chords from a musical perspective, since I do not have prior experience with string instruments.

Additionally, I collaborated with my teammates to continue refining our use case, which is evolving as we receive more input from Jocelyn at the School of Music. Jocelyn’s more computer-vision-oriented approach pushed me to start researching Python libraries for CV applications. In particular, I focused on learning the basics of OpenCV, with the goal of exploring whether it can be used to detect finger positions via a camera and map those movements to chord recognition.

 

9/27

This week, on Wednesday, our team met with Dr. Dueck, the rest of the students in the Engineering Creative Interactions: Music + AI course, and John Cohn. For my part, I had a conversation with John about the hardware aspects of our project. He strongly supported the idea of using one hand for strumming with an IMU and the other hand with a computer vision approach for chord selection. He also recommended several hardware options he had previously worked with that could fit our needs, including:

  • ESP32 + IMU
  • M5Stack (ESP32-C3 + IMU)
  • ESP32 with capacitive touch inputs (5 channels)
  • MPR121 sensor

Following that discussion, I spent time researching these modules and additional sensors for movement tracking. My goal is to finalize options and place an order by early next week.

In parallel, I have also been contributing to our design presentation slides. To improve upon our proposal presentation, I am focusing on clarifying specific metrics and structuring the flow more effectively. In particular, I am ensuring that the presentation builds toward our solution and MVP more gradually, since in our previous presentation we introduced them too early.

 

10/4

This week, I focused on preparing for our design presentation on Monday, which I delivered to the class. The presentation required extra effort since we recently shifted our project to incorporate computer vision, a change that significantly altered our design approach and testing plans.

Following the presentation, I began exploring repositories and tutorials related to interfacing the MPU6050 IMU with an STM32 microcontroller. Since we plan to use the I²C protocol for communication, I concentrated specifically on I²C-focused resources. Two particularly helpful references were this GitHub repository and this tutorial. These resources will prepare me to efficiently collect and process data from the IMU once it arrives.
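
To cement the transaction pattern before the hardware arrives, I wrote out a minimal read loop. It is a sketch in Arduino C++ rather than STM32 HAL code, purely for brevity; the register addresses come from the MPU6050 datasheet, while the wiring and rates are assumptions:

    #include <Wire.h>

    const uint8_t MPU_ADDR = 0x68;  // MPU6050 I2C address with AD0 low

    void setup() {
      Serial.begin(115200);
      Wire.begin();
      // The MPU6050 powers up asleep: clear PWR_MGMT_1 (0x6B) to wake it.
      Wire.beginTransmission(MPU_ADDR);
      Wire.write(0x6B);
      Wire.write(0x00);
      Wire.endTransmission();
    }

    void loop() {
      // Burst-read 14 bytes from ACCEL_XOUT_H (0x3B): accel XYZ, temp,
      // gyro XYZ, each a 16-bit big-endian value.
      Wire.beginTransmission(MPU_ADDR);
      Wire.write(0x3B);
      Wire.endTransmission(false);            // repeated start, keep the bus
      Wire.requestFrom(MPU_ADDR, (uint8_t)14);
      int16_t a[3], g[3];
      for (int i = 0; i < 3; i++) a[i] = (Wire.read() << 8) | Wire.read();
      Wire.read(); Wire.read();               // skip the temperature bytes
      for (int i = 0; i < 3; i++) g[i] = (Wire.read() << 8) | Wire.read();
      Serial.print(a[0]); Serial.print(' ');  // raw counts; scale per config
      Serial.println(g[0]);
      delay(10);
    }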

 

10/18

For the week of Oct 11–Oct 18, I did not complete much project work because I was traveling with my family for fall break. In the prior week, I concentrated on the design report with a focus on hardware details we had not previously specified. I researched the power system and battery connectors, identified libraries and driver requirements for our two candidate IMUs, mapped the I²C wiring and address plan to the ESP32 pinout including pull-ups, and compared two touch-gate implementations: a capacitive touch sensor on I²C versus a force-sensing resistor feeding an ADC channel.

I also defined the on-device feature and packet schema. The ESP32 samples the active IMU at 120 Hz and polls the chosen gate at a higher rate. Each record carries a millisecond timestamp and a scalar strumming-speed feature S, computed on-device without full strum segmentation: for each sample, compute the angular velocity magnitude |ω| = sqrt(gx² + gy² + gz²), then take the RMS over a 50 ms window and normalize the result to the range [0, 1] for transmission. When the gate is active, the node streams IMU vectors plus S; when inactive, it pauses to save bandwidth and power.

The planned ESP-NOW packet schema is {ts_ms, imu: {ax, ay, az, gx, gy, gz}, gate_active, S}, where ts_ms is the capture timestamp in milliseconds, imu.ax/ay/az are linear accelerations in m/s², imu.gx/gy/gz are angular velocities in rad/s, gate_active is a boolean indicating whether the Trill or FSR gate is engaged, and S is the normalized strumming-speed feature. Here, the Trill is the capacitive touch sensor and the FSR is the Force Sensitive Resistor.
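
A minimal sketch of the feature computation and packet layout as I currently picture them (the 6-sample window follows from 120 Hz × 50 ms; the OMEGA_MAX normalization constant is a placeholder to be tuned):

    #include <math.h>
    #include <stdint.h>

    // Planned ESP-NOW payload; field names mirror the schema above.
    struct StrumPacket {
      uint32_t ts_ms;              // capture timestamp, ms
      float ax, ay, az;            // linear acceleration, m/s^2
      float gx, gy, gz;            // angular velocity, rad/s
      bool  gate_active;           // Trill/FSR gate engaged
      float S;                     // normalized strumming-speed feature
    };

    const int   WINDOW    = 6;     // 50 ms of samples at 120 Hz
    const float OMEGA_MAX = 10.0f; // rad/s full-scale guess for normalization
    float win[WINDOW];             // ring buffer of squared |omega| values
    int   winIdx = 0;

    // Push one gyro sample; return S = RMS of |omega| over the window,
    // clamped into [0, 1].
    float updateS(float gx, float gy, float gz) {
      win[winIdx] = gx * gx + gy * gy + gz * gz;  // |omega|^2
      winIdx = (winIdx + 1) % WINDOW;
      float sum = 0.0f;
      for (int i = 0; i < WINDOW; i++) sum += win[i];
      float rms = sqrtf(sum / WINDOW);
      float s = rms / OMEGA_MAX;
      return s > 1.0f ? 1.0f : s;
    }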

 

10/25

This week, I focused on completing the integration of the IMU and capacitive touch sensor with the ESP32-S3 board to enable simultaneous data collection through a shared I²C bus. Both devices were connected to the same SDA and SCL lines with common 3.3 V and ground connections, and no external pull-up resistors were required because both modules already include internal ones.

After confirming that each component worked individually, I wrote and tested a unified Arduino program to acquire data from both sensors at different polling rates. I used several example Arduino programs and reference code from online sources to understand the correct initialization and data reporting methods for both the IMU and the Trill sensor. The final code initializes each device, sets their configurations, and runs two timed service loops: one reads quaternion and linear acceleration data from the IMU at 100 Hz, while the other samples the Trill capacitive touch sensor at 20 Hz.

The IMU output includes both the game rotation vector for orientation and linear acceleration values for motion, which are integrated in real time to estimate velocity. The program also determines the strumming direction by checking the sign of the Y-axis acceleration, allowing it to distinguish between upward and downward motions. Meanwhile, the Trill sensor reports touch position and size in Centroid mode to measure where and how strongly the surface is touched. When both sensors are active in mode 2, the ESP32 alternates read operations to confirm that the shared I²C bus can arbitrate properly without interference.

The integrated setup successfully produced synchronized motion and touch data, confirming stable communication and timing. For next steps, I plan to improve the accuracy of the velocity estimation from the IMU and test whether strumming gestures can be reliably detected across different orientations and directions to achieve more natural gesture-based playability.
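
The timing skeleton of that unified program looks roughly like the following; the sensor reads are shown as placeholder functions because the exact calls depend on the BNO08x and Trill library APIs:

    #include <Wire.h>

    const uint32_t IMU_PERIOD_MS   = 10;  // 100 Hz IMU service loop
    const uint32_t TRILL_PERIOD_MS = 50;  // 20 Hz Trill service loop
    uint32_t lastImuMs = 0, lastTrillMs = 0;

    void readImu()   { /* placeholder: quaternion + linear accel over I2C */ }
    void readTrill() { /* placeholder: touch centroid + size over I2C */ }

    void setup() {
      Serial.begin(115200);
      Wire.begin();                 // one shared I2C bus for both devices
    }

    void loop() {
      uint32_t now = millis();
      // The sensors share SDA/SCL, so reads are interleaved in time rather
      // than overlapped; only one I2C transaction is ever in flight.
      if (now - lastImuMs >= IMU_PERIOD_MS)     { lastImuMs = now;   readImu();   }
      if (now - lastTrillMs >= TRILL_PERIOD_MS) { lastTrillMs = now; readTrill(); }
    }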

 

11/1

This week, I focused on refining the motion-tracking pipeline by shifting from orientation-based speed estimation using the game rotation vector to direct acceleration-based computation. After a discussion with Professor Mukherjee, I decided to leverage the three-axis accelerometer built into the BNO085 to calculate the translational speed and direction of movement more accurately. This approach avoids cumulative drift from orientation integration and provides a clearer representation of linear motion. I began integrating this new method into the existing ESP32-S3 program, modifying the sensor configuration and data-processing routines to capture acceleration vectors along all three axes. I am currently implementing filtering and integration logic to derive real-time velocity estimates from these readings.

In parallel, the Force Sensitive Resistors (FSRs) arrived, and I began testing them using a simple voltage-divider circuit connected to the ESP32’s ADC input. Initial tests confirm that the FSRs respond proportionally to applied pressure, and I am characterizing their resistance range to determine an appropriate fixed resistor value for consistent sensitivity. I also verified that the ADC sampling and scaling routines are functional for future integration with the main control loop.
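
The test setup is just a divider into the ADC; here is a sketch of the characterization loop (the pin choice and the 10 kΩ fixed resistor are assumptions, with the FSR on the high side of the divider):

    const int   FSR_PIN = 4;        // ADC-capable pin (assumption)
    const float R_FIXED = 10000.0f; // fixed divider resistor, ohms (assumption)
    const float VCC     = 3.3f;

    void setup() {
      Serial.begin(115200);
      analogReadResolution(10);     // report on a 0-1023 scale
    }

    void loop() {
      int   raw  = analogRead(FSR_PIN);
      float vout = VCC * raw / 1023.0f;
      // FSR from 3.3 V to the ADC node, R_FIXED from the node to GND:
      // Vout = VCC * R_FIXED / (R_FSR + R_FIXED)  =>  solve for R_FSR.
      if (vout > 0.01f) {
        float r_fsr = R_FIXED * (VCC / vout - 1.0f);
        Serial.print(raw);
        Serial.print("  R_FSR(ohm) = ");
        Serial.println(r_fsr);
      }
      delay(100);
    }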

Finally, I tested the new Li-ion battery module that arrived this week. I confirmed that the connectors properly fit the ESP32-S3 power input and that the voltage and current ratings are within safe operating limits for the board and sensors. The system successfully powered up from the battery without instability, indicating readiness for portable operation.

 

11/8

This week, I completed the bring-up of the IMU and finalized the transition from acceleration-based velocity estimation to angular-velocity-based strum detection. The previous integration approach was too sensitive to drift, so using quaternion-derived rotational speed now provides a much more stable signal for detecting wrist-twist motions.
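
For reference, the rotational speed can be recovered from successive orientation quaternions; this is a sketch assuming unit quaternions sampled dt seconds apart:

    #include <math.h>

    struct Quat { float w, x, y, z; };

    // The w term of the relative rotation q2 * conj(q1) equals the dot
    // product of the two unit quaternions; 2*acos of it is the rotation angle.
    float angularSpeed(const Quat& q1, const Quat& q2, float dt) {
      float w = q1.w * q2.w + q1.x * q2.x + q1.y * q2.y + q1.z * q2.z;
      w = fabsf(w);                  // q and -q encode the same rotation
      if (w > 1.0f) w = 1.0f;        // guard acos against rounding error
      return 2.0f * acosf(w) / dt;   // angular speed in rad/s
    }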

I also implemented an acceleration spike detector that allows the system to detect a strum from either a quick twist or a natural strumming motion. These events are filtered, debounced, and timed, then sent as short BLE notifications, which are transmitted only when a strum occurs rather than streaming data continuously. This behavior is achieved with the characteristic's notify property, which fits our use case perfectly.
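
In outline, the notify path looks like this (a sketch using the ESP32 Arduino BLE library; the UUIDs, device name, and payload layout are placeholders):

    #include <BLEDevice.h>
    #include <BLE2902.h>

    // Placeholder UUIDs, borrowed from the stock ESP32 BLE example.
    #define SERVICE_UUID "4fafc201-1fb5-459e-8fcc-c5c9c331914b"
    #define CHAR_UUID    "beb5483e-36e1-4688-b7f5-ea07361b26a8"

    BLECharacteristic* strumChar;

    void setup() {
      BLEDevice::init("StrumNode");                       // placeholder name
      BLEServer*  server  = BLEDevice::createServer();
      BLEService* service = server->createService(SERVICE_UUID);
      strumChar = service->createCharacteristic(
          CHAR_UUID, BLECharacteristic::PROPERTY_NOTIFY); // notify-only
      strumChar->addDescriptor(new BLE2902());            // lets clients subscribe
      service->start();
      BLEDevice::startAdvertising();
    }

    // Called only when the detector fires; nothing is sent between strums.
    void sendStrum(uint8_t direction, float speed) {
      uint8_t payload[1 + sizeof(float)];
      payload[0] = direction;                             // up or down
      memcpy(&payload[1], &speed, sizeof(float));
      strumChar->setValue(payload, sizeof(payload));
      strumChar->notify();
    }

    void loop() {}                                        // detection loop omitted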

After testing with the School of Music, we found that the system was not sensitive enough, missing lighter strums. In some cases, the serial monitor showed that packets were being sent but no sound was produced, suggesting timing or threshold issues between BLE transmission and sound playback. Very sharp motions could also generate multiple strum events even though the user performed only one strum. We need to mitigate these issues before the demo.

On top of refining the thresholds, I will combine the FSR with the IMU, which will more or less complete the initial MVP as described in our design report.

 

11/15

This week presented unexpected hardware challenges right before our demo. On Sunday night, both the IMU and our primary ESP32-S3 board burned out during testing. Since we didn’t have extra IMUs on hand, I quickly transitioned to our backup ESP board and resoldered all required connections to restore basic system functionality.

Because a replacement IMU was unavailable in time for the demo, I implemented an alternative sensing method using a ball tilt sensor. I mapped its digital output changes to our strum-detection logic, allowing us to register strums reliably enough for the live demonstration. In parallel, I fully integrated the FSR into the input pipeline, tuning its threshold to be sensitive for users with limited hand strength or mobility. With these adjustments, we were able to achieve essentially full system functionality for demo day.
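
The fallback amounted to a debounced edge detector on the tilt switch's digital output (the pin number and debounce interval here are assumptions):

    const int      TILT_PIN    = 5;   // digital input from the ball tilt sensor
    const uint32_t DEBOUNCE_MS = 30;  // ignore chatter as the ball settles
    int      lastState;
    uint32_t lastEdgeMs = 0;

    void setup() {
      Serial.begin(115200);
      pinMode(TILT_PIN, INPUT_PULLUP);
      lastState = digitalRead(TILT_PIN);
    }

    void loop() {
      int state = digitalRead(TILT_PIN);
      uint32_t now = millis();
      // Any debounced change of the ball switch's state counts as one strum.
      if (state != lastState && now - lastEdgeMs > DEBOUNCE_MS) {
        lastEdgeMs = now;
        lastState  = state;
        Serial.println("strum");  // in the real build: feed strum-detection path
      }
    }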

Our second batch of IMUs arrived on Friday. Over the weekend, I will resume IMU integration, focusing on reintroducing angular-velocity-based strum detection and enabling velocity calculations to measure time between strums more accurately. This will bring us back on track toward the original sensing architecture described in the design report.

 

11/22

This week I focused on stabilizing the sensing pipeline and preparing the new hardware for full reintegration. After our replacement parts arrived, I completed the soldering and setup of the new ESP32-S3.

A major portion of my time went into refining the strum detection logic. The previous implementation occasionally produced double notifications during rapid motion changes, especially when both angular and linear acceleration spiked within a short window. I updated the detection algorithm to suppress duplicate triggers by validating that only one strum event is generated per acceleration transition, regardless of whether the trigger originated from angular or linear components. This has noticeably improved reliability during faster playing motions.
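
Conceptually, the suppression rule is a single refractory window shared by both trigger sources; a sketch with placeholder thresholds:

    const uint32_t REFRACTORY_MS = 120;  // max one strum per transition (guess)
    const float    GYRO_THRESH   = 3.0f; // rad/s, placeholder
    const float    ACCEL_THRESH  = 6.0f; // m/s^2, placeholder
    uint32_t lastStrumMs = 0;

    // Returns true at most once per window, no matter which component spiked.
    bool strumEvent(float gyroMag, float accelMag, uint32_t nowMs) {
      bool spike = (gyroMag > GYRO_THRESH) || (accelMag > ACCEL_THRESH);
      if (spike && nowMs - lastStrumMs >= REFRACTORY_MS) {
        lastStrumMs = nowMs;   // angular and linear spikes share one timer,
        return true;           // so both firing together emits a single event
      }
      return false;
    }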

I also began working on computing overall strum velocity more accurately. Drift remains an issue when integrating acceleration over time, but I am experimenting with a method to zero out accumulated drift after each detected strum. The goal is to ensure that each strum’s velocity is calculated relative to a consistent baseline, making the drift predictable and therefore more manageable.
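
A sketch of the zeroing idea (dt handling and filtering are simplified away):

    #include <math.h>

    float vx = 0, vy = 0, vz = 0;   // integrated velocity, m/s

    // Integrate linear acceleration every sample; error accumulates as drift.
    void integrate(float ax, float ay, float az, float dt) {
      vx += ax * dt;  vy += ay * dt;  vz += az * dt;
    }

    // On each detected strum, record the speed and zero the accumulator so
    // the next strum starts from a consistent baseline and drift stays
    // bounded to a single inter-strum interval.
    float onStrum() {
      float speed = sqrtf(vx * vx + vy * vy + vz * vz);
      vx = vy = vz = 0.0f;
      return speed;
    }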

Next week, I will focus on integrating velocity directly into the strum-detection algorithm. Instead of relying solely on acceleration-based triggers, the updated pipeline will detect strums based on velocity thresholds, which should provide smoother, more consistent behavior during both slow and fast playing motions. I will also add support for transmitting the computed velocity to the connected CPU, enabling it to calculate time between strums more accurately and modulate sound characteristics based on strum speed.

New Tools

Throughout this project, I had to learn a wide range of new hardware and embedded software concepts. One of the most important skills I developed was becoming comfortable reading and interpreting technical documentation. Because our system involved the ESP32-S3, the BNO08x IMU, the Trill Flex sensor, and the FSR, I spent a significant amount of time working through each device's datasheet, application notes, and protocol descriptions. Reading documentation became one of my main learning strategies, since it allowed me to understand the behavior and limitations of each component at a detailed level.

Another key strategy was using the built-in example projects that come with Arduino libraries. Whenever I installed a new library, I reviewed its example sketches and used them to understand proper initialization, data acquisition, and common usage patterns. These examples helped me rapidly prototype ideas because I could start from working code and adapt it to my needs. I also learned a lot by studying open-source GitHub repositories that used similar sensors.

In parallel, the embedded systems course I am taking at the same time reinforced much of what I needed for the project. Concepts such as interrupts, I²C communication, debouncing, timing constraints, and microcontroller architecture were directly applicable. A major learning strategy for me was pairing what I learned in class with hands-on experimentation on the actual hardware, which helped solidify each concept.

Finally, I relied heavily on iterative testing and debugging. For issues related to drift, threshold tuning, or timing, I used small experimental changes, logged sensor output, analyzed unexpected behavior, and refined the algorithm step by step. This process helped me build an intuitive understanding of how each sensor behaved in realistic conditions. Overall, my learning came from a combination of thorough documentation reading, hands-on trial and error, example-based exploration, and direct application of course material.

 

12/6

This week I focused primarily on preparing for our final presentation and completing verification and validation testing with my team. A major portion of my effort went into designing the procedure for quantifying the force needed on the FSR for a strum to be registered reliably. After initial calibration showed that an analog reading of 400/1023 offered the best balance between sensitivity and avoiding accidental triggers, I devised a method to translate this threshold into a physical force. By placing the FSR on a flat surface and incrementally adding known-weight objects, ensuring full surface contact, I observed that the sensor reached a reading of 400 when two AA batteries were placed on it. This corresponded to an activation force of roughly 0.47 N, comfortably below our maximum allowable 2 N threshold. I also assisted with other validation tests, including measuring strum accuracy across various BPM levels.
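
For reference, the arithmetic behind that figure, assuming the typical mass of an alkaline AA cell (about 24 g each):

    F = m · g ≈ (2 × 0.024 kg) × 9.81 m/s² ≈ 0.47 N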

In parallel with testing, I worked on aesthetic and hardware refinements in preparation for the final demo. One major issue we encountered involved the Li-ion battery rapidly heating the system. After examining the connections, I discovered that the leads exiting the battery’s JST connector were reversed relative to what our board expected. To resolve this, I desoldered and rewired the battery connector so that the black lead mapped to power and the red lead mapped to ground, eliminating the overheating issue. I also helped streamline the physical layout by hiding the breadboard and securing internal wiring so users cannot tamper with or accidentally disconnect components during use.

Finally, I continued refining the velocity pipeline, focusing on reducing drift and ensuring smoother velocity data transmission. These adjustments help stabilize the time-between-strums feature and prepare the system for reliable performance during the final demo.