Team status report for 9/27/25

We narrowed down our use case and requirements to be more specific and, after a few discussions with Jocelyn, pivoted to CV-based chord selection on one hand. On Wednesday, we joined the collaboration class with the School of Music (Engineering Creative Interactions: Music + AI, course number MUS 57584 A2), where we met with John Cohn, Jocelyn, School of Music students, and Music and Technology students. After another brainstorming session, we cemented our MVP and made plans for ongoing collaboration with the other members of the class.

Overall, along with working on the design review presentation and report, we decided on a parts list for our MVP and looked into a few example repos for hand detection for our CV component.

Alexa’s status report for 9/27/25

This week I did research into the implementation of our project. I read documentation and watched YouTube tutorials on MediaPipe Hands, an open-source library for identifying the locations of landmark points on hands. I also researched ESP32 modules and IMUs. Since the ESP32-S3 has built-in I2C support and the 6-axis IMU we are interested in uses I2C, we hope integration will be easier. I also looked into capacitive touch sensors, both boards with built-in sensing strips and bare controller boards that can be connected to our own conductors. Overall, I am building a good knowledge base on what parts we need for implementation so we can order parts ASAP.
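To make the CV direction concrete, here is a minimal sketch of pulling hand landmarks from a webcam with the MediaPipe Hands Python solutions API. The camera index, confidence threshold, and the choice to print the index fingertip are illustrative placeholders, not decided design parameters.

    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands

    cap = cv2.VideoCapture(0)  # default webcam; index is a placeholder
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures frames as BGR
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                # 21 landmarks per detected hand, with normalized x/y coordinates
                tip = results.multi_hand_landmarks[0].landmark[
                    mp_hands.HandLandmark.INDEX_FINGER_TIP]
                print(f"index fingertip at ({tip.x:.2f}, {tip.y:.2f})")
            cv2.imshow("hand tracking", frame)
            if cv2.waitKey(1) & 0xFF == 27:  # press Esc to stop
                break
    cap.release()
    cv2.destroyAllWindows()

Something like this should let us verify landmark quality and frame rate on our own hardware before we commit to a chord-selection scheme built on top of it.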

When we met as a group for the collaboration class with the School of Music, I was designated as the task allocator responsible for getting the other students in the class involved with our group. This will mostly involve future communication about testing and advice on implementation.

Team status report for 9/20/25

This week, we made progress on both the design and planning aspects of the project. We refined our proposal presentation and presented it on Monday. We also held two meetings with Jocelyn over the past two weeks to validate our direction and explore feasibility considerations.

On the sensing side, we are evaluating multiple approaches for detecting hand and finger placement to trigger chords or notes. Current options include capacitive sensing, capacitive touch sensors, CV tracking, and pressure sensors. The selection criteria we are prioritizing are range of expressiveness and ease of use for people who can't play traditional acoustic instruments.

From a hardware perspective, we revised our implementation plan: instead of designing a custom PCB at this stage, we will use a general-purpose development board (STM Nucleo). This will accelerate prototyping, give us flexibility to iterate quickly on hardware configurations, and enable faster testing of sensing methods without committing to a fixed circuit design.

Our overarching goal remains to build a minimal, testable MVP that validates the sensing approach and instrument interaction before moving on to more advanced hardware iterations.

Alexa’s status report for 9/20/25

This week we completed the proposal presentation, and I presented it on Monday. We have been finalizing the design of our instrument through communication with Jocelyn from the School of Music. I researched how the theremin works, using antennas that sense capacitance, and I have ideas for chord or frequency selection based on the capacitance of finger arrangements over one hand. However, we are still working with people from the School of Music to gauge how this would work as an instrument.
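As a rough illustration of the finger-arrangement idea, the sketch below maps a four-finger touch pattern (read from capacitive pads, or later from CV landmarks) to a chord name. The specific chords, the binary encoding, and the function names are placeholders for discussion, not a decided design.

    # Hypothetical mapping from a finger-touch pattern to a chord.
    # Each key is the (index, middle, ring, pinky) touch state on one hand.
    CHORD_TABLE = {
        (1, 0, 0, 0): "C major",
        (0, 1, 0, 0): "F major",
        (0, 0, 1, 0): "G major",
        (1, 1, 0, 0): "A minor",
    }

    def select_chord(touches, default="no chord"):
        """Return the chord name for the current finger arrangement."""
        return CHORD_TABLE.get(tuple(int(bool(t)) for t in touches), default)

    # Example: index + middle fingers down -> "A minor"
    print(select_chord((1, 1, 0, 0)))

A lookup table like this is easy to retune with feedback from the School of Music students, which is part of why we want their input before fixing the mapping.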