Team Status Report for 10/25/25

This week, we’re making solid progress on our project and staying on schedule, thanks to everyone’s individual contributions. In preparation for Prof. Dueck’s class on Wednesday, we had several parts of our project ready to present and test. The main part was the hand gesture algorithm developed by Lucy. Here is a video of the hand gesture algorithm. We also showcased our designs for the right-hand motion strumming glove.

After testing the hand gesture computer vision component, we received positive feedback regarding its ease of learning, accessibility for users with varying levels of mobility, and low latency. Another key activity this week was recording voice samples from the vocalists, which will be incorporated into the instrument’s sound output. Additionally, the music majors shared ideas on improving the naturalness of chord progressions. We believe this addition can be implemented purely in software.


Alexa’s Status Report for 10/25/25

The objective of this phase was to implement direct wireless communication between the ESP32 microcontroller and a Python-based data processing program.

Unlike standard Wi-Fi (TCP/UDP), this setup uses ESP-NOW, a connectionless communication protocol developed by Espressif. ESP-NOW allows low-latency, peer-to-peer data exchange between ESP32 devices without requiring an access point (router).
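Since ESP-NOW frames terminate at an ESP32, a common pattern is to have one ESP32 act as a receiver that forwards each payload to the Python program over USB serial. The sketch below shows how the Python side might unpack such a payload; the packet layout here (a gesture ID plus three accelerometer axes) is a hypothetical example, not our finalized format.

```python
import struct

# Hypothetical ESP-NOW payload layout (an assumption, not our final format):
# uint8 gesture_id, then three little-endian int16 accelerometer axes.
PACKET_FORMAT = "<Bhhh"
PACKET_SIZE = struct.calcsize(PACKET_FORMAT)  # 7 bytes

def parse_packet(raw: bytes) -> dict:
    """Unpack one forwarded ESP-NOW payload into a readable dict."""
    if len(raw) != PACKET_SIZE:
        raise ValueError(f"expected {PACKET_SIZE} bytes, got {len(raw)}")
    gesture_id, ax, ay, az = struct.unpack(PACKET_FORMAT, raw)
    return {"gesture": gesture_id, "accel": (ax, ay, az)}

# Example: gesture 3 with a small acceleration reading.
sample = struct.pack(PACKET_FORMAT, 3, 100, -50, 1024)
print(parse_packet(sample))
```

A fixed-size binary layout like this keeps per-packet overhead low, which matters for the low-latency goal that motivated choosing ESP-NOW in the first place.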

Alexa’s Status Report for 10/18/25

After getting feedback that our design needed more implementation details and stronger technical support, I spent some time researching Wi-Fi communication protocols for the ESP32 and ended up choosing ESP-NOW for our communication layer.

I also built an early version of our left-hand detection system using MediaPipe, turning it into a small Python app that could detect which finger is bent on one hand. While testing it, we realized that the one-finger chord-playing idea was a bit too tricky in practice, so we pivoted to a simpler, and perhaps more intuitive, approach for musicians: using hand gesture numbers to match chord numbers within a key.
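The gesture-number-to-chord mapping can be sketched in a few lines of Python. This is an illustrative sketch assuming major keys only, with note names spelled using sharps; each gesture number 1-7 selects the diatonic triad built on that scale degree:

```python
# Map a hand-gesture number (1-7) to the diatonic triad of a major key.
# Illustrative sketch only: sharps-only spelling, no enharmonic handling.

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR_SCALE_STEPS = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of the 7 degrees

def diatonic_triad(key_root: str, degree: int) -> list[str]:
    """Return the three note names of the triad on scale degree 1-7."""
    root_idx = NOTE_NAMES.index(key_root)
    notes = []
    for offset in (0, 2, 4):  # stack thirds within the scale, wrapping past 7
        step = MAJOR_SCALE_STEPS[(degree - 1 + offset) % 7]
        notes.append(NOTE_NAMES[(root_idx + step) % 12])
    return notes

print(diatonic_triad("C", 1))  # ['C', 'E', 'G']  -> the I chord
print(diatonic_triad("C", 5))  # ['G', 'B', 'D']  -> the V chord
```

Because the mapping is keyed by scale degree rather than absolute pitch, the same gesture vocabulary works in any key once the key root is set.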

Alexa’s Status Report for 10/4/2025

This week, along with completing our design presentation, I focused on ordering hardware and on software integration for our project. On the hardware side, I placed orders for key components including the ESP32 microcontroller, an IMU (Inertial Measurement Unit), and capacitive touch sensors. After more research, I also identified and ordered a secondary back-up IMU that meets our project requirements and has extensive documentation and usage resources online, which should help speed up integration and troubleshooting.

On the software side, I successfully integrated the MediaPipe hand tracking program with our frontend HTML interface. The system can now detect a left hand in real time through the web application, providing a foundation for finger chord selection. The next step will be to identify which specific finger is being selected. Once finger-level detection is implemented, we can start playing different notes of a chord.
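MediaPipe Hands provides 21 landmarks per hand, with fingertips for the index through pinky fingers at landmark indices 8, 12, 16, and 20, and their PIP joints at 6, 10, 14, and 18. A minimal sketch of the finger-level check, assuming normalized image coordinates (where y grows downward) and a roughly upright hand, could look like this:

```python
# MediaPipe Hands emits 21 landmarks per hand. Fingertips for index..pinky are
# landmarks 8, 12, 16, 20; their PIP joints are 6, 10, 14, 18. The thumb is
# omitted here because its extension check works differently (x-axis based).
FINGERS = {"index": (8, 6), "middle": (12, 10), "ring": (16, 14), "pinky": (20, 18)}

def extended_fingers(landmarks: list[tuple[float, float]]) -> list[str]:
    """Names of fingers whose tip sits above its PIP joint (y grows downward)."""
    return [name for name, (tip, pip) in FINGERS.items()
            if landmarks[tip][1] < landmarks[pip][1]]

def gesture_number(landmarks: list[tuple[float, float]]) -> int:
    """Count of extended fingers, usable as a simple 0-4 gesture number."""
    return len(extended_fingers(landmarks))
```

The tip-above-PIP heuristic is a common simplification; a robust version would account for hand rotation, but this is enough to prototype finger selection.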

Team status report for 9/27/25

We narrowed down our use case and requirements to be more specific, and pivoted to CV for chord selection on one hand after a few discussions with Jocelyn. On Wednesday, we joined the collaboration class with the school of music (Engineering Creative Interactions: Music + AI, course number MUS 57584 A2), where we met with John Cohn, Jocelyn, school of music students, and Music and Technology students. After another brainstorming session, we cemented our MVP and created plans for future collaboration with the other members of the class.

Overall, along with working on the design review presentation and report, we also decided on a parts list for working toward our MVP, and looked into a few example repos for hand detection for our CV component.

Alexa’s status report for 9/27/25

This week I did research into the implementation of our project. I read documentation and watched YouTube tutorials on “MediaPipe Hands”, an open source library for identifying the locations of keypoints on hands. I also researched ESP32 modules and IMUs. Since the ESP32-S3 Plus has built-in I2C support and the 6-axis IMU we are interested in uses I2C, we hope integration will be easier. I also looked into capacitive touch sensors, both sensors with built-in strips and standalone controller boards that can be connected to conductors. Overall, I am building a good knowledge base on the parts we need for implementation and ordering parts ASAP.
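Once the IMU is on the I2C bus, a burst read of the accelerometer registers typically returns each axis as a signed 16-bit big-endian pair. The decoding step can be sketched in Python as below; the byte layout and sensitivity value are assumptions modeled on common 6-axis parts, and the actual datasheet is authoritative.

```python
import struct

def decode_imu_burst(raw: bytes) -> dict:
    """Decode a 6-byte accelerometer burst read into g-units per axis.

    Assumes (check the datasheet!) three big-endian signed 16-bit values
    in X, Y, Z order, at a +/-2 g full-scale setting.
    """
    ax, ay, az = struct.unpack(">hhh", raw)
    SENSITIVITY = 16384.0  # LSB per g at +/-2 g; typical, not universal
    return {"x_g": ax / SENSITIVITY, "y_g": ay / SENSITIVITY, "z_g": az / SENSITIVITY}

# Example: a device at rest reporting ~1 g on the Z axis.
sample = struct.pack(">hhh", 0, 0, 16384)
print(decode_imu_burst(sample))  # {'x_g': 0.0, 'y_g': 0.0, 'z_g': 1.0}
```

Keeping the decode step isolated like this means the rest of the pipeline is unaffected if we swap to the back-up IMU with a different register layout.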

When we met as a group for the collaboration class with the school of music, I was designated as the task-allocator for getting the other students in this class involved with the group. This will be mostly for future communication about testing and advice on implementation.

Team status report for 9/20/25

This week, we made progress on both the design and planning aspects of the project. We refined our proposal presentation and presented on Monday. We also held two meetings with Jocelyn over the past two weeks to validate our direction and explore feasibility considerations.

On the sensing side, we are evaluating multiple approaches for detecting hand and finger placement to trigger chords or notes. Current options include capacitive sensing, capacitive touch sensors, CV tracking, and pressure sensors. The selection criteria we are prioritizing are range of expressiveness and ease of use for people who can’t play traditional acoustic instruments.

From a hardware perspective, we revised our implementation plan: instead of designing a custom PCB at this stage, we will use a general-purpose development board (STM Nucleo). This will accelerate prototyping, give us flexibility to iterate quickly on hardware configurations, and enable faster testing of sensing methods without committing to a fixed circuit design.

Our overarching goal remains to build a minimal, testable MVP that validates the sensing approach and instrument interaction before moving on to more advanced hardware iterations.

Alexa’s Status Report for 9/20/25

This week we completed the proposal presentation, and I presented on Monday. We’ve been finalizing the design of our instrument through communication with Jocelyn from the school of music. I researched how the theremin works, using antennas that sense capacitance, and have ideas for chord or frequency selection using the capacitance of finger arrangements over one hand. However, I am still working with people from the school of music to gauge how this would work as an instrument.