Team Status Report for 12/7

In the past week our team made significant progress towards completing DrumLite. The one issue we are still encountering, which at this point is the only real threat to the success of our project, is laggy performance when using two drumsticks simultaneously. Elliot has written and flashed new firmware to the ESP32s, which we hope will resolve the issue. We plan on testing and adjusting our implementation this weekend in preparation for the TechSpark Engineering Expo on Wednesday. Our progress is on schedule, and in the coming days we aim to complete our entire integrated system, construct our poster, and start the demo video so as to allow sufficient time around demo day to prepare and write the final report. Additionally, we will conduct further tests highlighting the tradeoffs we made for our system, which will be included in our poster and final report.

The unit and system tests we’ve conducted so far are as follows:
1.) Ensure that the correct sound is triggered 90% of the time. This test was conducted by running our full system (CV, pad location, BLE, and audio playback) and performing hits on the 4 drum pads sequentially in the order 1, 2, 3, 4. For each impact, the expected audio and the actual audio were recorded. Of the 100 samples taken, 89 impacts triggered the correct sound. While this is fairly close to our 90% goal, it led us to realize that we needed a tool allowing the user to select a specific HSV color range to accommodate varying ambient lighting. Our new design prompts the user to tune the HSV range at the start of each session using HSV sliders and a live video feed of the color mask being applied.

2.) Verify that the overall system latency is below 100ms. This test was conducted by recording a video of a series of hits, then analyzing the video and audio recordings to determine when each impact occurred and when the sound was played. The difference between these two timestamps was recorded for each impact, and the average was found to be 94ms. This average is, however, inflated by one outlier with a latency of 360ms. Through further testing we realized that the outlier was a result of our BLE module queueing data when the system was overloaded, triggering an audio event only when the impact was finally read from the queue. Elliot has therefore begun implementing new firmware for the microcontrollers that reduces the connection interval and prevents the queuing of events in most cases.
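The real fix lives in the ESP32 firmware, but the same guard can be illustrated host-side: drop impacts that sat in the queue too long instead of playing them late. This is a sketch with hypothetical names and an illustrative 150ms staleness threshold, not our actual implementation:

```python
from collections import deque

STALE_MS = 150  # illustrative cutoff; stale queued impacts caused the 360ms outlier

def drain_fresh(event_queue, now_ms):
    """Pop queued (timestamp, payload) impact events, discarding stale ones."""
    fresh = []
    while event_queue:
        ts, payload = event_queue.popleft()
        if now_ms - ts <= STALE_MS:
            fresh.append(payload)
    return fresh

q = deque([(0, "hit1"), (900, "hit2")])
fresh = drain_fresh(q, now_ms=1000)  # "hit1" is 1000ms old and gets dropped
```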

3.) Verify that the latency of the CV module is less than 60ms per frame. This test was conducted by timing each iteration of our CV loop, each of which processes a single input frame. The latencies were plotted and averaged, resulting in a mean of 33.2ms per frame.
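The timing harness amounts to wrapping each loop iteration in a high-resolution timer; a minimal sketch (the lambda stands in for the real per-frame CV processing):

```python
import time

def mean_frame_latency_ms(process_frame, frames):
    """Time each CV-loop iteration and return the mean latency in milliseconds."""
    latencies = []
    for frame in frames:
        t0 = time.perf_counter()
        process_frame(frame)
        latencies.append((time.perf_counter() - t0) * 1000.0)
    return sum(latencies) / len(latencies)

# Illustrative stand-in workload; the real loop runs the color mask + tracking.
mean_ms = mean_frame_latency_ms(lambda f: sum(f), [list(range(10_000))] * 20)
```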

4.) Verify BLE reliability within a 3m range of the user’s laptop. This test was conducted by recording accelerometer data for 20 impacts per drumstick at a distance of 3m from the laptop. By visually inspecting the graph of the accelerometer data, we were able to see that each of the 20 impacts was clearly identifiable and no data was lost.

5.) Verify that accelerometer read + data processing delay is sub 5ms. To conduct this test, we placed two timers in our code: one just before the accelerometer data was read, and the other just before an audio event was triggered. About 100 readings were recorded, plotted, and averaged, resulting in an average delay of 5ms per impact.

6.) Ensure less than 2% BLE packet loss. This test was conducted by creating counters that track packets sent and packets received for each of the two drumsticks. By verifying that, after execution, both counters match (or are within 2% of each other), we were able to ascertain a packet loss rate of ~2%.
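With the sent/received counters in hand, the loss rate is a one-line computation; the counts below are illustrative, not our measured values:

```python
def packet_loss_rate(sent, received):
    """Fraction of packets lost for one drumstick's counter pair."""
    return 0.0 if sent == 0 else (sent - received) / sent

loss = packet_loss_rate(1000, 980)  # 2% loss for this illustrative pair
```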

7.) Verify BLE round trip latency below 30ms. To conduct this test we started a timer on the host laptop, sent a notification to the ESP32s, and measured the time at which the response was received back at the host. By averaging these recordings, we determined an average RTT of 17.4ms, meaning the one-way latency is about 8.7ms.
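The RTT-to-one-way conversion is just halving the mean; a sketch with illustrative samples centered on our measured 17.4ms:

```python
def one_way_latency_ms(rtt_samples_ms):
    """Estimate one-way latency as half the mean round-trip time."""
    return sum(rtt_samples_ms) / len(rtt_samples_ms) / 2.0

# Hypothetical per-trial RTT measurements in milliseconds.
est = one_way_latency_ms([17.0, 17.4, 17.8])
```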

8.) Weight requirements (drumsticks combined less than 190.4g and connective components below 14.45g). To verify these requirements we simply weighed the connective components (i.e. wires and athletic tape) as well as the entire drumsticks. The two drumsticks weighed a collective 147g and the connective components weighed 5.1g, both well below our requirements.

9.) Size requirements (minimum achievable layout area of 1644cm^2). To verify this, we configured the 4 drum pads in their minimum layout (all tangent to one another) and measured the maximum height and width of the layout. As expected, the layout measured 1644cm^2.

Overall we are making good progress, are on schedule, and hope to complete our project this weekend so we can take our time in preparation for the Expo on Wednesday and get a head start on some of the final project submissions in the coming week.

Team Status Report for 11/30

This week, our team mainly focused on implementing several validation and verification tests for our project, as well as finalizing our slides for next week’s Final Presentation.

Validation Tests

  • Drumstick Weight: Both drumsticks weigh 147g, well under the 190.4g limit.
  • Minimum Layout: The drum setup achieved a layout area of 1638.3cm², close to (but still under) the required 1644cm².
  • Drum Ring Detection: A 15mm margin was implemented to reduce overlapping issues, successfully scaling drum pad radii.
  • Reliable BLE Connection: At a 3m distance, all impacts were detected with no packet loss.
  • Correct Sound Playback: The system achieved an 89% accuracy for correct drum sound playback. This slightly missed the 90% target and will thus require refinement.
  • Audio Response: Average latency between drum impact and sound playback was 94.61ms, meeting the 100ms limit (despite a notable outlier).

Verification Tests

  • Connective Components: The drumsticks’ wires and tape weighed only 5g per drumstick, far below the 14.45g limit.
  • Latency: Measured latencies include:
    • BLE transmission: 8.7ms (RTT/2) – under 30ms requirement
    • CV processing: 33.2ms per frame – under 60ms requirement
    • Accelerometer processing: 5ms – meeting the 5ms requirement

We were happy with the majority of these results, as they prove that we were indeed able to meet the constraints that we initially placed on ourselves for this project.

We are still facing a few challenges, however: we have found that when both drumsticks are connected via Bluetooth, one often experiences noticeably higher latency than the other. The root cause is unclear and under investigation, so resolving this issue is our next major priority.

Nonetheless, we have incorporated the results of these tests into our presentation slides and will continue working to resolve the Bluetooth latency issue.

Elliot’s Status Report for 11/16

This week I mainly worked towards incorporating the second BLE drumstick into the system controller and fine-tuning our accelerometer thresholds upon integration with the rest of the code. After wiring the second ESP and accelerometer, I used serial output to find the new MAC address, and I split the laptop’s connection functionality across separate threads for each of the boards. Once I was able to reliably connect to both of the sticks, I met with my group to test the new configuration. At that point, Ben had already spent some time deriving the required behavior for our MPU6050 output waveforms, but we hadn’t yet settled on an appropriate method for distinguishing hits on the table from swinging drumsticks in the air. After testing across various surfaces and static thresholds, I noticed that taking the magnitude of our data readings was not sufficient for identifying hit patterns: the sequence of high-magnitude accelerometer output followed by low-magnitude output was too general to characterize a rebound off the playing surface, and instead triggered a high frequency of false readings while idle. I modified the notification handlers in my Bluetooth code to attach the sign of the z-axis to our incoming data, thereby allowing us to identify upward and downward swings independently, and using the graph from our prior testing I was able to set up a sequence for detecting valid spikes. By polling for a sufficiently negative reading, blocking with a delay, and then taking a subsequent reading, we were able to correctly identify downward hits given the two-point difference. One issue we found, however, was that while the two sticks operated well separately, one of them would suffer from a noticeable output delay when playing simultaneously. An important point to consider is that for any run, whichever ESP made the connection to the laptop second was the one that experienced the performance drop.
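The two-point hit check described above can be sketched as follows; the threshold and required difference are illustrative placeholders, not our tuned values:

```python
NEG_THRESHOLD = -18.0     # hypothetical signed z-axis spike threshold
MIN_REBOUND_DIFF = 25.0   # hypothetical required two-point difference

def is_valid_hit(first_z, second_z):
    """A downward hit: a sufficiently negative signed z reading, followed by
    a rebound reading taken after a short blocking delay."""
    return first_z <= NEG_THRESHOLD and (second_z - first_z) >= MIN_REBOUND_DIFF

hit = is_valid_hit(-30.0, 5.0)   # sharp downward spike, then rebound
idle = is_valid_hit(-5.0, 2.0)   # too shallow to be a hit
```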
This problem could be a result of the Bluetooth connection, or it could be a flaw in our threading architecture; I am hesitant to blame our BLE connection interval simply because both microcontrollers run the same firmware, but I plan to resolve this issue within the next few days. Overall, I believe my progress and our team’s progress are on track, and this upcoming week I plan to meet with the team to flesh out the system’s HSV/lighting concerns as well as continue to test the CV module extensively.

Team Status Report for 11/16

This week our team focused on implementing the second Bluetooth drumstick as well as a reliable hit detection mechanism. This led to a few changes in our system controller: first, we refactored the callback functions in the BLE module to more accurately characterize the accelerometer readings, and we also modified the ESP threads to measure hits based on changes in acceleration rather than a static threshold. One risk to the system we are currently facing is the increased latency of the drumsticks when operating simultaneously; our mitigation plan is to explore other threading libraries such as multiprocessing, or to research additional functionality in the asyncio module to better handle concurrent execution. In regards to the lighting concerns brought up in last week’s status report, we are in the process of testing with additional overhead lights, which appears to be an effective mitigation strategy for ensuring consistent output.
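Our current setup runs one thread per ESP32; the asyncio alternative we are evaluating would run both notification streams under a single event loop. A simulated sketch of the pattern (no real BLE here; in practice each listener would await bleak notification callbacks):

```python
import asyncio

async def stick_listener(name, events, out):
    """Consume one drumstick's (simulated) notification stream."""
    for e in events:
        await asyncio.sleep(0)  # yield to the loop, as awaiting a real notify would
        out.append((name, e))

async def main():
    out = []
    # Both listeners run concurrently under one event loop.
    await asyncio.gather(
        stick_listener("left", [1, 2], out),
        stick_listener("right", [3], out),
    )
    return out

received = asyncio.run(main())
```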

Our team has run a number of tests thus far in the development process: while much of our system testing is qualitative (measuring audio buffer latency being impractical with timestamps), some of our formal testing includes round-trip time for the BLE, accelerometer waveform generation with the MPU6050s, and CV frame-by-frame delay measurements. Additionally, as we move into a fully functional deliverable this week, we plan to conduct an end-to-end latency measurement between the accelerometer spike and the sound playback, and we will also validate our 30mm use case requirement by ensuring that the perimeters of each of our drum pads remain sensitive to input, even across varying environments, lighting, and drum configurations.

Figure 1. MPU 6050 hit sequence (prior to z-axis signed magnitude changes)

The results of our tests will be compared directly to the use case and design requirements we set out to fulfill, and our goal of diversity and inclusion will be achieved through rigorous testing to reach a comfortable user interface. We do not have new changes in our schedule to report, and we intend to approach a viable product in the coming weeks.

Team Status Report for 11/9

This week we made big strides towards the completion of our project. We incrementally combined the various components each of us had built into one unified system that operates as defined in our use case for one drumstick. Essentially, we have a system where, when the drumstick hits a given pad, it triggers an impact event and plays the sound corresponding to that drum pad with very low latency. Note, however, that this is currently only implemented for one drumstick, not both; that will be the coming week’s goal. The biggest risk we identified, which we had not anticipated, was how much variation in lighting affects the ability of the object tracking module to identify the red-colored drumstick tip. By trying out different light intensities (no light, overhead beam light, phone lights, etc.) we determined that without consistent lighting the system would not operate. During our testing, every time the light changed, we would have to capture an image of the drumstick tip, find its corresponding HSV value, and update the filter in our code before actually trying the system out. If we are unable to find a way to provide consistent lighting given any amount of ambient lighting, this will severely impact how usable this project is. The current plan is to purchase two very bright clip-on lamps that can be oriented and positioned to equally distribute light over all 4 drum rings. If this doesn’t work, our backup plan is to line each drum pad with LED strips so each has consistent light regardless of its position relative to the camera. The backup is less favorable because it would require that either batteries be attached to each drum pad, or that each drum pad be close enough to an outlet to be plugged in, which deteriorates the portability and versatility goals defined in our use case.

The second risk we identified was the potential for packet interference when transmitting from two ESP32s simultaneously. There is a chance that when we try to use two drumsticks, both transmitting accelerometer data simultaneously, the transmissions will interfere with one another, resulting in packet loss. The backup plan for this is to switch to Wi-Fi, but this would require serious overhead work to implement. Our hope is that since impacts from two drumsticks usually occur sequentially, the two shouldn’t interfere, but we’ll have to see what the actual operation is like this week to be sure.

The following are some basic changes to the design of DrumLite we made this week:
1.) We are no longer using rubber rings and are instead using circular rubber pads. The reason is as follows: when we detect the drum pads’ locations and radii using rings, there are two circles that could potentially be detected, the outer circle and the inner circle. Since there is no good way to tell the system which one to choose, we decided to switch to a solid drum pad, where only one circle can ever be detected. Additionally, this makes choosing a threshold acceleration much easier since the surface being hit is now uniform. This system works very well and the detection output is shown below:

2.) We decided to continually record the predicted drum ring that the drumstick is in throughout the playing session. This way, when an impact occurs, we don’t have to do any CV at that moment and can instead just apply the exponential weighting to the predicted drum rings to determine which pad was hit.
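A minimal sketch of that exponential weighting over the stored per-frame ring predictions (α = 0.8 as in our design requirements; the prediction list is illustrative):

```python
def weighted_ring(predictions, alpha=0.8):
    """Pick the pad hit by exponentially weighting recent ring predictions,
    with the most recent frame weighted highest."""
    scores = {}
    weight = 1.0
    for ring in reversed(predictions):  # newest first
        scores[ring] = scores.get(ring, 0.0) + weight
        weight *= alpha
    return max(scores, key=scores.get)

# Oldest to newest: the stick drifted from ring 1 into ring 2 before the hit.
pad = weighted_ring([1, 1, 2, 2, 2])
```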

We are on schedule and hope to continue at this healthy pace throughout the rest of the semester. Below is an image of the whole setup so far:

Team Status Report for 11/2

This week we made significant strides towards the completion of our project. Namely, we got the audio playback system to have very low latency and were able to get the BLE transmission to both work and have much lower latency. We think a significant reason why we measured so much latency earlier in HH 13xx was that many other project groups were using the same bandwidth, causing throughput to be much lower. Now, when testing at home, we see that the BLE transmission seems nearly instantaneous. Similarly, the audio playback module now operates with very low latency. This required a shift from using sounddevice to PyAudio and audio streams. Between these two improvements, our main bottleneck for latency will likely be storing frames in our frame buffer and continually doing object detection throughout the playing session.

This brings us to the design change we are now implementing. Previously we had planned to only do object detection to locate the tips of the drumsticks when an impact occurs; we’d read the impact and then trigger the object detection function to determine which drum ring the impact occurred in from the 20 most recent frames. However, we now plan to continuously keep track of the location of the tips as the user plays, storing the (x, y) location in a sliding window buffer. Then, when an impact occurs, we will already have the (x, y) locations of the tips for every recent frame, and can thus omit the object detection prior to playback, instead simply applying our exponential weighting algorithm to the stored locations.
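The sliding-window buffer itself is naturally a fixed-length deque; a sketch assuming a 20-frame window (matching the 20 most recent frames used in the old design):

```python
from collections import deque

WINDOW_FRAMES = 20  # assumed window depth

tip_history = deque(maxlen=WINDOW_FRAMES)  # one (x, y) tip location per frame

for i in range(25):            # simulate 25 frames of tracked tip locations
    tip_history.append((i, i))  # oldest entries fall off automatically
```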

This, however, brings us to our greatest risk: high latency for continuous object detection. We have not yet tested a system that continuously tracks and stores the location of the drumstick tips, so we can’t be certain what the latency will look like for this new design. Additionally, since we haven’t tested an integrated system yet, we don’t know whether the entire system will have good latency even if the individual components do, given the multiple synchronizations and data processing modules that need to interact.

Thus, a big focus in the coming weeks will be to incrementally test the latencies of partially integrated systems. First, we want to connect the BLE module to the audio playback module so we can assess how much latency there is without the object detection involved. Then, once we optimize that, we’ll connect and test the whole system, including the continual tracking of the drumstick tips. Hopefully, by doing this modularly, we can more clearly see which components are introducing the most latency and focus on bringing those down prior to testing the integrated system.

As of now, our schedule has not changed and we seem to be moving at a good pace. In the coming week we hope to make significant progress on the object tracking module as well as test a partially integrated system with the BLE code and the audio playback code. This would be pretty exciting since this would actually involve using drumsticks and hitting the surface to trigger a sound, which is fairly close to what the final product will do.

Team Status Report for 10/26

This week, our team mainly focused on solidifying and integrating our code.

  • Currently, the most significant risks we are facing are persistent Bluetooth and audio latency:

Currently, we are trying to determine a reliable estimate for the one-way Bluetooth latency, which would help us massively in determining how much leeway we have with other components (CV processing, audio generation, etc.). This is being done by first sending data to the laptop using an update from the ESP, then sending an update back using a handler in the host code. The one-way latency would then be half of the time elapsed from start to finish. However, this process is not as simple as it sounds in practice, as having a shared buffer accessed by the server/host and client introduces issues with latency and concurrency. This issue is being managed, however, as we still have time blocked out in our Gantt chart to work on data transmission. In a worst-case scenario, we would have to rely on direct/physical wiring rather than Bluetooth, but we believe this will not be necessary and that we just need a bit more time to adjust our approach.
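One way to sidestep the shared-buffer concurrency issues is a thread-safe queue between the update handler and the host logic. This loopback sketch (hypothetical structure, no real BLE) shows the start-to-finish timestamping the measurement relies on:

```python
import queue
import threading
import time

buf = queue.Queue()  # thread-safe stand-in for the shared buffer

def host_handler():
    """Echo handler: stamps each update, standing in for the host's response."""
    while True:
        item = buf.get()
        if item is None:  # sentinel shuts the handler down
            break
        item["echoed_at"] = time.perf_counter()

t = threading.Thread(target=host_handler)
t.start()
pkt = {"sent_at": time.perf_counter()}
buf.put(pkt)
buf.put(None)
t.join()
rtt_ms = (pkt["echoed_at"] - pkt["sent_at"]) * 1000.0  # one-way ≈ rtt_ms / 2
```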

Audio latency is also proving to be a slight issue, as we are having problems with overlapping sounds. In theory, each drumstick’s hit on a drum pad should generate a sound independently, rather than waiting for another sound to finish. However, we are currently experiencing the opposite: drum sounds wait for one another to finish, despite a thread being spawned for each. This situation, if not fixed, could introduce considerable latency into the response of our product. However, this is a relatively new issue, so we strongly believe it can be fixed within a relatively short amount of time once we reason about its cause.
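The behavior we want is summing overlapping buffers rather than serializing them; conceptually (mono float buffers, illustrative lengths):

```python
import numpy as np

def mix(sound_a, sound_b, offset):
    """Sum two mono buffers, with sound_b starting `offset` samples into
    sound_a, instead of waiting for sound_a to finish."""
    out = np.zeros(max(len(sound_a), offset + len(sound_b)), dtype=np.float32)
    out[:len(sound_a)] += sound_a
    out[offset:offset + len(sound_b)] += sound_b
    return out

a = np.ones(4, dtype=np.float32)
b = np.ones(4, dtype=np.float32)
mixed = mix(a, b, 2)  # the two hits overlap for two samples
```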

  • No changes were made to the existing design of our product. At the moment, we are mainly focused on trying to create solid implementations of each component in order to integrate & properly test them as soon as possible.
  • We have also not made any changes to our schedule, and are mostly on track.

 

Team Status Report for 10/19

In the week prior to fall break and throughout the week of fall break our team continued to make good progress on our project. Having received the majority of the hardware components we ordered, we were able to start the preliminary work of flashing the ESP32s with the Arduino code necessary to relay the accelerometer data to our laptop. Additionally, we made progress in our understanding and implementation of the audio playback module, and implemented the final feature needed for the webapp: a trigger to start the ring detection protocol locally.

Currently, the issues that pose the greatest risk to our team’s success are as follows:
1.) Difficulty in implementing the BLE data transmission from the drumsticks to the laptop. We know that writing robust code, flashing it onto the ESP32s, and processing the data in real time could pose numerous issues. First, implementing the code and flashing the ESP32 is a non-trivial task. Elliot has some experience in the area, but having seen other groups attempt similar things and struggle, we know this to be difficult. Second, since the transmission delay may vary from packet to packet, issues could easily arise when a packet takes far longer to transmit than others. Currently our mitigation strategy involves determining an average latency through repeated testing over various transmission distances. Once we have this average, it should encompass the vast majority of expected delay times. If a transmission falls outside of this range, we plan on simply discarding the late packets and continuing as usual.

2.) Drumstick tip detection issues. While it seems that using the cv2 contours function alongside a color mask will suffice to identify the locations of the drumstick tips, there is a fair amount of variability in detection accuracy given available lighting. While we currently think applying a strong color mask will be enough to compensate for lighting variability, in case it isn’t, we plan on adding a lighting fixture mounted alongside the camera on the camera stand to provide consistent lighting in every scenario.

3.) Audio playback latency. As mentioned in the previous report, audio playback can surprisingly introduce significant latency to the system (easily 50ms) when using standard libraries such as pygame. We are now using the sounddevice library instead, which seems to have brought latency down a bit. However, the issue is not as simple as reducing the sample buffer size: we have noticed through experimentation that certain sounds, even when their durations don’t vary, require higher buffer sizes than others. This is a result of both the sampling frequency used and the overall length of the data in the sound file. By using sounddevice and interacting directly with WASAPI (the Windows sound driver), we believe we can cut latency down significantly; if we can’t, we plan on using an external MIDI controller, which facilitates almost instantaneous sound I/O. These controllers are designed for exactly these types of applications and help circumvent the audio playback pipeline inherent in computers.
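The buffer-size tradeoff comes down to simple arithmetic: each buffer of N frames at sample rate f contributes at least N/f of output delay. For example:

```python
def buffer_latency_ms(frames_per_buffer, sample_rate_hz):
    """Lower bound on output latency contributed by one audio buffer."""
    return frames_per_buffer / sample_rate_hz * 1000.0

lat_512 = buffer_latency_ms(512, 44100)  # ~11.6ms at a CD-quality sample rate
lat_128 = buffer_latency_ms(128, 44100)  # ~2.9ms, but smaller buffers risk underruns
```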

The design of our project has not changed aside from the fact that we are now trying to use (and testing with) the sounddevice library as opposed to PyAudio. However, if sounddevice proves insufficient, we will revert and try employing PyAudio with ASIO. We are still on track with our schedule.

Below are the answers to the thought questions for this week.
A was written by Ben Solo, B was written by Elliot Clark, and C was written by Belle Connaught

A.) One of DrumLite’s main appeals is that it is a cost-effective alternative to standard electronic drum sets. As mentioned in the introduction of our design report, a low-end drum set can easily cost between $300 and $700, while a better one can go up to $2000. Globally, the cost of drum sets, whether acoustic or electronic, hinders people from partaking in playing the drums. DrumLite’s low cost (~$150) enables many more people to play the drums without having to worry that the cost isn’t justifiable for an entertainment product.
Furthermore, DrumLite makes sharing drum sets far easier. Previously, sharing a drum set between countries was virtually impossible, as you’d have to ship it back and forth or buy identical drum sets in order to have the same experience. But with DrumLite, since you can upload any .wav files to the webapp and use these as your sounds, sharing a drum set is trivial: you can just send an email with four .wav attachments, and the recipient can reconstruct the exact same drum set you had in minutes. DrumLite not only brings the cost of entry down, but encourages collaboration and the sharing of music on both a local and global scale.

B.) The design of this project integrates several cultural factors that enhance its accessibility, relevance, and impact across user groups. Music is a universal form of expression found in many cultures, making this project inherently inclusive by providing a platform for users to experience drumming without the need for expensive equipment. Given its highly configurable nature, the system can be programmed to replicate sounds from a variety of drums spanning various cultures, therefore enabling cross-cultural appreciation and learning. This project also holds educational potential, particularly in schools or music programs, where it could be used to teach students about different drumming traditions, encouraging cultural awareness and social interaction through drumming practices seen in other cultures. These considerations collectively make the DrumLite set not only a technical convenience but also a culturally aware and inclusive platform.

C.) DrumLite addresses a need for sustainable, space-efficient, and low-impact musical instruments by leveraging technology to minimize material use and environmental footprint. Traditional drum sets require numerous physical components such as drum shells, cymbals, and hardware, which involve the extraction of natural resources, energy-intensive manufacturing processes, and significant shipping costs due to their size and weight. By contrast, our project replaces bulky equipment with lightweight, compact components—two drumsticks with embedded sensors, a laptop, and four small rubber pads—significantly reducing the raw materials required for production. This not only saves on manufacturing resources but also reduces transportation energy and packaging waste, making DrumLite more environmentally-friendly.
In terms of power consumption, the system is designed to operate efficiently with the use of low-power ESP32 microcontrollers and small sensors like the MPU-6050 accelerometers. These components require minimal energy compared to traditional electric drum sets or amplification equipment, reducing the device’s carbon footprint over its lifetime.
DrumLite contributes to a sustainable musical experience by reducing waste and energy consumption, all while maintaining the functionality and satisfaction of playing a traditional drum set in a portable, tech-enhanced format.

Team Status Report for 10/5

For this week, our team worked mainly on the writeup for our design report to fully plan out our final product. We took the time to tackle a few edge cases from our initial blueprint, specifically focusing on the more nuanced details of our design requirements and implementation strategies so that we can better explain our architecture to any reader of the design report. Our schedule remains the same, with Ben developing the web application, Elliot handling the Bluetooth data processing, and Belle covering the computer vision computation onboard the host; we chose, however, to split this week’s stage of our design process differently, with each member focusing on a specific section of the report. We delegated the introduction and requirements to Ben, the architecture and implementation to Elliot, and the testing and tradeoffs to Belle. We decided that this would result in a more well-rounded final product by giving each team member an opportunity to view the project from a holistic perspective before we begin to integrate our modules together. Having each team member dive into other components of the block diagram brought up a few potential concerns we hadn’t considered prior, each of which we then created a mitigation plan for. Some details we worked out this week included the following:

1.) 30mm scalability requirement: As outlined in our proposal and design presentations, one of our use case requirements is to provide the user a 30mm error zone to account for the rubber drumheads deviating from their original position upon impact from the drumsticks. The design requirement we mapped to it for traceability involved deriving a fixed scaling factor to apply to the gathered radii upon detection with the HoughCircles library.  We realized, however, that a single scaling factor across all four drums would not achieve a constant 30mm margin for each drum (as they differ in size), and that the relative diameters in pixels between the drumheads would not be sufficient to determine a scaling factor (an absolute metric is required if our solution is to be applicable for varying camera heights). Hence, our new implementation is to store the absolute sizes of each ring within an internal array and scale based on these known sizes. We can then detect the rings based on their relative sizes, map them to their stored dimensions, and apply a simple separate scaling factor to the radii accordingly. This will prove to be a less error-prone approach as opposed to a purely relative solution where we may have encountered issues if the user did not place all rings in view of the camera, or if the camera was too far from the table to detect small variances in the diameters.
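That known-size mapping plus per-pad margin can be sketched as follows; the stored radii are illustrative placeholders, not our actual pad dimensions:

```python
KNOWN_RADII_MM = [100.0, 125.0, 150.0, 175.0]  # hypothetical absolute pad radii

def scaled_radii_px(detected_radii_px, margin_mm=30.0):
    """Map detected circles to their known physical sizes (by rank) and grow
    each by a 30mm margin converted into that pad's own pixel scale."""
    out = []
    for r_px, r_mm in zip(sorted(detected_radii_px), sorted(KNOWN_RADII_MM)):
        px_per_mm = r_px / r_mm          # absolute scale, independent of camera height
        out.append(r_px + margin_mm * px_per_mm)
    return out

# Detected at a height where every pad spans 2 px per mm.
scaled = scaled_radii_px([200.0, 250.0, 300.0, 350.0])
```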

2.) Reliability of BLE packet transmission: Another one of our use case requirements was to ensure a reliable connection within 3m of the laptop, for which we decided to aim for a packet loss of under 2%. Given our original research on the Bluetooth stack and the specifications for the ESP32’s performance, we figured that 2% would be a very reasonable goal. With the second microcontroller also transmitting accelerometer data, however, we run the risk of interference and packet loss, for which we had not developed a mitigation plan. This week, Ben searched for options to lower the packet loss in the event that we do not meet this requirement, eventually landing on the solution of raising the connection interval. Elliot then explored the firmware libraries available in Arduino and confirmed our ability to increase the connection interval with the host device at the cost of over-the-air latency.

3.) Audio output delay: One element we completely overlooked was the main thread’s method of playing audio files, for which we chose to use the pygame mixer. This week, however, our team discovered that this library introduces an unacceptable amount of output latency, so we decided to pivot to PyAudio, which is optimized with smaller audio buffers to achieve a much lower processing delay.

4.) Camera specifications: This week, while exploring strategies to most efficiently deploy our computer vision model, we evaluated the effect that a 120 degree field of view camera would have on our CV calculations. We found that wide angle cameras could potentially introduce a form of optical distortion, resulting in stretched pixels and slightly elliptical drumheads, and therefore less precise detection altogether under our framework. We also came to a decision regarding our sliding window, where we chose to now take 0.33 seconds worth of frames before the relevant accelerometer timestamp, since anything higher could lead to potentially false readings. Given these new requirements, we set out to find a high framerate, approximately 90 degree FOV camera, for which we plan to make an order early next week. Below is a diagram we created to help us map out how we’ll use this new field of view:
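The 0.33-second window translates to a frame count once the camera framerate is fixed; assuming a 60fps camera (an assumption here, since the exact camera is still being selected):

```python
import math

def window_frames(window_s=0.33, fps=60):
    """Number of frames buffered to cover the pre-impact sliding window."""
    return math.ceil(window_s * fps)

n = window_frames()  # 0.33s at an assumed 60fps
```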

Next week we plan to stay on schedule and begin working with the physical components we ordered. By Friday, we intend to have a complete design report describing our requirements, strategies, and conscious design decisions in creating our CV-based drumset.

Team Status Report for 9/28

This week our team focused heavily on preparing for the design presentation. That meant not only continuing to build up our ideas and concepts for DrumLite, but also starting to develop base code for initial experimentation, proof of concept, and eventual testing. We initially struggled with deriving design requirements based on our use case requirements. At first we could not articulate the difference between the two, but we concluded that while use case requirements are somewhat black-boxed and focus on how the product needs to behave/function, design requirements focus on what must be met implementation-wise in order to achieve the use case requirements.

We then developed our design requirements and related each one directly to a use case requirement. In doing so, we also added 3 new use case requirements, indicated with * below. These were as follows:
1.) (use case) Drum ring detection accuracy within ≤30 mm of actual placement.

(design) Dynamically scale the detected pixel radii to match the actual ring diameters + 30 mm of margin

2.) (use case) Audio response within ≤100ms of drumstick impact.

(design) BLE <30 ms, accelerometer @ 1000 Hz, CV operating under 60 ms (~15 fps)

3.) (use case) Minimum layout area of 1295 cm² (37 cm x 35 cm).

(design) For any given frame, the OpenCV algorithm will provide a proximity check to ensure the correct choice across a set of adjacent rings

4.) (use case*) Play the sound of the correct drum >= 95% of the time.

(design) Average stick location across all processed frames, with exponential weighting α = 0.8 on processed frames.

5.) (use case*) BLE transmission reliability within 10 ft of the laptop.

(design) <=2% BLE packet loss within 10 ft radius of the laptop.

6.) (use case*) Machined drum sticks under 200g in weight.

(design) ESP32: 31 g, MPU-6050: 2 g, drumstick: 113.4 g -> ensure connective components weigh under 53.6 g.
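The arithmetic behind requirements 2, 4, and 6 can be captured in a few lines (a sketch; the audio/scheduling headroom is derived from the other two figures rather than separately specified):

```python
def remaining_latency_ms(total_ms=100, ble_ms=30, cv_ms=60):
    # Headroom left for audio output and scheduling once BLE and CV
    # spend their shares of the 100 ms budget (requirement 2).
    return total_ms - ble_ms - cv_ms

def ema_update(prev: float, new: float, alpha: float = 0.8) -> float:
    # Exponentially weighted stick position across frames (requirement 4).
    return alpha * new + (1 - alpha) * prev

def connective_budget_g(limit_g=200, esp32_g=31, mpu_g=2, stick_g=113.4):
    # Weight available for connective components (requirement 6).
    return round(limit_g - esp32_g - mpu_g - stick_g, 1)
```

With these numbers, only 10 ms of the latency budget remains after BLE and CV, and 53.6 g remains for connective hardware, which is why both figures appear verbatim in the design requirements above.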

Individually, we each focused on our own subdivision of the project in order to either lay some groundwork or better understand the technologies we're working with.

Ben: Worked on a functioning, locally hosted webapp with a UI that allows for drum set configuration/customization, stores sound files and drum set configurations (MySQL), and exposes an endpoint to interact with a user's locally running server. He also built a functioning local server (also using Flask) to receive and locally store sound files for quick playback. Both parts are fully operational and successfully communicate with one another. The next task is to integrate the stored sound files with the image and accelerometer processing code that also runs locally. Finally, the webapp will need to be deployed and hosted. The risk for this component is that deploying the webapp may affect its ability to communicate with the local server, as both currently run on localhost.
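A minimal sketch of the local sound-file endpoint (the route name, payload key, and allowed extensions here are assumptions for illustration, not the actual implementation):

```python
import os

def is_allowed_sound(filename, allowed=(".wav", ".mp3")):
    # Hypothetical validation helper for uploaded sound files.
    return filename.lower().endswith(tuple(allowed))

def create_upload_app(storage_dir="sounds"):
    # Local Flask server that receives and stores sound files for quick
    # playback. Returns None when Flask is unavailable.
    try:
        from flask import Flask, request
    except ImportError:
        return None
    app = Flask(__name__)

    @app.route("/upload", methods=["POST"])
    def upload():
        f = request.files["sound"]
        if not is_allowed_sound(f.filename):
            return "unsupported file type", 400
        os.makedirs(storage_dir, exist_ok=True)
        f.save(os.path.join(storage_dir, f.filename))
        return "ok", 200

    return app
```

Keeping the server local is what makes quick playback possible: the audio files are on disk before a hit ever occurs, so playback never waits on the network.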

Belle: Focused on developing a testing setup for our object detection system. To determine the average time required for OpenCV object detection to identify the location of a drumstick tip, Belle created a MATLAB script that draws 4 rings of varying size and moves a red dot (simulating the tip of a drumstick) around them. We will use a screen capture of this animation to determine a.) the amount of time it takes to process a frame, and b.) subsequently, the number of frames we can afford to pass to the object detection module after detecting a hit. Currently, she has a Python script using OpenCV that successfully identifies the location of the dot as it moves around within the 4 rings. The next step is to come up with timing metrics for object detection per frame.
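The per-frame timing metric can be measured with a small harness like this (the detector callable stands in for the actual OpenCV pipeline, which is not shown here):

```python
import time

def ms_per_frame(detect, frames):
    # Average wall-clock detection time per frame, in milliseconds.
    start = time.perf_counter()
    for frame in frames:
        detect(frame)
    return 1000.0 * (time.perf_counter() - start) / len(frames)

def affordable_frames(budget_ms: float, per_frame_ms: float) -> int:
    # How many frames fit inside the CV share of the latency budget.
    return int(budget_ms // per_frame_ms)
```

For instance, if detection turns out to take 4 ms per frame, the 60 ms CV budget from our design requirements admits 15 frames per hit; the measured figure from the screen-capture test will replace that placeholder.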

Elliot: Elliot spent much of his time preparing for the design presentation, fleshing out our ideas fully and figuring out how best to explain them. In addition, he worked out how we will communicate with the MCU and accelerometer contained in the drumsticks via Python. He also looked into BluetoothSerial for interfacing with the ESP32, and confirmed that we can use BLE for sending the accelerometer data and USB for flashing the microcontroller. Finally, he identified a BLE simulator that we plan to use both for testing purposes and for preliminary research. This simulator accurately models how the ESP32 will behave, including latency, packet loss, and so forth.

In regard to the questions pertaining to the safety, social, and economic factors of our project, these were our responses (A was written by Belle, B by Elliot, C by Ben):

Part A: Our project takes both public health and safety into consideration, particularly with regard to the cost and hazards of traditional drum sets. By focusing on mobility, the design enables users to play anywhere they have a surface for the drum rings, effectively removing the space limitations often encountered with standard drum sets, and offering a more affordable alternative that lowers financial barriers to musical engagement. This flexibility empowers individuals to engage with their music in diverse environments, fostering a sense of freedom and creativity without having to worry about transporting heavy equipment or space constraints. Additionally, the lightweight, compact nature of the rings means users can play without concerns of drums falling, causing injury, or damaging surrounding objects, promoting an experience where physical well-being and ease of use are key.

Part B: Our CV-based drum set makes music creation more accessible and inclusive. This project caters to individuals who may not have access to physical drum sets due to space constraints, enabling them to engage in music activities without needing traditional instruments. The solution promotes the importance of music as a means of social connection, self-expression, and well-being. The project also helps foster inclusivity by being adaptable to different sound sensitivities, as you can adjust the sound played from each drum. By reducing the barrier to entry for using drum equipment, we aim to introduce music creation to new audiences.

Part C: As stated in our use case, drum sets, whether electronic or acoustic, are very expensive (easily upwards of $400). This high cost barrier greatly limits the number of people who are able to engage in playing the drums. We have calculated the net cost of our product at right around $150, roughly a quarter of the price of even an inexpensive drum set. The reason for our low cost is that the components of a physical drum set are much more expensive. Between a large metal frame, actual drums/drum pads, custom speakers, and a brain module to control volume and customization, the cost of an electronic drum set skyrockets. Our project leverages the fact that we don't need drum pads, sensors, a frame, or a brain module; it just needs our machined sticks, a webcam, and access to the webapp. This provides access to drum sets for many individuals who would otherwise not be able to play the drums.

We are currently on schedule and hope to receive our ordered parts soon so we can start testing and experimenting with the actual project components, as opposed to the various simulators we've been using thus far. Below are a few images showing some of the progress we've made this week:

(The webapp UI -> link to view image here)

(The dot simulation)