Belle’s Status Report for 12/7

This week was mainly spent working on our final poster, as well as helping to integrate Elliot's changes to the Bluetooth code into the current codebase (my final presentation on Monday went quite well, but I claimed multiple times that Elliot's changes would fix most, if not all, of our dual-stick lag issues, so it was important to follow through). I also created a couple of graphics for the poster and cleaned it up overall, referencing posters that did well in previous years so that our upcoming presentations can go as smoothly as possible.

Since we are still supposed to be integrating all of our components and making final changes/tweaks to our system, I believe we are on track. Next week, I hope to finish the final poster and video, and to practice my presentation for both the TechSpark Expo and the 18500 showcase.

Elliot’s Status Report for 12/7

This week was spent improving and testing the Bluetooth code to verify its performance. Following last week's changes, there appeared to be some lingering interference between the two sticks (~100ms delay) when playing together. Paired with the CV processing, I concluded that this was not acceptable latency, so I changed the firmware and host code to run BT Classic rather than BLE. The main changes involved identifying serial ports for each of our ESPs, as well as advertising and sending data as a byte array with the BluetoothSerial library rather than notifying with BLEDevice. The laptop code was relatively simple to implement, considering I only needed to work with serial input, similar to UART. Following these changes, I set up a test to verify that the interference between the two sticks was lowered: I recorded timestamps between when the accelerometer data was first received and when the sound output played, then used matplotlib to overlay the delays of the two sticks and compare their relative performance.
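To make the measurement concrete, here is a minimal sketch of the host-side reader, assuming each ESP32 enumerates as a serial port read via pyserial; the port names and packet size are hypothetical.

```python
import time
import serial  # pyserial

# Hypothetical port names; the actual ports are found by scanning for the ESPs.
PORTS = {"green": "/dev/cu.ESP32_GREEN", "blue": "/dev/cu.ESP32_BLUE"}

def read_stick(port):
    conn = serial.Serial(port, baudrate=115200, timeout=0.01)
    delays_ms = []
    while True:
        packet = conn.read(6)            # one accelerometer reading per packet
        if not packet:
            continue
        t_recv = time.perf_counter()     # timestamp: data first received
        # ... hit detection and audio trigger happen here ...
        t_play = time.perf_counter()     # timestamp: sound output dispatched
        delays_ms.append((t_play - t_recv) * 1000.0)  # later overlaid in matplotlib
```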

Test Results (<6ms delay for both drumsticks)

I’m satisfied with these results, since this was our most persistent issue and both sticks are now operating at similar performance. It also gives us an opportunity to highlight the design tradeoffs between lower power consumption and reduced interference on the 2.4GHz band for BLE and BT Classic, respectively. I believe our team is on track to have a fully working project before the demo date, and I plan to spend this week extensively testing the system in various environments and conditions to locate any edge-case bugs.

Ben Solo’s Status Report for 12/7

Following our final presentation on Monday, I spent this week working on integrations, making minor fixes to the system all around, and working with Elliot to resolve our lag issues when using two drumsticks. I also updated our frontend to reflect the actual way our system is intended to be used. Lastly, I made a modification to the object detection code that prevents swinging motions outside of the drum pads from triggering an audio event. I'll go over my contributions this week individually below:
1.) I implemented a sorting system that correctly maps the detected drum pads to the corresponding sounds via the detected radius. The webapp allows the user to configure their sounds based on the diameter of the drum pads, so it's important that when the drum pads are detected at the start of a playing session, they are correctly assigned to the loaded sounds, instead of being paired based on the arbitrary order in which they are encountered. This entailed writing a helper function that scales and orders the drums based on their diameters, organizing them in order of ascending radius, as sketched below.
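A minimal sketch of that helper, assuming detections arrive as (x, y, r) tuples and sounds are configured from smallest pad to largest:

```python
def assign_sounds(detected_pads, sounds):
    """Pair detected drum pads with sounds in order of ascending radius.

    detected_pads: list of (x, y, r) circles from the detection script
    sounds: list of sound file paths, ordered smallest pad to largest
    """
    ordered = sorted(detected_pads, key=lambda pad: pad[2])  # sort by radius
    return list(zip(ordered, sounds))
```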

2.) Elliot and I spent a considerable amount of time on Wednesday trying to resolve our two-stick lag issues. He had written some new code to be flashed to the drumsticks that should have eliminated the lag by fixing an error in our initial code, where we were at times sending more data than the connection window could handle. For example, we were trying to send readings every millisecond but only had a window of 20ms, which resulted in received notifications being queued and processed after some delay. We had some luck with this new implementation, but after testing the system multiple times, we realized that the performance seemed to be deteriorating with time. We are now under the impression that this is a result of diminishing power in our battery cells, and are planning on using fresh batteries to see if that resolves the issue. Furthermore, during these tests I realized that our range for what constitutes an impact was too narrow, which often resulted in a perceived lag because audio wasn't being triggered following an impact. To resolve this, we tested a few new values, settling on a range of 0.3 m/s^2 to 20 m/s^2 as a valid impact (see the sketch below).
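A sketch of the widened impact window; the helper name is ours, but the bounds match the values we settled on:

```python
IMPACT_MIN = 0.3   # m/s^2: below this, treat the motion as noise
IMPACT_MAX = 20.0  # m/s^2: above this, treat the reading as a spurious spike

def is_valid_impact(delta_accel):
    """delta_accel: low-to-high difference between two successive readings."""
    return IMPACT_MIN <= abs(delta_accel) <= IMPACT_MAX
```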

3.) Our frontend had a number of relics from our initial design which either needed to be removed or reworked. The main two issues were that the drum pads were being differentiated via color as opposed to radius, and that we still had a button saying "identify my drum set" on the webapp. The first issue was resolved by changing the radii of the circles representing the drum pads on the website and ordering them in a way that corresponds with how the API endpoint orders sounds and detected drum pads at the start of the session. The second issue, regarding the "identify drum set" button, was easily resolved by removing the button that triggers the endpoint for starting the drum pad detection script. The code housed in this endpoint is still used, but instead of being triggered via the webapp, we set our system up to run the script as soon as a playing session starts. I thought this design made more sense and made the experience of using our system much simpler by eliminating the need to switch back and forth between the controller system and the webapp during a playing session. Below is the current updated frontend design:

4.) As it stood prior to this week, our object tracking system had a major flaw which had previously gone unnoticed: when the script identified that the drumstick tip was not within the bounds of one of the drum rings, it simply continued to the next iteration of the frame processing loop, without updating our green/blue drumstick location prediction variables. This resulted in the following behavior:
a.) impact occurs in drum pad 3.
b.) prediction variable is updated to index 3 and sound for drum pad 3 plays.
c.) impact occurs outside of any drum pad.
d.) the prediction variable is not updated with a -1 value and thus the last known drum pad's corresponding sound plays, i.e. sound 3.
This is an obvious flaw that causes the system to register invalid impacts as valid ones and play the previously triggered sound. To remedy this, I changed the CV script to update the prediction variables with the -1 ("not in a drum pad") value, and updated the sound player module to only play a sound if the index provided is in the range [0, 3] (i.e., one of the valid drum pad indices). Our system now only plays a sound when the user hits one of the drum pads, playing nothing if the drumstick is swung or strikes outside of a drum pad. A sketch of the fix follows.
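A sketch of the fix, with hypothetical names; the important parts are that the prediction always gets updated and that playback is guarded by the valid index range:

```python
NOT_IN_PAD = -1

def locate_tip(tip_xy, pads):
    """Return the index of the pad containing the tip, else NOT_IN_PAD."""
    for i, (cx, cy, r) in enumerate(pads):
        if (tip_xy[0] - cx) ** 2 + (tip_xy[1] - cy) ** 2 <= r ** 2:
            return i
    return NOT_IN_PAD  # previously this case silently kept the old index

def on_impact(pred_index, sounds, play):
    """Only trigger audio for a valid pad index (0..3)."""
    if 0 <= pred_index <= 3:
        play(sounds[pred_index])  # hits outside every pad now play nothing
```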

I also started working on our poster, which needs to be ready a little earlier than anticipated given our invitation to the TechSpark Expo. This entails collecting all our graphs/test metrics and figuring out how to present them in a way that conveys meaning to the audience given our use case. For instance, I needed to figure out how to explain that a graph showing 20 accelerometer spikes verifies our BLE reliability within a 3m radius of the receiving laptop.

Overall, I am on schedule and plan on spending the coming weekend/week refining our system and working on the poster, video, and final report. I expect to have a final version of the poster done by Monday so that we can get it printed in time for the Expo on Wednesday.

Team Status Report for 12/7

In the past week our team made significant progress towards completing DrumLite. As was previously the case, the one issue we are still encountering, which at this point is the only real threat to the success of our project, is laggy performance when using two drumsticks simultaneously. Currently, Elliot has written and flashed new firmware to the ESP32s, which we very much hope will resolve the issue. We plan on testing and adjusting our implementation this weekend in preparation for the TechSpark Engineering Expo on Wednesday. Our progress is on schedule, and in the coming days we aim to complete our entire integrated system, construct our poster, and start the demo video, so as to allow sufficient time around demo day to prepare and write the final report. Additionally, we will conduct further tests highlighting the tradeoffs we made for our system, which will be included in our poster and final report.

The unit and system tests we’ve conducted so far are as follows:
1.) Ensure that the correct sound is triggered 90% of the time. This test was conducted by running our full system (CV, pad location, BLE, and audio playback) and performing hits on the 4 drum pads sequentially in the order 1, 2, 3, 4. For each impact, the expected audio and actual audio were recorded. Of the 100 samples taken, 89 impacts triggered the correct sound. While this is fairly close to the 90% goal, it led us to realize that we needed a tool allowing the user to select a specific HSV color range to accommodate varying ambient lighting. Our new design prompts the user to tune the HSV range using HSV sliders and a live video feed of the color mask being applied at the start of each session.

2.) Verify that the overall system latency is below 100ms. This test was conducted by recording a video of a series of hits, then analyzing the video and audio tracks to determine when an impact occurred and when the sound was played. The difference between these two timestamps was recorded for each impact, and the average was found to be 94ms. This average is, however, inflated by one outlier in the data, which had a latency of 360ms. Through further testing we realized that this was a result of our BLE module queueing data when the system was overloaded, and thus triggering an audio event whenever the impact was read from the queue. Elliot thus began implementing new firmware for the microcontrollers that adjusts the connection interval and notification timing to prevent the queuing of events in most cases.

3.) Verify that the latency of the CV module is less than 60ms per frame. This test was conducted by timing each iteration of our CV loop, each of which processes a single input frame. The values were plotted and averaged, resulting in a mean latency of 33.2ms per frame. A sketch of the timing harness follows.
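A sketch of the timing harness, assuming process_frame, cap, and the capturing flag come from our CV module:

```python
import time
import matplotlib.pyplot as plt

latencies_ms = []
while capturing:                          # loop flag from our CV module
    ok, frame = cap.read()                # cap: the OpenCV webcam capture
    start = time.perf_counter()
    process_frame(frame)                  # one full CV iteration
    latencies_ms.append((time.perf_counter() - start) * 1000.0)

plt.plot(latencies_ms)
plt.axhline(60, linestyle="--")           # the 60ms requirement
plt.ylabel("CV latency per frame (ms)")
plt.show()
print(sum(latencies_ms) / len(latencies_ms))  # mean: ~33.2ms in our runs
```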

4.) Verify BLE reliability within a 3m range of the user's laptop. This test was conducted by recording accelerometer data for 20 impacts per drumstick at a distance of 3m from the laptop. By visually inspecting the graph of the accelerometer data, we were able to see that each of the 20 impacts was clearly identifiable and no data was lost.

5.) Verify that the accelerometer read + data processing delay is sub-5ms. To conduct this test, we placed two timers in our code: one just before the accelerometer data was read, and the other just before an audio event was triggered. About 100 readings were recorded, plotted, and averaged, resulting in an average delay of 5ms per impact. The timer placement is sketched below.
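The timer placement, sketched with our own (hypothetical) handler and helper names:

```python
import time

delays_ms = []

def on_accel_packet(packet):
    t_read = time.perf_counter()           # timer 1: just before the read
    reading = parse_accel(packet)          # parse_accel: our decoding helper
    if detect_hit(reading):
        t_trigger = time.perf_counter()    # timer 2: just before audio event
        delays_ms.append((t_trigger - t_read) * 1000.0)
        trigger_audio(reading)
```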

6.) Ensure less than 2% BLE packet loss. This test was conducted by creating counters which track packets sent and packets received for each of the two drumsticks. By verifying that after execution both counters match (or are within 2% of each other), we were able to ascertain a packet loss rate of ~2%. The host-side half of the test is sketched below.
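The host-side half of the test, as a minimal sketch using a bleak-style notification handler (the firmware increments its own counter before each notify):

```python
received = 0

def notification_handler(sender, data):
    """Counts every packet that actually arrives from one drumstick."""
    global received
    received += 1

# After the run, compare against the firmware's "sent" counter:
# loss = (sent - received) / sent  ->  ~2% in our test
```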

7.) Verify BLE round-trip latency below 30ms. To conduct this test, we started a timer on the host laptop, sent a notification to the ESP32s, and measured the time at which the response was received back at the laptop. By averaging these recordings, we determined an average RTT of 17.4ms, meaning the one-way latency is about 8.7ms. A sketch of the measurement follows.
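A sketch of the measurement using bleak; the characteristic UUIDs are placeholders:

```python
import asyncio
import time
from bleak import BleakClient

async def measure_rtt(address, tx_uuid, rx_uuid, n=50):
    rtts_ms, pending = [], {"evt": None}

    def handler(sender, data):
        if pending["evt"] is not None:
            pending["evt"].set()              # the ESP32's response arrived

    async with BleakClient(address) as client:
        await client.start_notify(rx_uuid, handler)
        for _ in range(n):
            pending["evt"] = asyncio.Event()
            t0 = time.perf_counter()
            await client.write_gatt_char(tx_uuid, b"ping")
            await pending["evt"].wait()
            rtts_ms.append((time.perf_counter() - t0) * 1000.0)
    return sum(rtts_ms) / len(rtts_ms)        # ~17.4ms; one-way ~= RTT / 2
```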

8.) Weight requirements (drumsticks combined less than 190.4g and connective components below 14.45g). To verify these requirements, we simply weighed the connective components (i.e. wires and athletic tape) as well as the complete drumsticks. The two drumsticks weighed a collective 147g and the connective components 5.1g, both well below our requirements.

9.) Size requirements (minimum achievable layout area of 1644cm^2). To verify this, we configured the 4 drum pads in their minimum layout (all tangent to one another) and measured the height and width at their maximum dimensions. As expected, the layout measured 1644cm^2.

Overall we are making good progress, are on schedule, and hope to complete our project this weekend so we can take our time in preparation for the Expo on Wednesday and get a head start on some of the final project submissions in the coming week.

Ben Solo’s Status Report for 11/30

Over the last week(s), I've focused my time on both creating a tool for dynamically selecting/tuning precise HSV values and conducting tests on our existing system in anticipation of the final presentation and the final report. My progress is still on schedule with the Gantt chart we initially set for ourselves. Below I will discuss the HSV tuning tool and the tests I conducted in more detail.
As we were developing our system and testing the integration of our CV module, BLE module, and audio playback module, we quickly realized that finding uniform and consistent lighting was pretty much unattainable without massively constraining the dimensions to which the drum set could be scaled. In other words, if we used a set of fixed lights to try and keep lighting consistent, it would a.) affect how portable the drum set is and b.) limit how big you could make the drum set, as a result of the loss in lighting uniformity the further the drum pads move from the light source. Because the lighting massively impacts the HSV values used to filter and detect the drumstick tips in each frame, I decided we needed a tool that allows the user to dynamically change the HSV values at the start of the session so that a correct range can be chosen for each unique lighting environment. When our system controller (which starts and monitors the BLE, CV, and audio threads) is started, it initially detects the rings' locations, scales their radii up by 15mm, and shows the result to the user so they can verify the rings were correctly detected. Thereafter, two tuning scripts start, one for blue and one for green. In each case, two windows pop up on the user's screen: one with 6 sliding toolbar selectors and another with a live video feed from the webcam with the applied blue or green mask over it. Ideally, when the corresponding blue or green drumstick is held under the camera, only the colored tip of the drumstick should be highlighted. However, since lighting varies a lot, the user now has the ability to alter the HSV range in real time and see how this affects the filtering. Once they find a range that accurately detects the drumstick tips and nothing else, the user hits enter to close the tuning window and save those HSV values for the playing session. A condensed sketch of the tool appears below, followed by a GIF of the filtered live feed window; the GIF shows how initially the HSV range is not tuned precisely enough to detect just the tip of the drumstick, but how, once the correct range is selected, only the moving tip of the drumstick is highlighted.
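A condensed sketch of the tool for one color, using OpenCV trackbars (window and slider names are ours):

```python
import cv2

BARS = ["H lo", "S lo", "V lo", "H hi", "S hi", "V hi"]
MAXES = [179, 255, 255, 179, 255, 255]

def tune_hsv(cap, window="HSV tuner"):
    cv2.namedWindow(window)
    for name, mx in zip(BARS, MAXES):
        start = 0 if name.endswith("lo") else mx
        cv2.createTrackbar(name, window, start, mx, lambda v: None)
    while True:
        ok, frame = cap.read()
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        lo = tuple(cv2.getTrackbarPos(n, window) for n in BARS[:3])
        hi = tuple(cv2.getTrackbarPos(n, window) for n in BARS[3:])
        mask = cv2.inRange(hsv, lo, hi)
        cv2.imshow(window, cv2.bitwise_and(frame, frame, mask=mask))
        if cv2.waitKey(1) == 13:   # Enter: save this range for the session
            cv2.destroyWindow(window)
            return lo, hi
```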

Following building this tool, I conducted a series of verification and validation tests on parts of the system which are outlined below:

1.) To test that the correct sound was being triggered 90% of the time, I ran our whole system and played drums 1, 2, 3, 4 in that order repeatedly, 12 impacts at a time, eventually collecting 100 samples. For each impact, I recorded which sound was played. I then found the percentage of impacts for which the drum pad hit corresponded correctly to the sound played, and found this value to be 89%.

2.) To verify that the overall system latency was below 100ms, I recorded a video of myself hitting various drum pads repeatedly. I then loaded the video into a video editor and split the audio from the video. I could then identify the time at which each impact occurred by analyzing the video, and identify when the audio was played by finding the corresponding spike in the audio track. I recorded the difference between impact time and playback time for each of the impacts and found an average overall system latency of 94ms. While this is below the threshold we set out to meet, most impacts actually have a far lower latency; the data was skewed by one recording which had ~360ms of latency.

3.) To verify that our CV module was running in less than 60ms per frame, I used matplotlib to graph the processing time for each frame and found the average value to be 33.2ms per frame. The graph is depicted below. 

I conducted several other more trivial tests, such as finding the weight of the drumsticks, the minimum layout dimensions, and verifying that the system can be used reliably within a 3m range of the laptop, all of which yielded expected results as outlined in our use case and design requirements.

In response to the question of what new tools or knowledge I've had to learn to progress through our capstone project, I'd say that the two technologies I had to learn about and implement were CV (via OpenCV and skimage) and audio playback streaming (via pyAudio). I had never worked with either of them before, so it definitely took a lot of learning before I was able to write any strong, working code. For CV, I'd say I probably learned the most from reading other people's initial CV code (especially Belle's). Her code used all the techniques I needed for building my HSV range selection tool as well as the module I wrote for initially detecting the locations and radii of the drum pads. I read through her code as well as various forums such as Stack Overflow whenever I encountered issues, and was able to learn all I needed in order to implement both of these modules. In the case of audio playback streaming, I'd say I learned mostly through trial and error and reading Stack Overflow. I probably went through 6 iterations of the playback streaming module before I found a solution with low enough playback latency. Because many other applications, such as drum machines and electronic synthesizers, encounter many of the same issues I was facing when trying to develop an efficient playback module, there was a large amount of information online, whether regarding pyAudio streaming specifically or general concepts in low-latency audio playback (such as preloading audio frames, or initializing audio streams prior to the session). The final approach is sketched below.
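For illustration, a minimal sketch of those two ideas together: one output stream opened before the session, and each sample's frames decoded once at startup (file names and stream parameters here are placeholders assuming 16-bit 44.1kHz stereo WAVs):

```python
import wave
import pyaudio

pa = pyaudio.PyAudio()
# One stream, opened before the session; a small buffer keeps latency low.
stream = pa.open(format=pyaudio.paInt16, channels=2, rate=44100,
                 output=True, frames_per_buffer=256)

def preload(path):
    """Decode a WAV file to raw PCM once, at startup."""
    with wave.open(path, "rb") as wf:
        return wf.readframes(wf.getnframes())

sounds = [preload(p) for p in ("kick.wav", "snare.wav", "hihat.wav", "tom.wav")]

def play(index):
    stream.write(sounds[index])  # a hit now only costs a blocking write
```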

In the coming week, the most pressing goal is to determine why playing with two connected drumsticks at once is resulting in such a drop in performance. Once we figure out what the issue is, I hope to spend my time implementing a system that can reliably handle two drumsticks at once. Additionally, I hope to start working on either the poster or the video, to alleviate some stress in the coming two weeks before capstone ends.

Elliot’s Status Report for 11/30

This week I worked more on the system's Bluetooth component and gathered verification metrics in preparation for our final presentation. One of the team's design requirements is to ensure a packet loss under 2% for the BLE, so I performed a packet counting test between the MCUs and the laptop. The strategy was to have a counter incremented before notifying the central device of incoming data, and conversely a separate counter incremented upon entry to that ESP's corresponding notification handler. I ran this test for five minutes with the two sticks communicating simultaneously, and by taking the difference of the two counters I arrived at a packet loss of 2.01 percent (26,815 packets received vs. 27,364 packets sent). This was a surprisingly high data loss for our use case, leading me to conclude that our issues with latency were most likely stemming from the wireless transmission. Looking back at my firmware implementation, the latent drumstick would delay for a few seconds, then output a stream of data like this:

This was ultimately a hint that the ESP was forced to retransmit as well as queue packets across multiple connection intervals. After reading more about the Bluetooth stack, I realized that a fixed connection interval of 7.5ms was too short to allow the central device to schedule events, resulting in packet collisions between the two boards. I also found that sending multiple notifications to the laptop as quickly as possible would overwhelm the event processing queue and cause it to fall behind in timing (similar to our struggles with the audio output). The solution was to raise the connection interval to 20ms to allow more schedulability between the devices, and to raise the notification intervals from 1ms up to 21 and 23ms, staggering them to further prevent queue congestion. This led to a much smoother response between the two drumsticks, and the safer approach did not seem to have a noticeable impact on performance.

One skill I've picked up while working on this capstone project is quickly reading through online documentation for relevant information. In order to make our Bluetooth, OpenCV, and multithreaded audio modules cooperate, I've read everything from online tutorials by Nordic Semiconductor and web articles on GeeksforGeeks to the raw datasheets for our microcontrollers. I've also learned to take as much input as possible from people with experience, such as the teaching staff and faculty, which has made the setbacks we've encountered much more manageable.

This week, I plan to help further optimize the system’s CV and Bluetooth processing. The problems we currently face are the HSV lighting inconsistencies along with a dip in performance when drumsticks are in view of the camera. I believe we’re still on track with our schedule, although we may be approaching significant design tradeoff decisions to be able to bring down the response time.

Belle’s Status Report for 11/30

This week, I focused on preparing the slides for the final presentation (incorporating the results from our verification and validation tests) and contributed to the drumstick detection portion of our project.

The former involved organizing and presenting the data in a way that highlights how our project meets its use case and design requirements, as well as practicing the general flow of how I would present the relevant tests (there are many of them, but not much time is allotted for each presentation, so I have to be concise).

As for the drumstick detection, one key aspect of our design was the use of exponential weighting to account for latency when the video frame taken at the moment of an accelerometer impact did not reflect the correct position of the drumstick tip (i.e., it would show the drumstick tip as being in the previous drum’s boundary, rather than the drum that was actually hit). This was particularly a concern because of the potential delay between the moment of impact and the processing of the frame, as we were not sure what said latency would look like.

However, during further testing, we found that this issue was quite rare. The camera’s FPS was sufficiently high, and the CV processing latency was small enough that frames typically matched up with the correct impact timing. As a result, we found that exponential weighting was unnecessary for most scenarios. Additionally, the mutexes required to protect the buffer used for the calculation were introducing unnecessary and unwanted latency. In order to simplify the system and improve overall responsiveness, we scrapped the buffer and exponential weighting completely, which led to a noticeable reduction in latency and slightly smoother performance in general.
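For reference, a sketch of the kind of exponentially weighted estimate we removed, where the most recent frames dominate and older entries decay by a factor alpha (the names and decay factor here are illustrative, not our exact implementation):

```python
def weighted_pad_index(history, alpha=0.5):
    """history: most-recent-first pad indices from the frames near an impact."""
    scores = {}
    weight = 1.0
    for idx in history:
        scores[idx] = scores.get(idx, 0.0) + weight
        weight *= alpha                    # older frames decay exponentially
    return max(scores, key=scores.get)     # pad with the highest total weight
```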

Previously, we also found a way to have the user tweak the HSV values themselves using several sliders and a visualizer, and changed one of the drumstick tips from blue to red, so the relevant issues were solved. As a result, I feel the drumstick detection portion of the project is mostly done.

According to our Gantt chart, I should still be working with Elliot and Ben to integrate all of the individual components of our project, so I believe I am on track. Next steps include finalizing preparations for the presentation and continuing to troubleshoot the Bluetooth latency discrepancy between the drumsticks.

Team Status Report for 11/30

This week, our team mainly focused on implementing several validation and verification tests for our project, as well as finalizing our slides for next week’s Final Presentation.

Validation Tests

  • Drumstick Weight: Both drumsticks weigh 147g, well under the 190.4g limit.
  • Minimum Layout: The drum setup achieved a layout area of 1638.3cm², close to (but still under) the required 1644cm².
  • Drum Ring Detection: A 15mm margin was implemented to reduce overlapping issues, successfully scaling drum pad radii.
  • Reliable BLE Connection: At a 3m distance, all impacts were detected with no packet loss.
  • Correct Sound Playback: The system achieved an 89% accuracy for correct drum sound playback. This slightly missed the 90% target and will thus require refinement.
  • Audio Response: Average latency between drum impact and sound playback was 94.61ms, meeting the 100ms limit (despite a notable outlier).

Verification Tests

  • Connective Components: The drumsticks’ wires and tape weighed only 5g per drumstick, far below the 14.45g limit.
  • Latency: Measured latencies include:
    • BLE transmission: 8.8ms (RTT/2) – under 30ms requirement
    • CV processing: 33.2ms per frame – under 60ms requirement
    • Accelerometer processing: 5ms – meeting the 5ms requirement

We were happy with the majority of these results, as they prove that we were indeed able to meet the constraints that we initially placed on ourselves for this project.

We are still facing a few challenges, however: we have found that when both drumsticks are connected via Bluetooth, one often experiences noticeably higher latency than the other. The root cause is unclear and under investigation, so resolving this issue is our next major priority.

Nonetheless, we have incorporated the results of these tests into our presentation slides and will continue working to resolve the Bluetooth latency issue.

Belle’s Status Report for 11/16

This week, I mostly worked with Ben and Elliot to continue integrating & fine-tuning various components of DrumLite to prepare for the Interim Demo happening this upcoming week.

In particular, my main contribution focused on fine-tuning the accelerometer readings. To refine our accelerometer threshold values, we utilized Matplotlib to continuously plot accelerometer data in real-time during testing. In these plots, the x-value represented time, and the y-value represented the average of the x and z components of the accelerometer output. This visualization helped us identify a distinct pattern: each drumstick hit produced a noticeable upward spike, followed by a downward spike in the accelerometer readings (as per the sample output screenshot below, which was created after hitting a machined drumstick on a drum pad four times):

Initially, we attempted to detect these hits by capturing the "high" value, followed by the "low" value. However, upon further analysis, we determined that simply calculating the difference between the two values would be sufficient for reliable detection. To implement this, we introduced a short delay of 1ms between samples, which allowed us to consistently measure the low-high difference. Additionally, we decided to incorporate the sign of the z-component of the accelerometer output rather than taking its absolute value. This helped us better account for behaviors such as upward flicks of the wrist, which were sometimes mistakenly identified as downward drumstick hits (and were therefore incorrectly triggering a drum sound). We were thus able to filter out similar movements that weren't downward drumstick swings onto the drum pad or another solid surface, further refining the precision and reliability of our hit detection logic. A sketch of the resulting logic follows.
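A sketch of the resulting detection logic; the function and threshold names are ours:

```python
import time

def detect_hit(read_accel, neg_threshold, diff_threshold):
    """Two-point hit detection using the signed accelerometer average."""
    a1 = read_accel()                      # signed average of x and z readings
    if a1 > neg_threshold:                 # wait for a sufficiently negative
        return False                       # spike; z's sign rejects up-flicks
    time.sleep(0.001)                      # 1ms delay between the two samples
    a2 = read_accel()
    return (a2 - a1) >= diff_threshold     # low-to-high difference marks a hit
```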

To address lighting inconsistencies from previous tests, we acquired another lamp, ensuring the testing desk is now fully illuminated. This adjustment significantly improved the consistency of our drumstick tip detection, reducing the impact of shadows and uneven lighting. While we are still in the process of testing this 2-lamp setup, I currently believe using a YOLO/SSD model for object detection is unnecessary. These models are great for complex environments with many objects, but our current setup, with (mostly) controlled lighting and focused object tracking, is simple enough not to need them. Implementing a YOLO/SSD model would also introduce significant computational overhead, which we aim to avoid given our sub-100ms-latency use case requirement. Therefore, I would prefer for this to remain a last-resort solution to the lighting issue.

As per our timeline, since we should be fine-tuning/integrating different project components and are essentially done setting the accelerometer threshold values, we are indeed on track. Currently, picking a specific HSV value for each drumstick is a bit cumbersome and unpredictable, especially in areas with a large amount of ambient lighting. Therefore, next week I aim to further test drumstick tip detection under varying lighting conditions and try to simplify the aforementioned process, as I believe it is the least solid aspect of our implementation at the moment.

Elliot’s Status Report for 11/16

This week I mainly worked towards incorporating the second BLE drumstick into the system controller and fine-tuning our accelerometer thresholds upon integration with the rest of the code. After wiring the second ESP and accelerometer, I used serial output to find the new MAC address, and I split the laptop's connection functionality across separate threads for each of the boards. Once I was able to reliably connect to both of the sticks, I met with my group to test the new configuration.

At that point, Ben had already spent some time deriving the required behavior for our MPU6050 output waveforms, but we hadn't yet achieved an appropriate method for distinguishing hits on the table from drumsticks swung in the air. After testing across various surfaces and static thresholds, I noticed that taking the magnitude of our data readings was not sufficient for identifying hit patterns: the sequence of high-magnitude accelerometer output to low-magnitude was too general to characterize a rebound off the playing surface, and instead triggered a high frequency of false readings while idle. I modified the notification handlers in my Bluetooth code to attach the sign of the z-axis to our incoming data, thereby allowing us to identify upward and downward swings independently, and using the graph from our prior testing I was able to set up a sequence for detecting valid spikes. By polling for a sufficiently negative reading, blocking with a delay, and then taking a subsequent reading, we were able to correctly identify downward hits given the two-point difference.

One issue we found, however, was that while the two sticks operated well separately, one of them would suffer from a noticeable output delay when playing simultaneously. An important point to consider is that for any run, whichever ESP connected to the laptop second was the one that experienced the performance drop. This problem could be a result of the Bluetooth connection or a flaw in our threading architecture; I am hesitant to blame our BLE connection interval simply because both microcontrollers run the same firmware code, but I plan to resolve this issue within the next few days. Overall, I believe my progress and our team's progress is on track, and this upcoming week I plan to meet with the team to flesh out the system's HSV/lighting concerns as well as continue to test the CV module extensively. A sketch of the per-stick connection threading is included below.
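A sketch of the per-stick connection threading, with placeholder MAC addresses and a bleak-style client (our actual module differs in details):

```python
import asyncio
import threading
from bleak import BleakClient

# MAC addresses and the characteristic UUID are placeholders.
STICKS = {"green": "AA:BB:CC:DD:EE:01", "blue": "AA:BB:CC:DD:EE:02"}

async def run_stick(mac, char_uuid, handler):
    async with BleakClient(mac) as client:
        await client.start_notify(char_uuid, handler)
        while client.is_connected:         # keep the loop alive for callbacks
            await asyncio.sleep(1)

def launch(mac, char_uuid, handler):
    """One dedicated thread (and asyncio loop) per drumstick."""
    t = threading.Thread(
        target=lambda: asyncio.run(run_stick(mac, char_uuid, handler)),
        daemon=True)
    t.start()
    return t
```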