Belle’s Status Report for 12/7

This week was mainly spent working on our final poster and helping to integrate Elliot’s changes to the Bluetooth code into the current codebase (my final presentation on Monday went quite well, but I claimed multiple times that Elliot’s changes would fix most, if not all, of our dual-stick lag issues, so getting them merged was a priority). I also created a couple of graphics for the poster and cleaned it up overall, referencing posters that did well in previous years so that our upcoming presentations can go as smoothly as possible.

Since we are still supposed to be integrating all of our components and making final changes/tweaks to our system, I believe we are on track. Next week, I hope to finish the final poster and video, and practice my presentation skills for both the TechSpark Expo and the 18500 showcase.

Ben Solo’s Status Report for 12/7

Following our final presentation on Monday, I spent this week working on integrations, making minor fixes across the system, and working with Elliot to resolve our lag issues when using two drumsticks. I also updated our frontend to reflect the way our system is actually intended to be used. Lastly, I made a modification to the object detection code that prevents swinging motions outside of the drum pads from triggering an audio event. I’ll go over my contributions this week individually below:
1.) I implemented a sorting system that correctly maps the detected drum pads to the corresponding sounds via the detected radius. The webapp allows the user to configure their sounds based on the diameter of the drum pads, so it’s important that when the drum pads are detected at the start of a playing session, they are correctly assigned to the loaded sounds, instead of being paired based on the arbitrary order in which they are encountered. This entailed writing a helper function that scales and orders the drums by diameter, organizing them in ascending order of radius; a sketch of the idea is below.
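
Roughly, the helper works like the following sketch (the names and the (x, y, r) pad representation are illustrative, not our exact code):

def order_pads_by_radius(detected_pads, margin_scale=1.0):
    # detected_pads: list of (x, y, radius) circles from pad detection.
    # Returns pads sorted smallest to largest, so that pad index i
    # always corresponds to the sound the user loaded for size i.
    scaled = [(x, y, r * margin_scale) for (x, y, r) in detected_pads]
    return sorted(scaled, key=lambda pad: pad[2])

# Example: pads detected in arbitrary order become indices 0..3 by size.
pads = [(320, 240, 55.0), (100, 200, 30.0), (500, 150, 80.0), (250, 400, 42.0)]
ordered = order_pads_by_radius(pads)  # smallest radius first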

2.) Elliot and I spent a considerable amount of time on Wednesday trying to resolve our two-stick lag issues. He had written new code to be flashed to the drumsticks that should have eliminated the lag by fixing an error in our initial implementation, where we were at times sending more data than the connection window could handle. E.g., we were trying to send readings every millisecond but only had a window of 20ms, which resulted in received notifications being queued and processed after some delay. We had some luck with this new implementation, but after testing the system multiple times, we realized that performance seemed to deteriorate over time. We are now under the impression that this is a result of diminishing power in our battery cells, and are planning to test with fresh batteries to see if that resolves the issue. Furthermore, during these tests I realized that our range for what constitutes an impact was too narrow, which often resulted in a perceived lag because audio wasn’t being triggered following an impact. To resolve this, we tested a few new values, settling on a range of 0.3 m/s^2 to 20 m/s^2 as a valid impact (sketched below).
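
As a rough illustration of the widened window (the helper name is hypothetical; only the 0.3–20 m/s^2 range comes from our testing):

IMPACT_MIN = 0.3   # m/s^2, readings below this are treated as noise
IMPACT_MAX = 20.0  # m/s^2, readings above this are treated as spurious spikes

def is_valid_impact(magnitude):
    # magnitude is the processed accelerometer reading for a candidate hit
    return IMPACT_MIN <= magnitude <= IMPACT_MAX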

3.) Our frontend had a bunch of relics from our initial design which needed to be either removed or reworked. The two main issues were that the drum pads were being differentiated by color rather than radius, and that we still had a button saying “identify my drum set” on the webapp. The first issue was resolved by changing the radii of the circles representing the drum pads on the website and ordering them to correspond with how the API endpoint orders sounds and detected drum pads at the start of the session. The second issue was easily resolved by removing the button that triggers the endpoint for starting the drum pad detection script. The code housed in this endpoint is still used, but instead of being triggered via the webapp, the system now runs the script as soon as a playing session starts. I thought this design made more sense and made the experience of using our system much simpler by eliminating the need to switch back and forth between the controller system and the webapp during a playing session. Below is the current updated frontend design:

4.) Prior to this week, our object tracking system had a major flaw which had gone unnoticed: when the script identified that the drumstick tip was not within the bounds of one of the drum rings, it simply continued to the next iteration of the frame processing loop without updating our green/blue drumstick location prediction variables. This resulted in the following behavior:
a.) impact occurs in drum pad 3.
b.) prediction variable is updated to index 3 and sound for drum pad 3 plays.
c.) impact occurs outside of any drum pad.
d.) the prediction variable is not updated with a -1 value, so the last known drum pad’s corresponding sound plays, i.e. sound 3.
This is an obvious flaw that causes the system to register invalid impacts as valid and play the previously triggered sound. To remedy this, I changed the CV script to update the prediction variables with the -1 (not in a drum pad) value, and updated the sound player module to only play a sound if the index provided is in the range [0,3] (i.e. one of the valid drum pad indices). Our system now only plays a sound when the user hits one of the drum pads, playing nothing if the drumstick is swung or strikes outside of a drum pad. A simplified sketch of the two changes is below.
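
(Names and the circle-containment check are illustrative, not our exact code:)

NOT_IN_PAD = -1

def update_prediction(tip, pads):
    # tip is (x, y); pads is a list of (cx, cy, r) circles
    tx, ty = tip
    for index, (cx, cy, r) in enumerate(pads):
        if (tx - cx) ** 2 + (ty - cy) ** 2 <= r ** 2:
            return index
    return NOT_IN_PAD  # previously this update was skipped entirely

def play_for_index(index, sounds):
    # only indices 0..3 are valid drum pads; -1 means "outside all pads"
    if 0 <= index <= 3:
        sounds[index].play()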

I also started working on our poster, which needs to be ready a little earlier than anticipated given our invitation to the TechSpark Expo. This entails collecting all our graphs/test metrics and figuring out how to present them in a way that conveys meaning to the audience given our use case. For instance, I needed to figure out how to explain that a graph showing 20 accelerometer spikes verifies our BLE reliability within a 3m radius of the receiving laptop.

Overall, I am on schedule and plan on spending the coming weekend/week refining our system and working on the poster, video, and final report. I expect to have a final version of the poster done by Monday so that we can get it printed in time for the Expo on Wednesday.

Team Status Report for 12/7

In the past week our team made significant progress towards completing DrumLite. As was previously the case, the one issue we are still encountering, which at this point is the only real threat to the success of our project, is laggy performance when using two drumsticks simultaneously. Elliot has written and flashed new firmware to the ESP32s which we very much hope will resolve the issue. We plan on testing and adjusting our implementation this weekend in preparation for the TechSpark Engineering Expo on Wednesday. Our progress is on schedule, and in the coming days we aim to complete our entire integrated system, construct our poster, and start the demo video so as to allow sufficient time around demo day to prepare and write the final report. Additionally, we will conduct further tests highlighting tradeoffs we made for our system, which will be included in our poster and final report.

The unit and system tests we’ve conducted so far are as follows:
1.) Ensure that the correct sound is triggered 90% of the time. This test was conducted by running our full system (CV, pad location, BLE, and audio playback) and performing hits on the 4 drum pads sequentially in the order 1, 2, 3, 4. For each impact, the expected audio and actual audio were recorded. Of the 100 samples taken, 89 impacts triggered the correct sound. While this is fairly close to the 90% goal, it led us to realize that we needed a tool allowing the user to select a specific HSV color range to accommodate varying ambient lighting. Our new design prompts the user to tune the HSV range using HSV sliders and a live video feed of the color mask being applied at the start of each session.

2.) Verify that the overall system latency is below 100ms. This test was conducted by recording a video of a series of hits, then analyzing the video and audio recordings to determine when an impact occurred and when the sound was played. The difference between these two timestamps was recorded for each impact, and the average was found to be 94ms. This average is, however, inflated by one outlier in the data which had a latency of 360ms. Through further testing we realized that this was a result of our BLE module queueing data when the system was overloaded, and thus triggering an audio event only once the impact was read from the queue. Elliot thus began implementing new firmware for the microcontrollers that reduces the connection interval and prevents the queuing of events in most cases.

3.) Verify that the latency of the CV module is less than 60ms per frame. This test was conducted by timing each iteration of our CV loop, each of which processes a single input frame. The per-frame times were plotted and averaged, resulting in a mean latency of 33.2ms per frame.
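
The timing was done roughly like the sketch below (the per-frame handler and frame source are stand-ins for our real CV loop and webcam capture):

import time

def time_cv_loop(process_frame, frames):
    # Times one full CV iteration per frame; returns the mean latency in ms.
    latencies_ms = []
    for frame in frames:
        start = time.perf_counter()
        process_frame(frame)
        latencies_ms.append((time.perf_counter() - start) * 1000.0)
    return sum(latencies_ms) / len(latencies_ms)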

4.) Verify BLE reliability within a 3m range of the user’s laptop. This test was conducted by recording accelerometer data for 20 impacts per drumstick at a distance of 3m from the laptop. By visually inspecting the graph of the accelerometer data, we were able to see that each of the 20 impacts was clearly identifiable and no data was lost.

5.) Verify that the accelerometer read + data processing delay is sub-5ms. To conduct this test, we placed two timers in our code: one just before the accelerometer data was read, and the other just before an audio event was triggered. About 100 readings were recorded, plotted, and averaged, resulting in an average delay of 5ms per impact.

6.) Ensure less than 2% BLE packet loss. This test was conducted by creating counters which track packets sent and packets received for each of the two drumsticks. By verifying that after execution both counters match (or are within 2% of each other), we were able to ascertain a packet loss rate of ~2%. The check itself is simple, as sketched below.
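
(Counter values here are placeholders; sent comes from a firmware-side counter and received from the host-side notification callback:)

def packet_loss_rate(sent, received):
    # fraction of packets that never arrived at the host
    return 0.0 if sent == 0 else (sent - received) / sent

# e.g. 1000 packets sent, 981 received -> 1.9% loss, within the 2% target
assert packet_loss_rate(1000, 981) <= 0.02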

7.) Verify BLE round-trip latency below 30ms. To conduct this test we started a timer on the host laptop, sent a notification to the ESP32s, and measured the time at which the response was received back at the host. By averaging these recordings, we determined an average RTT of 17.4ms, meaning the one-way latency is about 8.7ms. A sketch of the measurement loop is below.
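
(The send/wait calls are stand-ins for our actual BLE write and the wait for the ESP32’s reply notification:)

import time

def measure_mean_rtt(send_ping, wait_for_response, trials=50):
    # Returns mean round-trip time in ms over the given number of trials.
    rtts_ms = []
    for _ in range(trials):
        start = time.perf_counter()
        send_ping()
        wait_for_response()
        rtts_ms.append((time.perf_counter() - start) * 1000.0)
    return sum(rtts_ms) / len(rtts_ms)

# one-way latency is then approximated as the mean RTT / 2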

8.) Weight requirements (drumsticks combined less than 190.4g and connective components below 14.45g). To verify these requirements we simply weighed the connective components (i.e. wires and athletic tape) as well as the entire drumsticks. The two drumsticks weighed a collective 147g and the connective components weighed 5.1g, both well below our requirements.

9.) Size requirements (minimum achievable layout area of 1644cm^2). To verify this, we configured the 4 drum pads in their minimum layout (all tangent to one another) and measured the height and width at their maximum dimensions. As expected, the layout measured 1644cm^2.

Overall we are making good progress, are on schedule, and hope to complete our project this weekend so we can take our time in preparation for the Expo on Wednesday and get a head start on some of the final project submissions in the coming week.

Ben Solo’s Status Report for 11/30

Over the last week(s), I’ve focused my time on both creating a tool for dynamically selecting/tuning precise HSV values and conducting tests on our existing system in anticipation of the final presentation and the final report. My progress is still on schedule with the Gantt chart we initially set for ourselves. Below I will discuss the HSV tuning tool and the tests I conducted in more detail.
As we were developing our system and testing the integration of our CV module, BLE module, and audio playback module, we quickly realized that finding uniform, consistent lighting was pretty much unattainable without massively constraining the dimensions to which the drum set could be scaled. In other words, if we used a set of fixed lights to keep lighting consistent, it would a.) affect how portable the drum set is and b.) limit how big you could make the drum set, since uniform lighting is lost the further the drum pads move from the light source. Because the lighting massively impacts the HSV values used to filter and detect the drumstick tips in each frame, I decided we needed a tool that allows the user to dynamically change the HSV values at the start of the session so that a correct range can be chosen for each unique lighting environment.

When our system controller (which starts and monitors the BLE, CV, and audio threads) is started, it initially detects the rings’ locations, scales their radii up by 15mm, and shows the result to the user so they can verify the rings were correctly detected. Thereafter, two tuning scripts start, one for blue and one for green. In each case, two windows pop up on the user’s screen: one with 6 sliding toolbar selectors and another with a live video feed from the webcam with the applied blue or green mask over it. Ideally, when the corresponding blue or green drumstick is held under the camera, only the colored tip of the drumstick should be highlighted. However, since lighting varies a lot, the user now has the ability to alter the HSV range in real time and see how this affects the filtering. Once they find a range that accurately detects the drumstick tips and nothing else, the user hits enter to close the tuning window and save those HSV values for the playing session. Below is a gif of the filtered live feed window. It shows how initially the HSV range is not tuned precisely enough to detect just the tip of the drumstick, but how eventually, when the correct range is selected, only the moving tip of the drumstick is highlighted.
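
The core of the tuning loop looks something like this sketch, using OpenCV trackbars (window and slider names are illustrative, and the real tool additionally hands the saved range off to the session):

import cv2
import numpy as np

def tune_hsv(camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    cv2.namedWindow("sliders")
    for name, maximum in [("H lo", 179), ("S lo", 255), ("V lo", 255),
                          ("H hi", 179), ("S hi", 255), ("V hi", 255)]:
        # "lo" sliders start at 0, "hi" sliders start fully open
        cv2.createTrackbar(name, "sliders",
                           0 if name.endswith("lo") else maximum,
                           maximum, lambda _: None)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        lower = np.array([cv2.getTrackbarPos(n, "sliders") for n in ("H lo", "S lo", "V lo")])
        upper = np.array([cv2.getTrackbarPos(n, "sliders") for n in ("H hi", "S hi", "V hi")])
        mask = cv2.inRange(hsv, lower, upper)
        # show the live feed with the current mask applied
        cv2.imshow("masked feed", cv2.bitwise_and(frame, frame, mask=mask))
        if cv2.waitKey(1) == 13:  # Enter saves the range for the session
            break
    cap.release()
    cv2.destroyAllWindows()
    return lower, upper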

After building this tool, I conducted a series of verification and validation tests on parts of the system, which are outlined below:

1.) To test that the correct sound was being triggered 90% of the time, I ran our whole system and played drums 1, 2, 3, 4 in that order repeatedly, 12 impacts at a time, eventually collecting 100 samples. For each impact, I recorded which sound was played. I then found the percentage of impacts for which the drum pad hit corresponded correctly to the sound played, and found this value to be 89%.

2.) To verify that the overall system latency was below 100ms, I recorded a video of myself hitting various drum pads repeatedly. I then loaded the video into a video editor and split the audio from the video. I could then identify the time at which each impact occurred by analyzing the video, and identify when the audio was played by finding the corresponding spike in the audio track. I recorded the difference between impact time and playback time for each impact and found an average overall system latency of 94ms. While this is below the threshold we set out for, most impacts actually have a far lower latency; the data was skewed by one recording which had ~360ms of latency.

3.) To verify that our CV module was running in less than 60ms per frame, I used matplotlib to graph the processing time for each frame and found the average value to be 33.2ms per frame. The graph is depicted below. 

I conducted several other more trivial tests, such as finding the weight of the drumsticks, the minimum layout dimensions, and verifying that the system can be used reliably within a 3m range of the laptop, all of which yielded expected results as outlined in our use case and design requirements.

In response to the question of what new tools or knowledge I’ve had to learn to progress through our capstone project, I’d say that the two technologies I had to learn about and implement were CV (via OpenCV and skimage) and audio playback streaming (via PyAudio). I had never worked with either of them before, so it definitely took a lot of learning before I was able to write any strong, working code. For CV, I probably learned the most from reading other people’s initial CV code (especially Belle’s). Her code used all the techniques I needed for building my HSV range selection tool as well as the module I wrote for initially detecting the locations and radii of the drum pads. I read through her code, as well as various forums such as Stack Overflow whenever I encountered issues, and was able to learn all I needed to implement both of these modules. In the case of audio playback streaming, I learned mostly through trial and error and reading Stack Overflow. I probably went through 6 iterations of the playback streaming module before I found a solution with low enough playback latency. Because many other applications, such as drum machines or electronic synthesizers, encounter the same issues I was facing when trying to develop an efficient playback module, there was a large amount of information online, whether regarding PyAudio streaming or general concepts in low-latency audio playback (such as preloading audio frames, or initializing audio streams prior to the session).
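
As a rough illustration of those two ideas, preloading decoded frames and opening the output stream before the session starts, here is a minimal PyAudio sketch; the file name is a placeholder and this is far simpler than the final module:

import pyaudio
import wave

p = pyaudio.PyAudio()

def preload(path):
    # decode the whole file up front so nothing is read from disk mid-session
    with wave.open(path, "rb") as wf:
        return wf.getparams(), wf.readframes(wf.getnframes())

params, frames = preload("snare.wav")  # placeholder sample

# open the output stream once, before the playing session starts
stream = p.open(format=p.get_format_from_width(params.sampwidth),
                channels=params.nchannels,
                rate=params.framerate,
                output=True)

def on_hit():
    stream.write(frames)  # hot path: just write the preloaded frames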

In the coming week, the most pressing goal is to determine why playing with two connected drumsticks at once results in such a drop in performance. Once we figure out what the issue is, I hope to spend my time implementing a system that can reliably handle two drumsticks at once. Additionally, I hope to start working on either the poster or the video so as to alleviate some stress in the coming two weeks before capstone ends.

Belle’s Status Report for 11/30

This week, I focused on preparing the slides for the final presentation – incorporating the results from our verification and validation tests – and contributed to the drumstick detection portion of our project.

The former involved organizing and presenting the data in a way that highlights how our project meets its use case and design requirements, as well as practicing the general flow of how I would present the relevant tests (since there are many of them and not much time is allotted for each presentation, I have to be concise).

As for the drumstick detection, one key aspect of our design was the use of exponential weighting to account for latency when the video frame taken at the moment of an accelerometer impact did not reflect the correct position of the drumstick tip (i.e., it would show the drumstick tip as being in the previous drum’s boundary, rather than the drum that was actually hit). This was particularly a concern because of the potential delay between the moment of impact and the processing of the frame, as we were not sure what said latency would look like.

However, during further testing, we found that this issue was quite rare. The camera’s FPS was sufficiently high, and the CV processing latency was small enough that frames typically matched up with the correct impact timing. As a result, we found that exponential weighting was unnecessary for most scenarios. Additionally, the mutexes required to protect the buffer used for the calculation were introducing unnecessary and unwanted latency. In order to simplify the system and improve overall responsiveness, we scrapped the buffer and exponential weighting completely, which led to a noticeable reduction in latency and slightly smoother performance in general.

Previously, we also found a way to have the user tweak the HSV values themselves using several sliders and a visualizer, and changed one of the drumstick tips from blue to red, so the relevant issues were solved. As a result, I feel that the drumstick detection portion of the project is mostly done.

According to our Gantt chart, I should still be working with Elliot and Ben to integrate all of the individual components of our project, so I believe I am on track. Next steps therefore include finalizing preparations for the presentation and continuing to troubleshoot the Bluetooth latency discrepancy between the drumsticks.

Team Status Report for 11/30

This week, our team mainly focused on implementing several validation and verification tests for our project, as well as finalizing our slides for next week’s Final Presentation.

Validation Tests

  • Drumstick Weight: Both drumsticks weigh 147g, well under the 190.4g limit.
  • Minimum Layout: The drum setup achieved a layout area of 1638.3cm², close to (but still under) the required 1644cm².
  • Drum Ring Detection: A 15mm margin was implemented to reduce overlapping issues, successfully scaling drum pad radii.
  • Reliable BLE Connection: At a 3m distance, all impacts were detected with no packet loss.
  • Correct Sound Playback: The system achieved an 89% accuracy for correct drum sound playback. This slightly missed the 90% target and will thus require refinement.
  • Audio Response: Average latency between drum impact and sound playback was 94.61ms, meeting the 100ms limit (despite a notable outlier).

Verification Tests

  • Connective Components: The drumsticks’ wires and tape weighed only 5g per drumstick, far below the 14.45g limit.
  • Latency: Measured latencies include:
    • BLE transmission: 0.088ms (RTT/2) – under 30ms requirement
    • CV processing: 33.2ms per frame – under 60ms requirement
    • Accelerometer processing: 5ms – meeting the 5ms requirement

We were happy with the majority of these results, as they prove that we were indeed able to meet the constraints that we initially placed on ourselves for this project.

We are still facing a few challenges, however: we have found that when both drumsticks are connected via Bluetooth, one often experiences noticeably higher latency than the other. The root cause is unclear and under investigation, so resolving this issue is our current major priority.

Nonetheless, we have incorporated the results of these tests into our presentation slides and will continue working to resolve the Bluetooth latency issue.

Belle’s Status Report for 11/16

This week, I mostly worked with Ben and Elliot to continue integrating & fine-tuning various components of DrumLite to prepare for the Interim Demo happening this upcoming week.

In particular, my main contribution focused on fine-tuning the accelerometer readings. To refine our accelerometer threshold values, we utilized Matplotlib to continuously plot accelerometer data in real-time during testing. In these plots, the x-value represented time, and the y-value represented the average of the x and z components of the accelerometer output. This visualization helped us identify a distinct pattern: each drumstick hit produced a noticeable upward spike, followed by a downward spike in the accelerometer readings (as per the sample output screenshot below, which was created after hitting a machined drumstick on a drum pad four times):

Initially, we attempted to detect these hits by capturing the “high” value followed by the “low” value. However, upon further analysis, we determined that simply calculating the difference between the two values was sufficient for reliable detection. To implement this, we introduced a short delay of 1ms between samples, which allowed us to consistently measure the low-high difference. Additionally, we decided to incorporate the sign of the z-component of the accelerometer output rather than taking its absolute value. This helped us better account for behaviors such as upward flicks of the wrist, which were sometimes mistakenly identified as downward drumstick hits (and were therefore incorrectly triggering a drum sound to be played). We were thus able to filter out similar movements that weren’t downward drumstick strikes onto the drum pad or a solid surface, further refining the precision and reliability of our hit detection logic. An illustrative sketch of this logic is below.
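
(The threshold value is a placeholder, and the sign convention depends on how the accelerometer is mounted, so treat the specifics as illustrative:)

import time

DIFF_THRESHOLD = 2.0  # placeholder, tuned empirically in our tests

def detect_hits(read_accel, on_hit):
    # read_accel returns (x, y, z) in m/s^2; on_hit triggers playback.
    # z keeps its sign so upward wrist flicks (z in the "wrong" direction
    # for a downward strike) are filtered out.
    prev = None
    while True:
        x, _, z = read_accel()
        value = (x + z) / 2.0            # average of x and z components
        if prev is not None and z < 0 and (prev - value) > DIFF_THRESHOLD:
            on_hit()
        prev = value
        time.sleep(0.001)                # 1ms between samples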

To address lighting inconsistencies from previous tests, we acquired another lamp, ensuring the testing desk is now fully illuminated. This adjustment should significantly improve the consistency of our drumstick tip detection, reducing the impact of shadows and uneven lighting. While we are still in the process of testing this 2-lamp setup, I currently believe using a YOLO/SSD model for object detection is unnecessary. These models are great for complex environments with many objects, but the simplicity of our current setup, with (mostly) controlled lighting and focused object tracking, is key. Also, implementing a YOLO/SSD model would introduce significant computational overhead, which we aim to avoid given our sub-100ms latency use case requirement. Therefore, I would prefer for this to remain a last-resort solution to the lighting issue.

As per our timeline, since we should be fine-tuning/integrating different project components and are essentially done setting the accelerometer threshold values, we are on track. Currently, picking an HSV value for each drumstick is a bit cumbersome and unpredictable, especially in areas with a large amount of ambient lighting. Therefore, next week I aim to further test drumstick tip detection under varying lighting conditions and try to simplify the aforementioned process, as I believe it is the least solid aspect of our implementation at the moment.

Ben Solo’s Status Report for 11/16

This week I worked on determining better threshold values for what qualifies as an impact. Prior to this week we had a functional system where the accelerometer/ESP32 system would successfully relay x, y, z acceleration data to the user’s laptop and trigger a sound to play. However, we would essentially treat any acceleration as an impact and thus trigger a sound. We configured our accelerometer to align its Y-axis parallel to the drumstick, the X-axis parallel to the floor and perpendicular to the drumstick (left-right motion), and the Z-axis perpendicular to the floor and perpendicular to the drumstick (up-down motion). This is shown in the image below:

Thus, we have two possible ways to retrieve relevant accelerometer data: 1.) reading just the Z-axis acceleration, and 2.) reading the average of the X and Z axis accelerations, since these are the two axes with relevant motion. So on top of finding better threshold values for what constitutes an impact, I needed to determine which axes to use when reading and storing the acceleration. To determine both of these factors, I mounted the accelerometer/ESP32 system to the drumstick as shown in the image above and ran two test sequences. In the first test I used just the Z-axis acceleration values, and in the second I used the average of the X and Z-axis accelerations. For each test sequence, I recorded 25 clear impacts on a rubber pad on my table. Before starting the tests, I performed a few sample impacts so I could see what the readings for an impact resembled. I noted that when an impact occurs (the stick hits the pad), the acceleration reading for that moment is relatively low. So for the tests’ purposes, an impact was identifiable by a long chain of constant accelerometer data (just holding the stick in the air), followed by an increase in acceleration, followed by a low acceleration reading.
Once I established this, I started collecting samples, first for the test using just the Z-axis. I performed a series of hits, recording the 25 readings where I could clearly discern from the output data stream that an impact had occurred. Cases where an impact was not clearly identifiable from the data were discarded and that sample was repeated. For each sample, I stored the acceleration at the impact, the acceleration just prior to the impact, and the difference between the two (i.e. A(t-1) – A(t)). I then determined the mean and standard deviation of the acceleration, the prior acceleration, and the difference between the two readings. Additionally, I calculated what I termed the upper bound (mean + 1 StdDev) and the lower bound (mean – 1 StdDev). The values are displayed below:

I repeated the same process for the second test, but this time using the average of the X and Z-axis acceleration. The results for this test sequence are shown below:

As you can see, almost without exception, the values calculated from just the Z-axis acceleration are higher than when using the X,Z average. To then determine the best threshold values to use, I proceeded with 4 more tests:
1.) I set the system to use just the Z-axis acceleration and detected an impact with the following condition:

if 1.072 < accel < 3.151: play_sound()

Here I was conditioning only on the current acceleration reading to detect an impact, using the upper and lower bounds for Acceleration(t).

2.) I set the system to use just the Z-axis acceleration and detected an impact with the following condition:

if 2.105 < (prior_accel - accel) < 5.249: play_sound()

Here, I was conditioning on the difference between the previous acceleration reading and the current one, using the upper and lower bounds for Ax(t-1) – Ax(t).

3.) I set the system to use the average of the X and Z-axis accelerations and detected an impact with the following condition:

if 1.031 < accel < 2.401: play_sound()

Here I was conditioning on just the current acceleration reading, using the upper and lower bounds for Acceleration(t).

4.) I set the system to use the average of the X and Z-axis accelerations and detected an impact with the following condition:

if 2.105 < (prior_accel - accel) < 5.249: play_sound()

Here I conditioned on the difference between the prior acceleration and the current acceleration, using the upper and lower bounds for Ax(t-1) – Ax(t).

After testing each configuration out, I determined two things:
1.) The thresholds taken using the average of the X and Z-axis accelerations resulted in more correct impact detections than just using the Z-axis acceleration, regardless of whether conditioning on (prior_Accel – Accel) or just Accel.

2.) Using the difference between the previous acceleration reading and the current one resulted in better detection of impacts.

Thus, the thresholds we are now using are defined by the upper and lower bounds of the difference between the prior acceleration and the current acceleration (Ax(t-1) – Ax(t)), i.e. the condition listed in test 4.) above. A sketch of how these bounds are derived is below.
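
(The sample values are placeholders, not our recorded data; the real list had 25 impacts:)

from statistics import mean, stdev

diffs = [3.9, 4.1, 3.2, 3.6, 4.4, 2.8, 3.3]  # placeholder A(t-1) - A(t) samples

lower = mean(diffs) - stdev(diffs)   # "lower bound" = mean - 1 StdDev
upper = mean(diffs) + stdev(diffs)   # "upper bound" = mean + 1 StdDev

def is_impact(prior_accel, accel):
    return lower < (prior_accel - accel) < upper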

While this system is now far better at not playing sounds when the user is just moving the drumstick about in the air, it still needs further tuning to detect impacts. From my experience testing the system, about 75% of the instances when I hit the drum pad correctly register as impacts, while ~25% do not. We will need to conduct further tests, as well as some trial and error, to find thresholds that more accurately detect impacts.

My progress is on schedule this week. In the coming week, I want to focus on getting better threshold values for the accelerometer so impacts are detected more accurately. Additionally, I want to work on refining the integration of all of our systems. As I said last week, we now have a functional system for one drumstick. However, we recently constructed the second drumstick and need to make sure that the additional three threads that need to run concurrently (one controller thread, one BLE thread, and one CV thread) work correctly and do not interfere with one another. Making sure this process goes smoothly will be a top priority in the coming weeks.

Belle’s Status Report for 11/9

This week, we mainly focused on integrating the different components of our project to prepare for the Interim Demo, which is coming up soon.

We first successfully integrated Elliot’s Bluetooth/accelerometer code into the main codebase. The evidence of this success was an audio response (a drum beat/sound) being triggered by making a hit motion with the accelerometer and ESP32 in hand.

We then aimed to integrate my drumstick tip detection code, which was a bit more of a challenge. The main issue concerned picking the correct HSV/RGB color values with respect to lighting and the shape of the drumstick tip. We positioned the drumstick tip (which we colored bright red) on the desk, in view of the webcam, and took a screenshot of the output. I then took this image and used an HSV color picker website to get HSV values for specific pixels in the screenshot. However, because of its rounded, oval-like shape, we had to consider multiple shadow, highlight, and mid-tone values. Picking a pixel that was too light or too dark would cause the drumstick tip to only be “seen” sometimes, or cause too many things to be “seen”. For example, sometimes the red undertones in my skin would be tracked along with the drumstick tip, or the tip would only be visible in the more brightly lit areas of the table.

In order to remedy this issue, we are experimenting with lighting to find an ideal setup. Currently we are using a flexible lamp that clamps onto the desk that the drum pads are laid on, but it only properly illuminates half of the desk. Thus, we put in an order for another lamp so that both halves of the desk can be properly lit, which should make the lighting more consistent.

As per our Gantt chart, we are supposed to be configuring accelerometer thresholds and integrating all of our code at the moment, so we are on track. Next week, I plan to look into other object/color tracking approaches such as CamShift, background subtraction, or even YOLO/SSD models in case the lighting situation becomes overly complicated. I also would like to work on fine-tuning the accelerometer threshold values, as we are currently just holding the accelerometer and making a hit-like motion rather than strapping it to a drumstick and hitting the table.

Team Status Report for 11/9

This week we made big strides towards the completion of our project. We incrementally combined the various components each of us had built into one unified system that operates as defined in our use case for one drumstick: when the drumstick hits a given pad, it triggers an impact event and plays the sound corresponding to that drum pad with very low latency. Note, however, that this is currently only implemented for one drumstick, not both; that will be the coming week’s goal. The biggest risk we identified, which we had not anticipated, was how much variation in lighting affects the ability of the object tracking module to identify the red-colored drumstick tip. By trying out different light intensities (no light, overhead beam light, phone lights, etc.) we determined that without consistent lighting the system would not operate. During our testing, every time the light changed, we had to capture an image of the drumstick tip, find its corresponding HSV value, and update the filter in our code before actually trying the system out. If we are unable to find a way to provide consistent lighting given any amount of ambient lighting, this will severely impact how usable this project is. The current plan is to purchase two very bright clip-on lamps that can be oriented and positioned to distribute light equally over all 4 drum rings. If this doesn’t work, our backup plan is to line each drum pad with LED strips so each has consistent light regardless of its position relative to the camera. The backup is less favorable because it would require that either batteries be attached to each drum pad, or that each drum pad be close enough to an outlet to be plugged in, which deteriorates the portability and versatility goals defined in our use case.

The second risk we identified was the potential for packet interference when transmitting from two ESP32s simultaneously. There is a chance that when we try to use two drumsticks, both transmitting accelerometer data simultaneously, the transmissions will interfere with one another, resulting in packet loss. The backup plan for this is to switch to Wi-Fi, but this would require serious overhead work to implement. Our hope is that since impacts from two drumsticks usually occur sequentially, the two shouldn’t interfere, but we’ll have to see what actual operation looks like this week to be sure.

The following are some basic changes to the design of DrumLite we made this week:
1.) We are no longer using rubber rings and are instead using circular rubber pads. The reason is as follows: when we detect the drum pads’ locations and radii using rings, there are two circles that could potentially be detected, the outer circle and the inner circle. Since there is no good way to tell the system which one to choose, we decided to switch to a solid drum pad, where only one circle can ever be detected. Additionally, this also makes choosing a threshold acceleration much easier, since the surface being hit is now uniform. This system works very well and the detection output is shown below:

2.) We decided to continually record the predicted drum ring which the drumstick tip is in throughout the playing session. This way, when an impact occurs, we don’t actually have to do any CV on the spot and can instead just perform exponential weighting on the recent predicted drum rings to determine which pad was hit. A sketch of the idea is below.
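
(The decay factor and history length are illustrative, not our tuned values:)

from collections import deque, defaultdict

DECAY = 0.5                 # illustrative decay factor
history = deque(maxlen=8)   # most recent per-frame ring predictions

def record_prediction(ring_index):
    # called every frame with the ring the tip is inside (-1 if none)
    history.append(ring_index)

def resolve_impact():
    # on impact, weight recent predictions so newer frames count more
    scores = defaultdict(float)
    weight = 1.0
    for ring in reversed(history):   # newest frame first
        scores[ring] += weight
        weight *= DECAY
    return max(scores, key=scores.get) if scores else -1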

We are on schedule and hope to continue at this healthy pace throughout the rest of the semester. Below is an image of the whole setup so far: