Shravya’s Status Report for 12/07/2024

This week, significant effort was spent implementing UART communication between the STM32 microcontroller and an external system, such as a Python script or serial terminal, to control solenoids. The primary goal was to parse MIDI commands from the Python script, transmit them via UART, and actuate the corresponding solenoids based on the received commands.

  1. Description of Code Implementation:
    • The UART receive interrupt (HAL_UART_RxCpltCallback) was set up in main.c to handle incoming data byte by byte and process commands upon receiving a newline character (\n).
    • Functions for processing UART commands (process_uart_command) and actuating solenoids (activate_solenoid and activate_chord) were written and tested.
    • The _write() function was implemented to redirect printf output over UART for debugging purposes.
  2. Testing UART Communication:
    • Python script (send_uart_data) confirmed successful transmission of parsed MIDI commands (see screenshot; a minimal sketch of this sending pattern appears after this list). This implies my computer is sending the data correctly but the STM32 is not receiving it properly.
    • Minicom and other terminal tools were used to try to verify whether UART data was received on the STM32 side. This did not work because I cannot monitor ("occupy") the port without blocking the data being sent; sending data on a port and monitoring it appear to be mutually exclusive. That makes sense, but I saw online that monitoring a port with a serial terminal is a common way people debug communication protocols, and I don't see much point in monitoring any port other than the one the communication is occurring on.
  3. Unexpected and oddly specific solenoid activation:
    • Observed that solenoids 5 and 7 actuated unexpectedly at program startup, and the behavior repeated each time I reflashed. The movements were also intricate and oddly specific: both solenoids simultaneously turned on, off, on (more briefly this time), off, then on and off again. It looks as if I had hardcoded a sequence of GPIO commands for solenoids 5 and 7, which I most definitely did not.
    • Added initialization code in MX_GPIO_Init to set all solenoids to an off state (GPIO_PIN_RESET).
    • Temporarily disabled HAL_UART_Receive_IT to rule out UART-related triggers but found solenoids still actuated, indicating the issue may not originate from UART interrupts.
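For reference, here is a minimal sketch of the host-side sending pattern from item 2 above, assuming the pyserial package; the port name, baud rate, and command format are placeholders and may differ from the actual send_uart_data script:

# Minimal host-side UART sender sketch (assumes pyserial; port, baud rate, and
# command format are placeholders, not the project's actual values).
import time
import serial

def send_uart_data(commands, port="/dev/ttyACM0", baud=115200):
    """Send newline-terminated text commands (e.g. parsed MIDI note events) over UART."""
    with serial.Serial(port, baudrate=baud, timeout=1) as ser:
        for cmd in commands:
            ser.write((cmd + "\n").encode("ascii"))  # '\n' marks the end of a command for the firmware
            ser.flush()
            time.sleep(0.01)  # brief gap so the byte-by-byte RX interrupt can keep up

# Example: hypothetical note-on/note-off commands for MIDI note 60 (middle C).
# send_uart_data(["ON 60", "OFF 60"])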

I have been debugging this for a week. Initially, there were a few mistakes I genuinely did make (some by accident or oversight, some by genuine conceptual misunderstanding): I realised I had to explicitly flush my RX buffer, and I needed a better mapping from my Python code, which labels the octave of notes 60-72 (MIDI's convention), to the 0-12 numbering system I use for the solenoids in the firmware code. I also noticed one small mismatch in the way I mapped each solenoid in the firmware code to an STM32 GPIO pin.
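To illustrate that note-mapping fix, here is a small Python-side sketch (with hypothetical names) that converts MIDI note numbers 60-72 into the 0-12 solenoid indices used in the firmware:

# Illustrative mapping from MIDI note numbers (60-72, the middle-C octave) to the
# 0-12 solenoid indices used in the firmware. Function and constant names are hypothetical.
MIDI_BASE_NOTE = 60  # MIDI convention: 60 is middle C

def midi_note_to_solenoid(note):
    """Map a MIDI note in [60, 72] to a solenoid index in [0, 12]."""
    index = note - MIDI_BASE_NOTE
    if not 0 <= index <= 12:
        raise ValueError(f"MIDI note {note} is outside the supported octave (60-72)")
    return index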

I now feel like I have a good conceptual understanding of UART, of the small portion of autogenerated code that STM32CubeIDE produces (this is inherent to the way STM32CubeIDE handles clock configuration; it is necessary and cannot be disabled), and of the additional functions I have written. I definitely knew far less when I started testing my UART code a week ago. Yet I am still stuck, and I might be out of ideas on which part of this code to examine next. I have shared my files with Peter, who can be a fresh set of eyes, as well as with a friend of mine who is a CS major. I have an exam on Monday, but after it ends I will work on this in HH 1307 with my friend until it functions. I need it working before our poster submission on Tuesday night and video submission on Wednesday night.

Peter’s Status Report from 12/07/2024

This Week

This week was spent creating the mount that holds the solenoids in the case. The first iteration (not pictured) had too low a casing at the base, so the walls were extended upward by 3 mm. The mounting holes in the back were also too low, so they were moved up 2 mm, which matched the solenoid perfectly when printed.

 Figure 1: Solenoid Bracket

 

The next step, which will be done Sunday, is to space 13 of these in a way that will match up with the piano and attach them all in one case.

 

Next Week

Next week will be spent finalizing the case, doing testing, and completing the deliverables for the Capstone Course.

Team Status Report for 12/07/2024

See Shravya’s individual report for more details on the UART debugging progress; in summary, while all UART-related code has been written and running for about a week now, debugging is still underway and taking much longer than estimated. This bottlenecks any accuracy and latency testing Shravya can conduct with the solenoids playing a song fed in from the parser (accuracy and latency of the solenoids behave as expected when playing a hardcoded dummy sequence, though). Shravya hopes to have a fully working implementation by Monday night (so there is ample time to show functionality in the poster and video deliverables) and to conduct formal testing after that. She has arranged to meet with a friend who will help her debug on Monday.

Testing related to hardware:

As a recap, we have a fully functional MIDI-parsing script that is 100% accurate at extracting note events, and we are able to control our solenoids with hardcoded sequences. The final handshaking that remains is connecting the parsing script to the firmware so that the solenoids actuate for any given song. Once the UART bugs are resolved, we will feed in manually coded MIDI files that cover different tempos and patterns of notes. We will observe the solenoid output and track the pattern of notes played to calculate accuracy, which we expect to be 100%.

  • During this phase of testing, we will also audio-record the output with a metronome playing in the background. We will manually mark the timestamps of each metronome beat and solenoid press, and use those to calculate latency.
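As a sketch of how that latency figure could be computed from the two sets of manually recorded timestamps (the data layout here is hypothetical):

# Sketch: average latency from manually logged timestamps, in seconds.
# Assumes each solenoid press has been paired with the metronome beat it should land on.
def mean_latency_ms(beat_times, press_times):
    """Return the average delay (milliseconds) of solenoid presses relative to metronome beats."""
    assert len(beat_times) == len(press_times), "each press must be paired with a beat"
    delays = [(press - beat) * 1000.0 for beat, press in zip(beat_times, press_times)]
    return sum(delays) / len(delays)

# Example with made-up timestamps: mean_latency_ms([0.0, 0.5, 1.0], [0.03, 0.55, 1.04]) -> 40.0 ms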

Some tests that have been completed are overall power consumption and ensuring functionality of individual circuit components:

  • Using a multimeter, we measured the current draw of the solenoids under the three possible scenarios. As expected, maximum power consumption occurs when all solenoids in a chord are actuated simultaneously, but even then we stay just under our expected power limit of 9 W.
  • To ensure the functionality of our individual circuit components, we conducted several small-scale tests. Using a function generator, we applied a low-frequency signal to control the MOSFET and verified that it reliably switched the solenoid on and off without any issues. For the flyback diode, we used an oscilloscope to measure voltage spikes at the MOSFET drain when the solenoid was deactivated. This allowed us to confirm that the diode sufficiently suppressed back EMF and protected the circuit. Finally, we monitored the temperature of the MOSFET and solenoid over multiple switching cycles to ensure neither component overheated during operation.

Shravya’s MIDI-parsing code has been verified to correctly parse any MIDI file, whether generated by Fiona’s UI or by external means, and it handles all the edge cases (rests and chords) that caused trouble previously.
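For context, here is a minimal illustration of this kind of note-event extraction using the mido library; this is a sketch only and not necessarily how the actual parser is written:

# Illustrative note-event extraction with mido (not necessarily the project's actual parser).
import mido

def extract_note_events(path):
    """Yield (time_seconds, midi_note, on_or_off) tuples from a MIDI file."""
    elapsed = 0.0
    for msg in mido.MidiFile(path):  # playback iteration yields messages with delta times in seconds
        elapsed += msg.time
        if msg.type == "note_on" and msg.velocity > 0:
            yield (elapsed, msg.note, "on")
        elif msg.type == "note_off" or (msg.type == "note_on" and msg.velocity == 0):
            yield (elapsed, msg.note, "off")

# Notes sharing the same timestamp form a chord; spans with no active notes are rests.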

Testing related to software:

Since software integration took longer than expected, we are still behind on formal software testing. Fiona is continuing to debug the software and plans to start formal testing on Sunday (see more in her report). For a reminder of our formal testing plans, see: https://course.ece.cmu.edu/~ece500/projects/f24-teamc5/2024/11/16/team-status-report-for-11-16-2024/. We are worried that we might have to restrict these plans somewhat, specifically by not testing on multiple faces, because many people are busy with finals, but we will do our best to get a complete picture of the system's functionality. One change we know for certain is that our ground truth for eye-tracking accuracy will not be based on camera playback but on which button the user is directed to look at, for simplicity and to reduce testing error.

Last week, Peter did some preliminary testing on the accuracy of the UI and eye-tracking software integration in preparation for our final presentation, and the results were promising. Fiona will continue that testing this week, and hopefully will have results before Tuesday in order to include them in the poster.

Fiona’s Status Report for 12/07/2024

Book-Keeping

I made a list of the final elements we have to wrap up in the last two weeks of the semester, and met with Shravya and Peter to assign tasks.

New Software Functionalities

I adjusted the software so that the user can open an existing composition without opening an external window. To do so, I implemented logic where the program opens the next available file and loops back around to the first once it reaches the end of the songs in the folder. This obviously has the drawback that the user may have to step through many compositions before reaching the one they want, but it is accessible and simple, two of the most important foundations of our UI.
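A minimal sketch of that cycling logic (folder path and file extension are placeholders):

# Sketch of cycling through saved compositions in a folder, wrapping back to the first one.
# Folder path and file extension are placeholders.
import os

def next_composition(folder, current=None, ext=".mid"):
    """Return the composition after `current`, wrapping to the first file at the end of the list."""
    files = sorted(f for f in os.listdir(folder) if f.endswith(ext))
    if not files:
        return None
    if current not in files:
        return files[0]
    return files[(files.index(current) + 1) % len(files)]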

Another quick fix, made at Peter's suggestion, was to give the buttons different colors so that users can more easily differentiate between them and memorize the commands. Peter also mentioned that it would be good to have a pause between calibration and the start of eye-tracking so the user can orient themselves to the UI, so I added a short delay there as well.

I also created a separate file in which users can easily adjust the sampling delay, gain, and number of confirmation iterations for the eye-tracking, so they can tune those parameters as best suits them.
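For example, that settings file could be as simple as a module of named constants (the names here are illustrative, not the actual ones):

# eye_tracking_config.py -- illustrative parameter names; the real file may differ.
SAMPLING_DELAY_S = 0.05        # seconds between gaze samples
GAIN = 1.0                     # scaling applied to raw gaze coordinates
CONFIRMATION_ITERATIONS = 10   # consecutive samples on a button required to confirm a selection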

I had previously identified a bug in which only the first page of sheet music would appear on the secondary UI, even if the cursor was on another page. I therefore added two more buttons to the screen so the user can move back and forth between pages of the sheet music. I verified that this worked with sample compositions from this repository [1] that were longer than one page.

I also made it so that the rest and stack buttons would be highlighted when they were selected but before they were performed (so the user is aware of them).

Finally, since I have been having so much trouble highlighting the cursor location on the sheet music, mainly because the on-screen position of any one cursor location varies with sharps/flats, preceding notes, etc., I decided to show the user the current cursor position on the main UI as a number (e.g., “Cursor at note 1.”). This was not our original plan, but I still believe it is a viable way to ensure the user knows where they are in the piece.

Debugging

I fixed a small bug where the C sharp note was highlighted when the high-C note was confirmed and another small bug in which the number of notes did not reset to 0 when a new file was opened.

Then, I did some stress testing to confirm that the logic used to build rests and chords up to three notes was completely sound. While testing the eye-tracking, I had been running into some situations in which it appeared that the logic broke, but I could not figure out why. I used a different Python script that would run the command responses on key press (rather than eye-press) to easily test this functionality. Here were the edge case bugs I identified and fixed:

  • When a rest is placed after a single eighth note, the eighth note is extended to a quarter note. I am fairly confident this is actually a bug in the open-source sheet-music generator we are using [1], because when I played the composition in GarageBand, the eighth notes played as expected even though the sheet-music generator was interpreting them as quarter notes. I further verified this with another online sheet-music generator, which also produced the expected value: https://melobytes.com/en/app/midi2sheet. Since I have already been struggling to modify this open-source program to highlight the cursor location, and this is a very specific edge case, I decided it would be a better use of my time not to try to fix that bug within the program and instead focus on our code. I have left a note in the README identifying the bug for users.
  • Removing chords did not work successfully. I fixed this bug and verified it did not create a bug for single-note removal. Also, I verified that chords could be removed from anywhere in the piece, not just the end.
  • There were some logic errors with chords meant to have rests before them, which I fixed. I also tightened the three-note chord logic to guard against bugs, although I had not identified any there yet.

Next Week

After double-checking the eye-coordinates, in which there have been some discrepancies, I plan to start performing the latency, accessibility, and accuracy tests on the software. As a group we will work on the poster and video next week, so my primary goal is to finish software testing ASAP for those deliverables, but I will also work on other elements of them as needed. Additionally, I have to collaborate with Shravya once her UART code is working to integrate our two systems, but I have already set up some code for that, so I do not anticipate it taking too long. For the final report, I will work on the Use-Case & Design Requirements, Design Trade Studies (UI), System Implementation (UI), Testing (Software), and Related Works sections.

References

[1] BYVoid. (2013, May 9) MidiToSheetMusic. GitHub. https://github.com/BYVoid/MidiToSheetMusic