Team Status Report for 11/4/23

This past week, we resolved a severe issue with fret detection, which allowed us to integrate and verify fret detection from both a software and hardware perspective. The Pi-Hat PCB also arrived and was populated with components. The web interface was significantly improved with a visual indicator of the song being played. We also discussed priorities moving forward, especially on the web app front, as we gear up for the early demo next week. On the embedded side, fret and strum detection code was written and verified, and a first pass at the user experience (training mode) was written.

The following picture shows the Teensy 4.0 and RPi connected via the Pi-Hat PCB that came in earlier this week:

The following picture shows the electronics in a 3D-printed housing:

This image shows the latest concept for what the web app will display while the user experience is running:

This image shows that we can detect the fret the user pressed (the corresponding LED lights up).

As for priorities, we have determined that everyone should shift focus toward integrating their parts with everything else, because we are at the point where many of our tasks span more than a single person's expertise. We have also determined that adding scrolling notes to the web UI is not a top priority, since it is merely a visual add-on and other core functionality still needs to be implemented, such as the rest of the GPIO signals on the RPi.

Ashwin’s Status Report for 11/4/2023

This week I made further progress on the web interface for the Superfret guitar. Now, when a user activates a guitar MIDI file, instead of just seeing a pop-up indicating that the song is active, they are directed to an interactive page that visually displays the notes of the file as it is played. Here is how it works:

I developed an additional file called MidiFileReader.py, which extracts the notes from a user-submitted file and tokenizes them into an array of note objects, each containing the note value, the time at which it is played, and the string and fret on which to play it. With this in place, I modified the front end to process this array of tokenized notes and produce the moving blocks on the guitar at the right time and place.
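To illustrate the tokenization step, here is a simplified sketch. It assumes the mido library for MIDI parsing, and the Note structure and extract_notes function are hypothetical names; the actual MidiFileReader.py may be organized differently:

```python
# Illustrative sketch only; the real MidiFileReader.py may differ.
# Requires the mido library (pip install mido).
from dataclasses import dataclass

import mido

@dataclass
class Note:
    value: int    # MIDI note number
    time: float   # absolute time (in seconds) at which the note starts
    string: int   # filled in later by the string/fret algorithm below
    fret: int

def extract_notes(path):
    """Tokenize a MIDI file into an array of Note objects."""
    notes, elapsed = [], 0.0
    # Iterating over a MidiFile yields messages whose .time is the
    # delta time in seconds since the previous message.
    for msg in mido.MidiFile(path):
        elapsed += msg.time
        if msg.type == "note_on" and msg.velocity > 0:
            notes.append(Note(value=msg.note, time=elapsed, string=-1, fret=-1))
    return notes
```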

Retrieving the appropriate string and fret for a MIDI note required me to develop an algorithm that translates between the two representations. To do this, I created a function that takes a midi_note and a prev_midi_note as parameters. The algorithm checks each of the four strings to see whether the midi_note is playable on that string; if it is, it computes the distance to the previous note using a simple distance formula, and it picks the option closest to the previous note. This ensures that the beginner guitarist plays sequences of notes that are close together rather than far apart.
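As a rough sketch of that selection logic: the open-string tunings, the fret range, and the exact distance metric below are placeholder assumptions rather than the project's actual values, and the previous note's chosen (string, fret) position is assumed to be available for the comparison:

```python
# Illustrative sketch only: tunings, fret range, and distance metric
# are assumed placeholders, not the project's actual values.
OPEN_STRINGS = [40, 45, 50, 55]  # assumed MIDI numbers of the four open strings
MAX_FRET = 12                    # assumed highest playable fret

def choose_position(midi_note, prev_string, prev_fret):
    """Pick the (string, fret) for midi_note closest to the previous note."""
    best, best_dist = None, float("inf")
    for string, open_note in enumerate(OPEN_STRINGS):
        fret = midi_note - open_note
        if 0 <= fret <= MAX_FRET:  # playable on this string?
            # Simple Euclidean distance between fretboard positions.
            dist = ((string - prev_string) ** 2 + (fret - prev_fret) ** 2) ** 0.5
            if dist < best_dist:
                best, best_dist = (string, fret), dist
    return best  # None if the note is out of range on every string
```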

For next week, I would like to implement a pausing mechanism within the file playback. This would make the MIDI file player interactive with the user, as it will wait for the user to play the notes on the guitar before continuing.