This week, I focused on writing the detected rhythm/pitch information to a MIDI file, and I also looked into APIs for displaying the generated MIDI information as sheet music. Using the pitch detection I did last week, I wrote another script that takes in the MIDI note numbers and note types and creates MIDI messages. Each note is associated with a message that encodes its pitch (MIDI note number), duration, and loudness (velocity), and the script generates a .mid file with all the notes and their corresponding attributes. I tested this on a small clip of Twinkle, Twinkle, Little Star and then uploaded the generated .mid file to the music notation platform flat.io to check whether it contained the correct notation. Below is the generated sheet music. All of the note pitches were produced by my pitch detection script, but the notes are hard-coded as quarter notes for now since our rhythm detection is still in progress. The note segmentation → pitch detection → MIDI generation pipeline seems to produce mostly correct notes for simple nursery rhymes like Twinkle, Twinkle.
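To give a sense of what the MIDI-writing step looks like, here is a minimal sketch using the mido library (the post doesn't name the exact library or note format our script uses, so the library choice, the hard-coded note list, tempo, and velocity below are all placeholder assumptions, not our actual code):

```python
# Sketch: write a list of detected notes to a .mid file with mido.
# The detected_notes list is a hypothetical stand-in for the MIDI note
# numbers coming out of the pitch detection step; durations are all
# quarter notes (1 beat), mirroring the current hard-coded rhythm.
import mido

TICKS_PER_BEAT = 480  # mido's default resolution

# (MIDI note number, duration in beats) -- opening phrase of Twinkle, Twinkle
detected_notes = [(60, 1), (60, 1), (67, 1), (67, 1), (69, 1), (69, 1), (67, 2)]

mid = mido.MidiFile(ticks_per_beat=TICKS_PER_BEAT)
track = mido.MidiTrack()
mid.tracks.append(track)

# Tempo is arbitrary for this sketch (120 BPM)
track.append(mido.MetaMessage('set_tempo', tempo=mido.bpm2tempo(120)))

for note, beats in detected_notes:
    # Each detected note becomes a note_on/note_off pair: velocity encodes
    # loudness, and the delta time on note_off encodes the duration.
    track.append(mido.Message('note_on', note=note, velocity=64, time=0))
    track.append(mido.Message('note_off', note=note, velocity=64,
                              time=int(beats * TICKS_PER_BEAT)))

mid.save('twinkle.mid')
```

The resulting .mid file can then be uploaded to flat.io (or any notation tool that imports MIDI) to view it as sheet music, which is how I sanity-checked the output this week.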
Earlier this week, I also did some research into APIs that we could use to display the generated sheet music on our web application in a way similar to MuseScore, a popular music notation application. While MuseScore doesn't have an API we can use, flat.io has a developer guide that will allow us to display the generated sheet music. Next week, I will look more into the developer guide and work with Deeya to set up and integrate the Flat API into our web app. I will also work with Grace to refine and test our note segmentation further and ensure it handles other note durations and rests accurately. We may also meet with one of the flutists this week to collect more audio samples. Overall, my progress is on schedule, and hopefully we will have our transcription pipeline working on simple audio samples for our interim demo.