Deeya’s Status Report for 03/29/25

This week I was able to display sheet music on our web app based on the user’s recording or uploaded audio file. Once the user clicks Generate Music, the web app calls the main.py file that integrates both Shivi’s and Grace’s latest rhythm and pitch detection algorithms, generates and stores a MIDI file, converts it to MusicXML, and makes an API POST request to Flat.io to display the generated sheet music. The integration process is now fairly seamless, so whenever more changes are made to the algorithms it is easy to integrate the newest code with the web app and keep it functioning properly.
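
As a concrete reference, the MIDI-to-MusicXML step looks roughly like the sketch below. This is a minimal sketch assuming the conversion is done with music21; the function name and the file paths ('output.mid', 'output.musicxml') are illustrative placeholders rather than our exact code.

import music21

def midi_to_musicxml(midi_path, xml_path):
    # Parse the MIDI file produced by main.py
    score = music21.converter.parse(midi_path)
    # Write it back out as MusicXML
    score.write('musicxml', fp=xml_path)
    # Return the MusicXML contents as a string for the Flat.io API call
    with open(xml_path) as f:
        return f.read()

musicxml_string = midi_to_musicxml('output.mid', 'output.musicxml')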

On the API side, once I convert the MIDI to MusicXML, I use the Flat.io API to create a new music score in the current user account. I send a JSON payload containing the composition’s title, its privacy setting (public/private), and the MusicXML data:

new_score = flat_api.ScoreCreation(
   title='New Song',        # title of the composition
   privacy='public',        # public/private visibility
   data=musicxml_string     # MusicXML converted from the generated MIDI
)
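
Under the hood, that ScoreCreation call boils down to a single POST request to the Flat.io REST API. For reference, here is a rough equivalent using plain requests rather than the flat_api client we actually use; the FLAT_API_TOKEN environment variable is just a placeholder for wherever the access token is stored:

import os
import requests

# Equivalent raw request: create a new score on the authenticated account.
response = requests.post(
    'https://api.flat.io/v2/scores',
    headers={'Authorization': 'Bearer ' + os.environ['FLAT_API_TOKEN']},
    json={
        'title': 'New Song',
        'privacy': 'public',
        'data': musicxml_string,  # MusicXML converted from the generated MIDI
    },
)
response.raise_for_status()
score_id = response.json()['id']  # this id is what the embed below displays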

This creates a score and an associated score_id, which the Flat.io embed uses to place the generated sheet music into our web app:

var embed = new Flat.Embed(container, {
   score: scoreId,              // id of the score created through the API
   embedParams: {
      mode: 'edit',             // let the user edit the sheet music in place
      appId: #,                 // our Flat.io application id (omitted here)
      branding: false,
      controlsPosition: 'top'
    }
});

Flat.io has a feature that allows the user to make changes to the generated sheet music, including the key and time signatures, notes, and any articulations and notations. This is what I will be working on next, which should leave a good amount of time for fine-tuning and testing our project with the SOM students.
