This week, I worked on preparing for the interim demo. I refined my pitch detection to account for rests and to ensure that the generated notes were accurate; earlier, some notes were incorrectly being marked as flat or sharp instead of natural (a sketch of the fix appears after the list below). Then, I worked with Deeya to set up the Flat.io API, since we were running into several errors with authorization and with formatting requests and responses; we eventually figured out how to send our generated MIDI files to the API for processing into sheet music (also sketched below). Finally, Grace and I worked on ensuring compatibility between our code, and I finished modularizing all of our existing code and integrating it into a single pipeline that is triggered from the web app and runs in the backend.

Pitch detection is mostly done. For next steps, I will be working on:
- Tempo detection (see the sketch below)
- Setting up WebSockets in our web app for real-time adjustment of the metronome (see the sketch below), and assisting Deeya with making the displayed sheet music editable
- Working with Grace to refine audio segmentation, e.g., handling rests and incorporating Short-Time Energy for more accurate note duration detection (see the sketch below)
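For reference, here is a minimal sketch of the rest handling and note quantization fix mentioned above, using librosa's pyin. Our actual pipeline differs in its parameters; the file name and pitch range here are illustrative:

```python
import librosa
import numpy as np

def detect_notes(path):
    """Frame-level pitch detection that treats unvoiced frames as rests
    and snaps each voiced frame to the nearest equal-tempered MIDI note."""
    y, sr = librosa.load(path, sr=None, mono=True)

    # pyin returns NaN for unvoiced frames, which we interpret as rests.
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )

    notes = []
    for hz, voiced in zip(f0, voiced_flag):
        if not voiced or np.isnan(hz):
            notes.append(None)  # rest
        else:
            # Rounding to the nearest integer MIDI number avoids the
            # off-by-a-semitone labels (spurious sharps/flats) we saw earlier.
            notes.append(int(round(librosa.hz_to_midi(hz))))
    return notes
```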
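For the Flat.io work, the two things that tripped us up were the Bearer token header and the request body format. Below is a rough sketch of the shape of the call we ended up with; the field names reflect my reading of the v2 "create score" endpoint and may not match the docs exactly, and `FLAT_TOKEN` is a placeholder:

```python
import base64
import os
import requests

FLAT_TOKEN = os.environ["FLAT_TOKEN"]  # personal access token (placeholder name)

def upload_midi(midi_path, title):
    """Send a generated MIDI file to Flat.io to be rendered as sheet music.
    Assumes the v2 create-score endpoint accepts base64-encoded file data."""
    with open(midi_path, "rb") as f:
        data = base64.b64encode(f.read()).decode("ascii")

    resp = requests.post(
        "https://api.flat.io/v2/scores",
        headers={"Authorization": f"Bearer {FLAT_TOKEN}"},
        json={"title": title, "privacy": "private",
              "data": data, "dataEncoding": "base64"},
        timeout=30,
    )
    resp.raise_for_status()  # surfaces the auth/format errors we kept hitting
    return resp.json()["id"]  # score id, used to display the sheet music
```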
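For tempo detection, my tentative plan is to start from a standard beat tracker and refine from there; a minimal sketch with librosa (file name illustrative):

```python
import librosa

y, sr = librosa.load("recording.wav", sr=None, mono=True)

# Global tempo estimate plus beat positions; a starting point we can
# refine (e.g., against the metronome setting) later.
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)
print(f"Estimated tempo: {float(tempo):.1f} BPM, first beats: {beat_times[:4]}")
```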
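For the metronome, the rough idea is a WebSocket endpoint that the web app can push tempo changes to without polling. A minimal sketch, assuming a FastAPI backend (the route and message format are placeholders, not our final design):

```python
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()
metronome = {"bpm": 120}  # shared state; fine for a sketch, not production

@app.websocket("/ws/metronome")
async def metronome_ws(websocket: WebSocket):
    """Receive tempo updates from the web app and echo the applied value,
    so the click track can be adjusted in real time."""
    await websocket.accept()
    try:
        while True:
            msg = await websocket.receive_json()  # e.g. {"bpm": 96}
            if "bpm" in msg:
                metronome["bpm"] = max(20, min(300, int(msg["bpm"])))
            await websocket.send_json({"bpm": metronome["bpm"]})
    except WebSocketDisconnect:
        pass  # client closed the connection
```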
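And for the segmentation work, Short-Time Energy boils down to framing the signal and thresholding each frame's energy to find where notes start and stop; a bare-bones version (frame size and threshold are illustrative):

```python
import numpy as np

def short_time_energy(y, frame_len=1024, hop=512):
    """Energy of each overlapping frame; low energy suggests a rest,
    and energy rises/falls mark candidate note boundaries."""
    frames = [y[i:i + frame_len] for i in range(0, len(y) - frame_len, hop)]
    return np.array([np.sum(f.astype(np.float64) ** 2) for f in frames])

def active_regions(energy, threshold_ratio=0.1):
    """Boolean mask of frames whose energy exceeds a fraction of the peak."""
    return energy > threshold_ratio * energy.max()
```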
I am also finding that when I incorporate the denoising step into the pipeline, the detected pitches are thrown off a bit, so I'll have to look more into making sure that denoising does not degrade pitch detection. One way I plan to quantify this is to run the same pitch detector on the raw and denoised audio and compare the two tracks, as sketched below.
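This sketch assumes the noisereduce package as the denoiser, which is not necessarily what we will keep; shifts above 50 cents would flip the nearest-note decision:

```python
import librosa
import noisereduce as nr
import numpy as np

y, sr = librosa.load("recording.wav", sr=None, mono=True)
y_dn = nr.reduce_noise(y=y, sr=sr)  # spectral-gating denoiser

def f0_track(signal):
    f0, _, _ = librosa.pyin(signal, fmin=librosa.note_to_hz("C2"),
                            fmax=librosa.note_to_hz("C7"), sr=sr)
    return f0

raw, dn = f0_track(y), f0_track(y_dn)
both = ~np.isnan(raw) & ~np.isnan(dn)  # frames voiced in both tracks
# Median absolute pitch shift in cents introduced by denoising.
cents = 1200 * np.abs(np.log2(dn[both] / raw[both]))
print(f"Median pitch shift after denoising: {np.median(cents):.1f} cents")
```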