After the live testing session with flutists on Sunday, we found that when performing two-octave scales, certain higher-octave notes were still being incorrectly detected as belonging to the lower octave; after some adjustments to the HSS pitch detection, they now seem to be detected correctly. I also modified the MIDI encoding logic to account for rests. On the web app side, I worked with Deeya to incorporate time and key signature user inputs, and our web app now supports viewing past transcriptions as well. We also explored ways to make the sheet music editable directly within the web app. Since the Flat.io API only supports read-only display with the basic subscription and we still have not heard back regarding access to the Premium version, we plan to redirect users to the full editor in a separate window for now. Finally, I worked on the final presentation, which is scheduled for next week.
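For reference, here is a minimal sketch of the kind of octave correction described above, assuming "HSS" refers to a harmonic-sum-spectrum approach; the function name `hss_pitch` and all parameter values are illustrative, not our exact implementation. The intuition is that an octave-down error happens when the harmonic sum at half the true pitch scores nearly as high as at the pitch itself, so the candidate one octave up is promoted when its score is close:

```python
import numpy as np

def hss_pitch(frame, sr, n_harmonics=5, fmin=200.0, fmax=2500.0,
              octave_ratio=0.8):
    """Estimate f0 of one windowed frame with a harmonic sum spectrum.

    Sums spectral magnitude at integer multiples of each candidate
    frequency and, to counter octave-down errors, promotes the
    candidate one octave up when its harmonic sum is nearly as strong.
    """
    n = len(frame)
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / sr)
    bin_hz = freqs[1]

    lo, hi = int(fmin / bin_hz), int(fmax / bin_hz)

    def harmonic_sum(b):
        idx = np.arange(1, n_harmonics + 1) * b
        idx = idx[idx < len(spectrum)]
        return spectrum[idx].sum()

    sums = np.array([harmonic_sum(b) for b in range(lo, hi)])
    best = lo + int(np.argmax(sums))

    # Octave correction: if the candidate one octave up scores almost
    # as high, the true pitch is likely the upper octave.
    up = 2 * best
    if up < hi and harmonic_sum(up) >= octave_ratio * harmonic_sum(best):
        best = up

    return best * bin_hz
```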
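The rest handling in the MIDI encoding can be sketched in a similar spirit. Standard MIDI has no explicit rest event, so a rest is just silence: its duration gets folded into the delta time of the following note_on. This is a hypothetical example using the mido library, not our actual encoder; `encode_notes` and its event format are made up for illustration:

```python
from mido import Message, MidiFile, MidiTrack

TICKS_PER_BEAT = 480  # mido's default resolution

def encode_notes(events):
    """Encode (midi_note_or_None, duration_in_beats) pairs as a track.

    A None pitch is a rest: its duration is carried as extra delta
    time on the next note_on instead of emitting any message.
    """
    mid = MidiFile(ticks_per_beat=TICKS_PER_BEAT)
    track = MidiTrack()
    mid.tracks.append(track)

    pending = 0  # accumulated rest ticks awaiting the next note
    for pitch, beats in events:
        ticks = int(beats * TICKS_PER_BEAT)
        if pitch is None:  # rest: defer its time
            pending += ticks
            continue
        track.append(Message('note_on', note=pitch, velocity=64,
                             time=pending))
        track.append(Message('note_off', note=pitch, velocity=64,
                             time=ticks))
        pending = 0
    return mid

# e.g. C5 quarter note, quarter rest, D5 half note
encode_notes([(72, 1.0), (None, 1.0), (74, 2.0)]).save('out.mid')
```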
In terms of the tools/knowledge I’ve picked up throughout the project:
- I learned to implement signal processing techniques and algorithms from scratch. Along the way, I learned a lot about pitch detection, what a flute signal specifically looks like, and how we can use its properties to identify notes
- Web app components such as WebSockets, and integration with APIs like Flat.io (see the minimal server sketch after this list)
- Familiarity with collaborative software workflows: version control, documenting changes clearly, and building clean, maintainable web interfaces with atomic design. We encountered some technical debt in our codebase, so a lot of time was also spent refactoring for clarity and maintainability
- Conducting user testing for our project and collecting data and feedback to iterate on our design
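As a small illustration of the WebSocket piece mentioned above, here is a minimal server sketch using the third-party websockets package (this assumes websockets >= 11, where the handler takes a single connection argument); the handler logic is hypothetical and just acknowledges each incoming audio chunk:

```python
import asyncio
import websockets  # third-party: pip install websockets

async def handle_audio(ws):
    """Receive audio chunks from the browser and acknowledge each one.

    A real handler would run pitch detection on the chunk and stream
    back detected notes instead of a placeholder string.
    """
    async for chunk in ws:
        await ws.send(f"received {len(chunk)} bytes")

async def main():
    async with websockets.serve(handle_audio, "localhost", 8765):
        await asyncio.Future()  # keep serving until cancelled

asyncio.run(main())
```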