https://github.com/aakashsell/InSync
Final Project Video
Mathias’ Status Report 11/30
This week I focused on creating a functional UI that links to the backend. The UI has three major components: one that uploads a sheet music PDF to the backend, one that lets a user play a song they uploaded, and one that lets a user view the results from one of their sessions. The upload component is simply a button that lets the user select a file from their device. The play-song and view-session components are separate pages, each with a drop-down for selecting a song or result respectively.
I also spent time fixing a bug in the highlighting logic. Previously the logic assumed that a quarter note was always one beat, but this isn't always the case, so I changed it to derive the beat value from the time signature.
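The corrected mapping might be sketched like this (the function name and the note-value encoding are my assumptions, not the project's actual code):

```python
# Hypothetical sketch: derive how many beats a note spans from the
# time signature's denominator instead of assuming quarter note = 1 beat.

def beats_for_note(note_value, beat_unit):
    """note_value: 0.25 for a quarter note, 0.5 for a half note, etc.
    beat_unit: the time-signature denominator (4 in 4/4, 8 in 6/8)."""
    return note_value * beat_unit

# In 4/4 a quarter note is one beat; in 6/8 an eighth note is one beat.
print(beats_for_note(0.25, 4))   # 1.0
print(beats_for_note(0.125, 8))  # 1.0
```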
Regarding testing, I was able to quantify the accuracy of the sheet music scanning more formally. For an intermediate piece such as Fly Me to the Moon, the sheet music scanning reaches 99% accuracy for both timing onset and pitch detection.
In terms of new knowledge, I learned the basics of OpenCV as well as some general music knowledge. My learning method for most of the new content was to learn incrementally, meaning I would only learn what I needed at the time. If I needed a specific piece of functionality from OpenCV or Audiveris, instead of front-loading all the learning I would just learn about that component.
Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?
The project is on track for the final presentation.
What deliverables do you hope to complete in the next week?
Polishing up the UI and feedback as well as working to better integrate all the project components.
Mathias’ Status Report for 11/16
This week I polished the post-performance feedback and created an endpoint for a user to upload a piece of sheet music, running the sheet music conversion during the upload.
For the post-performance feedback, I extended it so that it works when the sheet music spans multiple pages, by maintaining a global beat counter in each element of the position list. I also made the highlighting more accurate. Previously the y-position of the highlight was based on the y-position of the corresponding note; now it is based on the y-position of the staff lines. This fixes a problem where, if the start and end notes had similar y-positions, the box around that area would be very thin. I also changed the highlighting from a simple rectangle around the area to a proper highlight, done by adding a weighted yellow numpy array to the specified region of the image.
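The weighted-yellow blend described above might look roughly like this (the function name, alpha value, and BGR channel order are assumptions; this is a sketch, not the project's actual code):

```python
import numpy as np

def highlight_region(image, x1, y1, x2, y2, alpha=0.4):
    """Blend a yellow layer into a region of a BGR image, producing a
    highlighter effect instead of a rectangle outline."""
    region = image[y1:y2, x1:x2].astype(np.float32)
    yellow = np.zeros_like(region)
    yellow[:] = (0, 255, 255)  # yellow in BGR order
    blended = (1 - alpha) * region + alpha * yellow
    image[y1:y2, x1:x2] = np.clip(np.rint(blended), 0, 255).astype(np.uint8)
    return image
```

Blending only the target slice keeps the rest of the page untouched, so multiple regions can be highlighted independently.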
Example of changes (note: in the test I ran for this, two areas were out of sync)
The upload endpoint takes a file from the request and saves it locally. A process running Audiveris is then run on the file to produce the mxl and omr files, which are moved to a designated folder, unzipped, and the archives deleted.
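A minimal sketch of that flow, assuming Flask and an `audiveris` command on the PATH (the route name, directories, and CLI flags are illustrative, not the project's actual configuration):

```python
import subprocess
import zipfile
from pathlib import Path
from flask import Flask, request

app = Flask(__name__)
UPLOAD_DIR = Path("uploads")   # assumed location for raw PDFs
OUTPUT_DIR = Path("scores")    # assumed location for converted files

@app.route("/upload", methods=["POST"])
def upload_sheet():
    # Save the uploaded PDF locally
    f = request.files["file"]
    pdf_path = UPLOAD_DIR / f.filename
    f.save(pdf_path)

    # Run Audiveris on the file to produce .mxl/.omr output
    subprocess.run(
        ["audiveris", "-batch", "-export", "-output", str(OUTPUT_DIR), str(pdf_path)],
        check=True,
    )

    # The .mxl export is a zip container: unzip it, then delete the archive
    mxl_path = OUTPUT_DIR / (pdf_path.stem + ".mxl")
    with zipfile.ZipFile(mxl_path) as z:
        z.extractall(OUTPUT_DIR / pdf_path.stem)
    mxl_path.unlink()

    return {"status": "ok"}, 200
```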
For validation I have mostly been testing these components with Postman to simulate API requests. I also wrote a short Python script to simulate input from the timing algorithm for the post-performance feedback. So far the tests of both the post-performance feedback and the file upload have generally passed; however, most of them have been happy-path tests, and more are needed to see how the system handles failures and unexpected inputs.
Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?
The project is on track for the interim demo this upcoming week.
What deliverables do you hope to complete in the next week?
Time will largely be spent working towards the demo and then I will continue work on the UI.
Mathias’ Status Report for 11/9
I spent this week integrating the post-performance feedback component into an API so that I can call it from the web application. I created the API using Flask and set it up as a POST endpoint that takes a song name in the request. On receiving a request, I spawn a process to handle receiving data from the timing algorithm and marking up the sheet music. The process uses regular Python sockets to receive the timing information.
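The arrangement described could be sketched as follows (the route name, port, and framing are assumptions, and the actual sheet-marking logic is omitted):

```python
import multiprocessing
import socket
from flask import Flask, request

app = Flask(__name__)

def feedback_worker(song_name, port=5001):
    """Listen on a plain socket for timing data from the timing algorithm,
    then mark up the sheet music for the given song (marking omitted)."""
    with socket.socket() as srv:
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            while chunk := conn.recv(4096):
                pass  # parse timing events and update the highlights here

@app.route("/session", methods=["POST"])
def start_session():
    # Spawn a separate process per session so the endpoint returns quickly
    song_name = request.get_json()["song"]
    proc = multiprocessing.Process(target=feedback_worker, args=(song_name,))
    proc.start()
    return {"status": "started", "song": song_name}, 202
```

Spawning a process per request keeps the long-lived socket listener out of the Flask request/response cycle.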
Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?
On schedule.
What deliverables do you hope to complete in the next week?
I hope to polish the UI and link this functionality to it.