Team Status Report for 3/29

This week, we made progress toward integrating the audio processing system with the game engine. We implemented a custom JSON parser in C++ to load beatmap data generated by our signal processing pipeline, enabling songs to be played as fully interactive game levels. We also added a result splash screen at the end of gameplay, which currently displays basic performance metrics and will later include more detailed graphs and stats.

In the editor, we refined key systems for waveform visualization and synchronization. We improved timeToPixel() and pixelToTime() mappings to ensure the playhead and note grid align accurately with audio playback. We also advanced snapping logic to quantize note timestamps based on BPM and introduced an alternative input method for placing notes at specific times and lanes, in addition to drag-and-drop.

On the signal processing side, we expanded rhythm extraction testing to voice and bowed instruments and addressed tempo estimation inconsistencies by using a fixed minimum note length of 0.1s. We also updated the JSON output to include lane mapping information for smoother integration.

Next week, we plan to connect the main menu to gameplay, polish the UI, and fully integrate all system components.

Lucas’ Status Report for 3/29

This week I focused on finishing the integration of my game with the signal processing pipeline by adding a functioning JSON parser, which allows the output from the music analysis to be played as a game. I also added a results splash screen that displays at the end of each game; I plan to add more detailed statistics and graphs to it later.
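The parser itself is hand-written, but as a rough sketch of what the loading step boils down to, here is an equivalent using the nlohmann/json library (which is not what we actually use), with assumed field names:

#include <fstream>
#include <string>
#include <vector>
#include <nlohmann/json.hpp>  // for illustration only; our real parser is custom

struct Note {
    float time;  // onset in seconds (field name assumed)
    int lane;    // lane the tile falls in (field name assumed)
};

// Load a beatmap shaped like {"notes": [{"time": 1.25, "lane": 2}, ...]}.
std::vector<Note> loadBeatmap(const std::string& path) {
    std::ifstream in(path);
    nlohmann::json j;
    in >> j;

    std::vector<Note> notes;
    for (const auto& n : j["notes"]) {
        notes.push_back({n["time"].get<float>(), n["lane"].get<int>()});
    }
    return notes;
}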

Next week I’d like to finish the integration by building a gateway between the main menus and the game itself, ensuring that each song the user has saved can be clicked on and played. I’d also like to clean up some of the UI and make the game more visually appealing.

Michelle’s Status Report for 3/29

This week, I continued fine-tuning the audio processing algorithm. I continued testing with piano and guitar and also started testing voice and bowed instruments. These are harder to extract rhythm from, since the articulation can be much more legato. If we used pitch information, it might be possible to distinguish note onsets within slurs, for example, but that is most likely out of scope for our project.

There was also a flaw in deriving the minimum note length from the estimated tempo: for a song that most people would consider 60 BPM, librosa would sometimes estimate 120 BPM. The two are metrically equivalent, but the doubled estimate halves the calculated minimum note length, producing a lot of “double notes”: note detections directly after one another that actually come from one sustained note. For the game experience, I believe it is better to have more false negatives than false positives, so a fixed minimum note length is a better generalization. A threshold of 0.1 seconds seems to work well.
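The fix is essentially a gap filter over the detected onset times. A minimal sketch of that logic, written in C++ like the rest of our game code even though my pipeline lives in Python with librosa:

#include <vector>

// Keep an onset only if it lands at least minGap seconds after the last
// kept onset (assumes `onsets` is sorted ascending). With a fixed 0.1 s
// gap, a doubled tempo estimate no longer shrinks the threshold and
// spawns "double notes".
std::vector<float> filterOnsets(const std::vector<float>& onsets,
                                float minGap = 0.1f) {
    std::vector<float> kept;
    for (float t : onsets) {
        if (kept.empty() || t - kept.back() >= minGap) {
            kept.push_back(t);
        }
    }
    return kept;
}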

Additionally, in preparation for integrating the music processing with the game, I added more information to the JSON output that bridges the two parts. Based on the number of notes at a given timestamp, lane numbers are chosen at random to decide which lanes the tiles will fall from (see the sketch below).
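A minimal sketch of one way to do that random selection; the lane count and the use of std::sample are assumptions, since the report above only promises random, distinct lanes:

#include <algorithm>
#include <numeric>
#include <random>
#include <vector>

// Choose `count` distinct lanes out of `laneCount` for the notes at one
// timestamp.
std::vector<int> pickLanes(int count, int laneCount, std::mt19937& rng) {
    std::vector<int> all(laneCount);
    std::iota(all.begin(), all.end(), 0);  // 0, 1, ..., laneCount - 1
    std::vector<int> chosen;
    std::sample(all.begin(), all.end(),
                std::back_inserter(chosen), count, rng);
    return chosen;
}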

Example JSON output (the snippet below is an illustrative reconstruction; the actual field names may differ):
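Hypothetically, with two notes sharing a timestamp on different lanes:

{
  "bpm": 120,
  "notes": [
    { "time": 0.50, "lane": 2 },
    { "time": 1.00, "lane": 0 },
    { "time": 1.00, "lane": 3 },
    { "time": 1.50, "lane": 1 }
  ]
}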

My progress is on schedule. Next week, I plan to finalize my work on processing the rhythm of single-instrument tracks and meet with my teammates to integrate all of our subsystems together.

Yuhe’s Status Report for 3/29

This week, I focused on implementing and debugging the waveform viewer and its synchronization with the audio playback system. I refined the timeToPixel(float timestamp) and pixelToTime(int x) functions to ensure accurate coordinate transformations between time and screen space, allowing the playhead and notes to stay in sync with real-time audio. I also began implementing the snapping grid logic, quantizing timestamps to the nearest beat subdivision (e.g., 1/4, 1/8) based on BPM and audio offset. For note placement, I ran into difficulties with SFML’s event system when implementing smooth drag-and-drop behavior. As an alternative, I developed an input-based method that lets users specify the number of notes, their lane indices, and timestamps via a dialog box, which then injects the notes into the grid. These changes aim to improve precision and editor usability, but keeping the waveform and playhead aligned across zoom levels remains very challenging.
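Roughly, the mapping and snapping logic looks like the sketch below; the member names are illustrative rather than the editor’s actual fields:

#include <cmath>

struct EditorView {
    float viewStartTime;    // song time at the view's left edge, in seconds
    float pixelsPerSecond;  // zoom level
    float bpm;              // track tempo
    float audioOffset;      // time of the first beat, in seconds

    int timeToPixel(float timestamp) const {
        return static_cast<int>((timestamp - viewStartTime) * pixelsPerSecond);
    }

    float pixelToTime(int x) const {
        return viewStartTime + x / pixelsPerSecond;
    }

    // Snap a timestamp to the nearest grid line, with `subdivisions`
    // grid lines per beat.
    float snapToGrid(float timestamp, int subdivisions) const {
        float step = 60.0f / bpm / subdivisions;  // seconds per grid line
        float rel = timestamp - audioOffset;
        return audioOffset + std::round(rel / step) * step;
    }
};

One subtlety: pixelToTime(timeToPixel(t)) only round-trips to within a pixel, which is part of why the alignment is so fiddly at high zoom levels.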

Team Status Report 3/22

Everyone on the team has been progressing individually on their respective parts of the game over the past week. Michelle has continued work on the audio processing side, deciding to focus on perfecting the processing of monophonic piano pieces, while Lucas and Yuhe have continued work on the game itself. Lucas has continued work on the core game loop, adding back many of the key features while implementing a JSON parser to turn processed audio into a game, and Yuhe has begun working on the in-game beat map editor while also adding some neat visual tools to the game.

We haven’t made any major changes to our game’s design, other than deciding that the audio processing side will focus on monophonic pieces rather than longer, denser multi-instrumental material. That will probably be our biggest challenge in the coming weeks; as long as we are able to integrate the game itself with the beat map editor and menus, we should be able to devote more time and resources to figuring out more complex audio processing.

Yuhe’s Status Report for 3/22

This week I worked on designing and implementing key features of the Beatmap Editor for our game. In the first half of the week, I worked on the layout system, structuring the editor interface to accommodate a waveform display area, time axis, note grid overlay, and playback controls. I integrated JSON-based waveform data preprocessed externally to visualize amplitude over time using SFML’s VertexArray class. In the latter half, I worked on the note grid system that allows users to place notes along multiple lanes. Each lane corresponds to a specific input key, and note placement is quantized based on timestamp intervals to ensure rhythm accuracy. A snapping mechanism aligns notes to the closest beat subdivision, improving usability.
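As a rough sketch of the VertexArray approach (the names and exact layout here are illustrative, not the editor’s actual code):

#include <SFML/Graphics.hpp>
#include <vector>

// One vertex per amplitude sample, drawn as a line strip across the
// waveform area.
sf::VertexArray buildWaveform(const std::vector<float>& amplitudes,
                              float width, float height) {
    sf::VertexArray strip(sf::LineStrip, amplitudes.size());
    if (amplitudes.size() < 2) return strip;
    for (std::size_t i = 0; i < amplitudes.size(); ++i) {
        float x = width * i / (amplitudes.size() - 1);
        // Map amplitude in [-1, 1] to a vertical position, centered.
        float y = height * 0.5f * (1.0f - amplitudes[i]);
        strip[i].position = {x, y};
        strip[i].color = sf::Color::White;
    }
    return strip;  // later, in the render loop: window.draw(strip)
}

In practice the amplitude array would be downsampled to roughly one sample per pixel so the strip stays cheap to draw.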

I think the major challenge is synchronizing the visual playhead with the music playback and managing coordinate transforms between waveform time and screen space. My goals for next week include implementing audio playback controls and note data serialization.

Lucas’ Status Report for 3/22

This week, I continued turning the project into an actual game with the new engine. Like I (maybe) said last week, the engine doesn’t cover nearly as much as Unity did, which means a lot more tedious work drawing and maintaining the data structures that track game elements. I was able to add back things like timing, scoring, and tracking multiple note blocks at the same time, as well as the beginnings of a JSON parser that will generate the actual beat map.

The game looks and feels a lot more like an actual game now, barebones as it is. Next week, I’ll finish up the JSON parser to fully integrate the signal processing aspect into the game and add more visual feedback to the game/UI elements that should make it more engaging for the player. I’ll also add a gateway between the main menus and the game itself, allowing for more seamless transitioning between parts of the game.

Michelle’s Status Report for 3/22

This week I continued testing my algorithm on monophonic instrumental and vocal songs with fixed or varying tempo. I ran into an upper limit in SFML on how many sounds it can keep track of at a time. For longer audio files, both the background music and the clicks generated on note onsets play perfectly for about thirty seconds before the sound starts to glitch, then goes silent and throws an error.

It seems there is an upper bound on the number of SFML sounds that can be active at a time, and after running valgrind it looks like there are some memory leaks as well. I am still debugging this issue, clearing click sounds as soon as they finish playing and trying suggestions from forums. However, this is only a problem in testing, where I am playing what is probably hundreds of metronome clicks in succession; it will not be a problem in the actual game, since we will only be playing the song and maybe a few sound effects. If the issue persists, it might be worthwhile to switch to a visual test, which would be closer to the gameplay experience anyway.
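One cleanup approach along these lines, sketched rather than copied from the test code:

#include <SFML/Audio.hpp>
#include <list>
#include <memory>

// Keep click sounds alive in a container, and drop any that have
// finished so we stay under SFML's cap on simultaneously active sources.
std::list<std::unique_ptr<sf::Sound>> activeClicks;

void playClick(const sf::SoundBuffer& buffer) {
    // Reclaim sounds whose playback has stopped before adding a new one.
    activeClicks.remove_if([](const std::unique_ptr<sf::Sound>& s) {
        return s->getStatus() == sf::Sound::Stopped;
    });
    auto click = std::make_unique<sf::Sound>(buffer);
    click->play();
    activeClicks.push_back(std::move(click));
}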

Next week I plan to try to get the test working again, try out a visual test method, and work with my team members on integration of all parts. Additionally, after having a discussion with my team members, we think it may be best to leave more advanced analysis of multi-instrumental songs as a stretch goal and focus on the accuracy of monophonic songs for now.

Team Status Report for 3/15

This week, we each made a lot of progress on our subsystems and started the integration process. Yuhe finished building a lightweight game engine that suits our purposes much better than Unity, and implemented advanced UI components, a C++ to Python bridge, and a C++ test that verifies rhythm detection using SFML. Lucas worked on rewriting the gameplay code he wrote for Unity to work with the new engine and got a barebones version of the game working. Michelle worked on rhythm detection for monophonic songs with time-varying tempo, which is now quite accurate, and started testing fixed-tempo multi-instrumental songs, which still needs more work.

Demos: Beat Detection for Time-Varying Tempo; Core Game Loop in New Game Engine

There have been no major design changes in the past week. The most significant risk at this time to the success of our project is probably the unpredictability of the audio that the user will upload. Our design will mitigate this risk by only allowing certain file types and sizes and surfacing a user error if no tempo can be detected (i.e. the user uploaded an audio file that is not a song).

Next steps include finishing the transition to the new game engine, refining the rhythm detection of multi-instrumental songs, and implementing an in-game beatmap editor. With integration off to a good start, the team is making solid progress towards the MVP.

Yuhe’s Status Report for 3/15

This week, I finished implementing the game engine and UI components for Rhythm Genesis. I improved our existing UI elements—including buttons, textboxes, sliders, labels, and the UI manager—by refining their appearance and interactivity. In addition, I implemented advanced UI components such as a scrollable list and an upload UI to handle audio file uploads.

I also developed a custom C++ to Python bridge, which allows us to call Python module functions directly from our C++ game engine. This integration enabled me to implement a basic parse_music_file function in Python using the Librosa library. Additionally, I created a function that plays both the original song and a beep sound generated based on the extracted beat timestamps, which is useful for testing our beat detection.
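As a minimal sketch of the embedding approach (the actual bridge is more involved, and the Python module name here is an assumption; only parse_music_file is named above):

#include <Python.h>
#include <string>

// Call a Python function from C++ via the CPython embedding API.
// A real engine would call Py_Initialize()/Py_Finalize() once at
// startup/shutdown rather than per call.
void parseMusicFile(const std::string& path) {
    Py_Initialize();

    PyObject* module = PyImport_ImportModule("analysis");  // module name assumed
    if (module != nullptr) {
        PyObject* func = PyObject_GetAttrString(module, "parse_music_file");
        if (func != nullptr && PyCallable_Check(func)) {
            PyObject* args = Py_BuildValue("(s)", path.c_str());
            PyObject* result = PyObject_CallObject(func, args);
            Py_XDECREF(result);
            Py_XDECREF(args);
        }
        Py_XDECREF(func);
        Py_DECREF(module);
    }

    Py_Finalize();
}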

On the UI side, I completed the menu scene and upload scene, and I set up placeholders for the settings and beat map editor UI. These components provide a solid foundation for our user interface and will be further integrated as the project progresses.

On Friday, I conducted our weekly meeting with the team as usual. I demonstrated the current state of the game, showing the UI scenes and explaining my implementation of the game engine. I also helped team members set up their development environments by ensuring all necessary dependencies were installed. During the meeting, I discussed with Lucas how he might leverage our existing UI components—or use the SFML sprite system—to implement the game scenes and scoring system. I also took the opportunity to test Michelle’s audio processing algorithms within the game engine.

Overall, it was a productive week with substantial progress in both UI development and audio processing integration, setting us on a good path toward our project goals. For the next few weeks I will be working on implementing the beatmap editor for our game.