Michelle’s Status Report for 3/8

This week, I worked on creating an algorithm for determining the number of notes to be generated based on the onset strength of the beat. Onset strength at time t is computed as the mean over frequencies f of max(0, S[f, t] – ref[f, t – lag]), where S is the log-power Mel spectrogram and ref is S after local max filtering along the frequency axis.
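As a rough sketch of how this looks in practice, the snippet below pulls the onset envelope and per-beat strengths from librosa; the default parameters shown are assumptions, and the final values are still being tuned.

```python
# Minimal sketch of the onset-strength computation described above, using librosa.
# Default parameters are assumptions; our final values are still being tuned.
import librosa

y, sr = librosa.load("song.wav")  # audio time series and sample rate

# librosa builds the log-power Mel spectrogram internally, applies local max
# filtering along the frequency axis to get the reference, then averages
# max(0, S[f, t] - ref[f, t - lag]) over frequencies f.
onset_env = librosa.onset.onset_strength(y=y, sr=sr)

# Beat tracking gives the frames at which beats occur; sampling the onset
# envelope at those frames gives a per-beat intensity value.
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr, onset_envelope=onset_env)
beat_strengths = onset_env[beat_frames]
beat_times = librosa.frames_to_time(beat_frames, sr=sr)
```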

Since a higher onset strength implies a more intense beat, it can be better represented in the game by a chord. Likewise, a weaker onset strength would generate a rest or a single note. Generally we want more single notes than anything else, with three-note chords being rarer than two-note chords. The percentile thresholds that divide these categories can be easily adjusted later during user testing to find the best balance.
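A minimal sketch of that mapping, assuming placeholder percentile cutoffs (the actual thresholds will be tuned during user testing):

```python
# Sketch of mapping per-beat onset strength to a note count per beat.
# The percentile cutoffs are placeholder assumptions to be tuned in user testing.
import numpy as np

def notes_per_beat(beat_strengths, rest_pct=10, single_pct=70, double_pct=95):
    """Return the number of simultaneous notes (0-3) for each beat."""
    cutoffs = np.percentile(beat_strengths, [rest_pct, single_pct, double_pct])
    # 0 = rest, 1 = single note, 2 = two-note chord, 3 = three-note chord
    return np.digitize(beat_strengths, cutoffs)
```

With the placeholder defaults above, roughly 10% of beats become rests, 60% single notes, 25% two-note chords, and 5% three-note chords.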

My progress is on schedule. Next week, I plan to refactor my Librosa explorations into modular functions that can be easily integrated with the game. I will also transition from audio analysis to working on the UI of the game.

Team Status Report for 3/8

During the week leading up to spring break, our team worked on both the documentation and development of Rhythm Genesis. We completed the design review report collaboratively. On the development side, we transitioned from Unity to our own custom C++ engine with SFML, optimizing performance for low-power systems. Yuhe implemented sprite rendering, UI elements, and settings management, ensuring smooth interaction and persistent data storage. Lucas worked on refining the core game loop, adding a scoring system, and transitioning from randomly generated notes to a JSON-based beat map system. Michelle worked on creating an algorithm for determining the number of notes to be generated based on the onset strength of the beat.

Next steps include expanding gameplay mechanics, integrating Python’s librosa for beat map generation, and improving UI design. The team will also focus on fully integrating Lucas’ contributions into the new engine to ensure a seamless transition.

Part A: Global Factors (Michelle)

Since our rhythm game does not analyze the lyrics of any songs and only analyzes beats, onset strengths, and pitches, any song in any language will work with our game. The only translation needed to globalize our game will be that of the menu text. Additionally, we plan to publish the game for free on Steam, which is available in 237 countries, so our game will be widely available internationally.

Part B: Cultural Factors (Lucas)

While our game is, at the end of the day, just a game, it will address a couple of cultural factors – namely tradition (specifically music) and language. Our game will allow users to play a game that caters to their specific heritage, because it lets them use the music that they like. Instead of typical rhythm games that cover a small subset of music and therefore likely represent only a limited number of cultural backgrounds, our game will let users upload whatever music they'd like, which means that users from any background can enjoy the music that they feel represents them while playing the game.

Part C: Environmental Factors (Yuhe)

Good news—our game doesn’t spew carbon emissions or chop down trees! Rhythm Genesis is just code, happily living on users’ devices, consuming only as much electricity as their computers allow. Unlike some bloated game engines that demand high-end GPUs and turn one’s laptop into a space heater, our lightweight C++ engine keeps things simple and efficient. Plus, since players can play with their keyboards instead of buying plastic peripherals, we’re technically saving the planet… So, while our game can’t plant trees, it is at least not making things worse. Play guilt-free—unless you miss all the notes, then that’s on you 🙂

Yuhe’s Status Report for 3/8

During the week before spring break, I worked on both the documentation and development of our rhythm game. I wrote the System Implementation, Testing, and Summary sections of our Design Review Report, detailing the architecture, subsystems, and testing methodologies.

Additionally, I focused on implementing our custom game engine in C++ using SFML, replacing Unity due to its high computational demands and unnecessary packages and features that we will not use for our 2D game. I implemented a sprite rendering system and core UI elements, including the settings menu, keybinding input, and volume control, ensuring smooth interaction and persistent data storage. The next steps involve expanding gameplay mechanics, refining beat map processing, improving UI usability, and, most importantly, integrating our game engine with Python’s Librosa audio processing library.

Lucas’ Status Report for 3/8

This week, I continued to work on the core game loop in Unity. I added a scoring system and began implementing a JSON-controlled beat map instead of dropping randomly generated notes. I also spent considerable time debugging a few things that ended up (mostly) being due to misunderstanding some Unity behavior. Finally, I wrote up a few sections of the design review.

Next week, I think I’ll have to spend most of my time integrating: switching my sections of the game to the engine Yuhe developed and rewriting them in C++. I’d also like to do some work on the game’s visuals, as right now it doesn’t look great and could use some UI updates.

Team Status Report for 2/22

As of right now, we still have:

  • Michelle working on signal processing, turning audio files into JSON files containing each note and some basic information about it (see the sketch after this list)
  • Yuhe working on the menus and UI of the game, as well as the in-game beat map editor
  • Lucas working on the core game loop
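As a rough illustration of the audio-to-JSON step above, the sketch below converts an audio file into a beat map JSON using librosa; the schema (field names, units) is an assumption rather than our finalized format.

```python
# Sketch of exporting analysis results to a beat map JSON file.
# The schema (field names, units) is an illustrative assumption, not our final format.
import json
import librosa

def export_beat_map(audio_path, out_path):
    y, sr = librosa.load(audio_path)
    onset_env = librosa.onset.onset_strength(y=y, sr=sr)
    tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr, onset_envelope=onset_env)
    beat_times = librosa.frames_to_time(beat_frames, sr=sr)

    # One entry per beat: when it happens and how strong it is.
    notes = [
        {"time": float(t), "strength": float(onset_env[fr])}
        for t, fr in zip(beat_times, beat_frames)
    ]
    with open(out_path, "w") as out:
        json.dump({"tempo": float(tempo), "notes": notes}, out, indent=2)
```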

Similar to last week, our main challenge remains a consistent version control method. For now, we plan to keep progressing in our individual areas, doing some version control on our own sections, and integrate all of our code at a later time. Thus, one of the bigger challenges further down the line could be ensuring that our game feels like a seamless and connected experience, which is something we’ll definitely have to dedicate some time to.

Lucas’ Status Report for 2/22

This week, I continued working on the main game while taking some additional time to focus on the design review.

I began implementing more core features, like using a JSON file to generate our beat maps instead of placing note blocks randomly, and started some work on the scoring system. Next week I aim to finish these two features and hopefully begin working on some more front-end details, to make the game feel a bit more like a game for the time being.

I also made the slides for and presented the design review, which involved outlining our project in a slideshow and doing some presentation preparation. I’ll continue that work this week by writing up part of the design review’s written portion.

Yuhe’s Status Report for 2/22

This week, I focused on implementing the Upload UI of our game in Unity, which allows users to upload audio files and generate beat maps. I built the file selector to support MP3, WAV, and OGG formats and designed a progress bar to visually indicate the beat map generation process. Additionally, I implemented a text input field for naming the song and a confirmation message to notify users upon successful upload.

One major challenge was handling file compatibility and format validation, ensuring the system properly recognized and processed different audio formats. Next week, I plan to improve error handling, refine the UI layout, and connect all the related game scenes for a better user experience.

Michelle’s Status Report for 2/22

This week, I continued working on validation of fixed-tempo audio analysis. The verification method I created last week, which played metronome clicks at the beat timestamps while the song played, was not ideal: multi-threading introduced timing issues, and removing the threading and starting the song playback manually introduced human error.

This week, I created an alternate version that uses matplotlib to animate a blinking circle at the beat timestamps while playing the song on a separate thread. The visual alignment is also closer to the actual gameplay. I used 30 FPS since that is the planned frame rate of the game. Here is a short video of a test as an example: https://youtu.be/54ToPpPSpGs
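For reference, here is a simplified sketch of that kind of visual check. Playing the audio through the sounddevice library is an assumption, and the timing logic conveys the idea rather than the exact script used in the video.

```python
# Simplified sketch of the blinking-circle verification described above.
# Audio playback via sounddevice is an assumption; any player that starts
# quickly on a background thread would work the same way.
import threading
import numpy as np
import sounddevice as sd
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation
import librosa

y, sr = librosa.load("song.wav")
_, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

FPS = 30             # matches the game's planned frame rate
BLINK_SECONDS = 0.1  # how long the circle stays lit after each beat

fig, ax = plt.subplots()
circle = plt.Circle((0.5, 0.5), 0.2, color="gray")
ax.add_patch(circle)
ax.set_aspect("equal")
ax.axis("off")

def update(frame_idx):
    # Light the circle if the current time is within BLINK_SECONDS of any beat.
    t = frame_idx / FPS
    on_beat = np.any((beat_times <= t) & (t - beat_times < BLINK_SECONDS))
    circle.set_color("red" if on_beat else "gray")
    return (circle,)

# Start audio playback on a background thread, then animate at ~30 FPS.
threading.Thread(target=lambda: sd.play(y, sr), daemon=True).start()
anim = FuncAnimation(fig, update, interval=1000 / FPS)
plt.show()
```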

When testing tempo error on the self-composed audio library, where we know the ground truth of tempo and beat timestamps, faster songs of 120 BPM or greater had a tempo error of about 21 ms, which is just outside our tolerance of 20 ms. When I tested fast songs with the visual animation verification method, the error was not very perceptible to me. Thus, I think fixing this marginal error is not a high priority, and it might be justified to relax the beat alignment error tolerance slightly, at least for the MVP. Further user testing after integration will be needed to confirm this.
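For context, a minimal sketch of how this kind of error can be measured against ground-truth timestamps; the ground_truth_times input and the function name are illustrative, not part of our codebase.

```python
# Sketch of measuring beat alignment error against known ground-truth timestamps.
# ground_truth_times would come from the self-composed test library (illustrative).
import numpy as np
import librosa

def mean_beat_error_ms(audio_path, ground_truth_times):
    y, sr = librosa.load(audio_path)
    _, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
    detected = librosa.frames_to_time(beat_frames, sr=sr)

    # For each ground-truth beat, take the nearest detected beat and
    # average the absolute offsets, reported in milliseconds.
    errors = [np.min(np.abs(detected - t)) for t in ground_truth_times]
    return 1000 * float(np.mean(errors))
```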

My progress is on track with our schedule. Next week I plan to wrap up fixed-tempo beat analysis and move on to basic intensity analysis, which will be used to determine how many notes should be generated per beat. This is a higher priority than varying-tempo beat analysis. Testing with a wide variety of songs will be needed to fine-tune our algorithm for calculating the number of notes in each level so that the gaming experience is as satisfying as possible.

Team Status Report for 2/15

This week, we began implementing the Rhythm Genesis game in Unity, including the User Interface and the core game loop, and also continued work on tempo and beat tracking analysis, calculating the current beat alignment error.

  1. Yuhe worked on the User Interface Layer of the game, implementing the main menu and song selection.
  2. Lucas focused on making the core game loop, implementing the logic for the falling tiles.
  3. Michelle worked on verification methods for beat alignment error in audio analysis.

Some challenges that we are currently facing include figuring out the best method of version control. We initially tried using GitHub, but this did not work out since Unity projects are so large. We are now using Unity’s built-in Plastic SCM, which is not super easy to use. Another challenge is that faster tempos are showing beat alignment error outside of our acceptance criteria, so we will need to spend some more time fine-tuning how we detect beat timestamps, especially for fast songs. As of now there are no schedule changes, as the team is on track with our milestones.

A. Written by Yuhe Ma

Although video games may not directly affect public health or safety, Rhythm Genesis may benefit its users’ mental well-being and cognitive health. Rhythm games are known to help improve hand-eye coordination, reaction time, and focus. Our game offers an engaging, music-driven experience that enhances players’ dexterity and rhythm skills, which can be useful for both entertainment and rehabilitation. Music itself is known to reduce stress and boost mood, and by letting users upload their own songs, Rhythm Genesis creates a personalized, immersive experience that promotes relaxation and enjoyment. From a welfare standpoint, Rhythm Genesis makes rhythm gaming more accessible by offering a free, customizable alternative to mainstream games that lock users into pre-set tracks or costly DLCs. This lowers the barrier to entry, allowing more people to enjoy rhythm-based gameplay. By supporting user-generated content, our game encourages creativity and community interaction, helping players develop musical skills and express themselves. In this way, Rhythm Genesis is not only a game but also a tool for cognitive engagement, stress relief, and self-expression.

B. Written by Lucas Storm

While at the end of the day Rhythm Genesis is just a video game, there are certainly things to consider pertaining to social factors. Video games give people from all cultural and social backgrounds a place to connect – whether through the game itself or just by finding common ground over a shared interest – which in my opinion is a very valuable thing. Rhythm Genesis, though not a game that will likely incorporate online multiplayer, will still allow those who are passionate about music and rhythm games to connect with each other through their common interest and connect with their favorite songs and artists through the gameplay.

C. Written by Michelle Bryson

As a stretch goal, our team plans to publish Rhythm Genesis on Steam where users will be able to download and play the game for free. Steam is a widely used game distribution service, so the game will be accessible to a wide range of users globally. Additionally, the game will be designed so that it can be played with only a laptop keyboard. We may consider adding functionality for game controllers, but we will still maintain the full experience with only a keyboard, allowing the game to be as accessible and affordable as possible.

Lucas’ Status Report for 2/15

Made progress on the core game loop. Wrote some scripts controlling note blocks and note block destruction/timing; currently, note blocks are only randomly generated. I’m on track and want to continue progressing on the game by allowing the generated beats to be controlled by a JSON file instead of random generation, incorporating more accurate timing feedback, and improving the UI.