Michelle’s Status Report for 3/8

This week, I worked on creating an algorithm for determining the number of notes to generate based on the onset strength of each beat. Onset strength at time t is the mean over frequency bands f of max(0, S[f, t] - ref[f, t - lag]), where S is the log-power Mel spectrogram and ref is S after local max filtering along the frequency axis.
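As a minimal sketch of this computation with Librosa (the file path is a placeholder and max_size is an illustrative choice, not a tuned value):

```python
import librosa

# Load a song (placeholder path) and compute the spectral-flux onset envelope.
# max_size > 1 turns on the local max filtering of S along the frequency axis
# that produces ref in the formula above; 3 is an illustrative choice.
y, sr = librosa.load("song.wav")
onset_env = librosa.onset.onset_strength(y=y, sr=sr, max_size=3)

# Sample the envelope at the tracked beats to get a per-beat intensity value.
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr, onset_envelope=onset_env)
beat_strengths = onset_env[beat_frames]
```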

Since a higher onset strength implies a more intense beat, it can be better represented in the game by a chord. Likewise, a weaker onset strength would generate a rest or a single note. Generally, we want more single notes than anything else, with three-note chords being rarer than two-note chords. These percentile thresholds can easily be adjusted later during user testing to find the best balance.
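One possible shape for that mapping, with hypothetical percentile cutoffs (20/70/95) standing in for the values we will tune during user testing:

```python
import numpy as np
import librosa

# Per-beat onset strengths, as in the snippet above (placeholder path).
y, sr = librosa.load("song.wav")
onset_env = librosa.onset.onset_strength(y=y, sr=sr, max_size=3)
_, beat_frames = librosa.beat.beat_track(y=y, sr=sr, onset_envelope=onset_env)
beat_strengths = onset_env[beat_frames]

# Hypothetical cutoffs, to be tuned in user testing: bottom 20% of beats become
# rests, the next 50% single notes, the next 25% two-note chords, the top 5%
# three-note chords.
rest_cut, single_cut, double_cut = np.percentile(beat_strengths, [20, 70, 95])

def notes_for_beat(strength):
    """Map a beat's onset strength to a note count (0 = rest)."""
    if strength < rest_cut:
        return 0
    if strength < single_cut:
        return 1
    return 2 if strength < double_cut else 3

note_counts = [notes_for_beat(s) for s in beat_strengths]
```

Basing the cutoffs on each song's own percentiles, rather than on absolute thresholds, should keep the rest/note/chord balance roughly consistent between quiet and loud songs.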

My progress is on schedule. Next week, I plan to refactor my Librosa explorations into modular functions that can be easily integrated with the game. I will also be transitioning from audio analysis to working on the UI of the game.

Michelle’s Status Report for 2/22

This week, I continued working on validation of fixed-tempo audio analysis. The verification method I created last week, playing metronome clicks at the beat timestamps alongside the song, was not ideal: multi-threading introduced timing issues, and removing the threading and starting the song manually introduced human error in aligning the start times.

This week, I created an alternate version that uses matplotlib to animate a blinking circle at the beat timestamps while playing the song using multi-threading. This visual verification also more closely matches the actual gameplay. I used 30 FPS since that is the planned frame rate of the game. Here is a short video of a test as an example: https://youtu.be/54ToPpPSpGs
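A rough sketch of this kind of verification animation; sounddevice is an assumed playback library and the file path is a placeholder:

```python
import numpy as np
import librosa
import sounddevice as sd  # assumed playback library
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

y, sr = librosa.load("song.wav")  # placeholder path
_, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

FPS = 30      # matches the planned game frame rate
FLASH = 0.1   # seconds the circle stays lit after each beat

fig, ax = plt.subplots()
circle = plt.Circle((0.5, 0.5), 0.2, color="red", visible=False)
ax.add_patch(circle)
ax.set_aspect("equal")

def update(frame):
    # frame / FPS approximates elapsed time; good enough for a visual check.
    t = frame / FPS
    # Light up the circle if we are within FLASH seconds after any beat.
    circle.set_visible(bool(np.any((beat_times <= t) & (t < beat_times + FLASH))))
    return (circle,)

sd.play(y, sr)  # non-blocking playback, so the animation can start right away
anim = FuncAnimation(fig, update, interval=1000 / FPS, blit=True,
                     cache_frame_data=False)
plt.show()
```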

When I tested beat alignment error on the self-composed audio library, where we know the ground-truth tempo and beat timestamps, faster songs of 120 BPM or greater had an error of about 21 ms, which is just outside our tolerance of 20 ms. When I tested fast songs with the visual animation verification method, the error was not really perceptible to me. Thus, I think fixing this marginal error is not a high priority, and it may be justified to relax the beat alignment error tolerance slightly, at least for the MVP. Further user testing after integration will be needed to confirm this.

My progress is on track with our schedule. Next week, I plan to wrap up fixed-tempo beat analysis and move on to basic intensity analysis, which will be used to determine how many notes should be generated per beat. This is a higher priority than varying-tempo beat analysis. Testing with a wide variety of songs will be needed to fine-tune our algorithm for calculating the number of notes for each level, for the most satisfying gaming experience.

Team Status Report for 2/15

This week, we began implementing the Rhythm Genesis game in Unity, including the User Interface and the core game loop, and also continued work on tempo and beat tracking analysis, calculating the current beat alignment error.

  1. Yuhe worked on the User Interface Layer of the game, implementing the main menu and song selection.
  2. Lucas focused on making the core game loop, implementing the logic for the falling tiles.
  3. Michelle worked on verification methods for beat alignment error in audio analysis.

One challenge we are currently facing is figuring out the best method of version control. We initially tried using GitHub, but this did not work out since Unity projects are so large. We are now using Unity's built-in Plastic SCM, which is not super easy to use. Another challenge is that we are discovering that faster songs are experiencing beat alignment error outside of our acceptance criteria. We will need to spend some more time fine-tuning how we detect beat timestamps, especially for fast songs. As of now there are no schedule changes, as the team is on track with our milestones.

A. Written by Yuhe Ma

Although video games may not directly affect public health or safety, Rhythm Genesis may benefit its users' mental well-being and cognitive health. Rhythm games are known to help improve hand-eye coordination, reaction time, and focus. Our game offers an engaging, music-driven experience that enhances people's dexterity and rhythm skills, which can be useful for both entertainment and rehabilitation. Music itself is known to reduce stress and boost mood, and by letting users upload their own songs, Rhythm Genesis creates a personalized, immersive experience that promotes relaxation and enjoyment. From a welfare standpoint, Rhythm Genesis makes rhythm gaming more accessible by offering a free, customizable alternative to mainstream games that lock users into pre-set tracks or costly DLCs. This lowers the barrier to entry, allowing more people to enjoy rhythm-based gameplay. By supporting user-generated content, our game encourages creativity and community interaction, helping players develop musical skills and express themselves. In this way, Rhythm Genesis is not only a game but also a tool for cognitive engagement, stress relief, and self-expression.

B. Written by Lucas Storm

While at the end of the day Rhythm Genesis is just a video game, there are certainly things to consider pertaining to social factors. Video games provide people across cultural and social backgrounds a place to connect – whether that be via the game itself or just finding common ground thanks to sharing an interest – which in my opinion is a very valuable thing. Rhythm Genesis, though not a game that will likely incorporate online multiplayer, will still allow those who are passionate about music and rhythm games to connect with each other through their common interest and connect with their favorite songs and artists through the gameplay.

C. Written by Michelle Bryson

As a stretch goal, our team plans to publish Rhythm Genesis on Steam where users will be able to download and play the game for free. Steam is a widely used game distribution service, so the game will be accessible to a wide range of users globally. Additionally, the game will be designed so that it can be played with only a laptop keyboard. We may consider adding functionality for game controllers, but we will still maintain the full experience with only a keyboard, allowing the game to be as accessible and affordable as possible.

Michelle’s Status Report for 2/15

This week I worked on building a verification process for determining beat alignment error. I composed some test songs so that I could mathematically derive the ground-truth tempo and beat timestamps. Then, I compared this ground truth with what the beat tracking observed, focusing on fixed-tempo songs. For most tempos, the beat alignment error fell within the acceptance criterion of 20 ms. For tempos of about 120 BPM or greater, the beat alignment error was 21+ milliseconds. Further work is needed to fine-tune this.
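A minimal sketch of this comparison, assuming a fixed-tempo test song whose BPM and start offset are known (the file name and values here are illustrative):

```python
import numpy as np
import librosa

# Ground truth for a self-composed test song: at a known BPM, beat k falls at
# start_offset + k * (60 / bpm) seconds. These values are illustrative.
bpm, start_offset, n_beats = 120, 0.0, 64
truth = start_offset + np.arange(n_beats) * 60.0 / bpm

y, sr = librosa.load("test_song_120bpm.wav")  # hypothetical test file
_, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
detected = librosa.frames_to_time(beat_frames, sr=sr)

# For each ground-truth beat, take the offset to the nearest detected beat.
errors = np.array([np.min(np.abs(detected - t)) for t in truth])
print(f"mean beat alignment error: {errors.mean() * 1000:.1f} ms")
```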

I also created another testing method to use for real songs where the exact tempo and beat alignments are unknown. I built a Python program that extracts beat timestamps from an audio file and then plays back the audio while playing a click sound at the observed beat timestamps. I initially used threading to play the song and the metronome clicks at the same time, but this introduced some lagging glitches in the song. I removed the threading and played the generated metronome on its own while starting the song manually, attempting to minimize the human error in starting the song at exactly the right time. With this method, the timestamps sounded highly accurate and stayed on beat throughout.
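One way to sidestep the synchronization problem entirely would be to render the clicks and the song into a single mixed file. This is just a sketch of that alternative (file paths are placeholders), not necessarily how the test above was run:

```python
import librosa
import soundfile as sf

y, sr = librosa.load("song.wav")  # placeholder path
_, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

# Synthesize a click at each detected beat and mix it over the song; writing
# one combined file removes any need to synchronize two playback streams.
clicks = librosa.clicks(times=beat_times, sr=sr, length=len(y))
sf.write("song_with_clicks.wav", y + clicks, sr)
```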

The audio analysis portion of the project is on schedule. Next week, I want to see if I can find some way to reduce the beat alignment error for songs at 120 BPM or faster.

Michelle’s Status Report for 2/8

This week I focused on getting familiar with the Librosa library and exploring the functions that will be relevant to our project. I was able to demonstrate tempo analysis and beat extraction working for songs with fixed tempo. I also started looking into ways to determine intensity, as that would be useful information for the game, whether it affects the number of notes or triggers fun animations. Onset strength may be a useful metric for intensity. Here are some notes from my exploration so far: https://docs.google.com/document/d/1x1v-kjtmkiGLtpr_eNxQ_WqPQmr7fqHTX-byE4wmWC8/edit?usp=sharing
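In essence, fixed-tempo analysis with Librosa looks something like this (the file path is a placeholder):

```python
import librosa

# Estimate a global tempo and the beat locations for a fixed-tempo song.
y, sr = librosa.load("song.wav")  # placeholder path
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)
print("estimated tempo (BPM):", tempo)
print("first beat times (s):", beat_times[:4])
```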

I also completed a basic tutorial for Unity game engine, as I have never used it before. I created a working Flappy Bird game to get familiar with Unity and C#. This will be useful for when I start working on the game UI later on in the project timeline.

My progress is on schedule. Next week, I plan to further explore onset strength as a metric and figure out how to extract it from songs. I also want to look into how to analyze the tempo and beats of songs with variable tempo, for example most classical music.