Team Status Report for 2/22

As of right now, we still have:

  • Michelle working on signal processing: turning audio files into JSON files containing each note and some basic information about it
  • Yuhe working on the menus and UI of the game, as well as the in-game beat map editor
  • Lucas working on the core game loop

Similar to last week, our main challenge remains finding a consistent version control method. For now, we plan to keep progressing in our individual areas, each doing some version control on our own sections, and integrate all of our code later. One of the bigger challenges further down the line, then, will be ensuring that our game feels like a seamless and connected experience, which is something we'll definitely have to dedicate some time to.

Lucas’ Status Report for 2/22

This week, I continued working on the main game while taking some additional time to focus on the design review.

I began implementing more core features, like using a JSON file to generate our beat maps instead of placing note blocks randomly, and started some of the work on the scoring system. Next week I aim to finish these two features and hopefully begin on some more front-end details, to make the game feel a bit more like a game for the time being.
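
For illustration, here is a rough sketch of what such a beat map file might contain; the field names are placeholders, not our final schema:

```python
import json

# Hypothetical beat map layout: one entry per note, giving the time
# (seconds into the song) the note should be hit and the lane it falls in.
beat_map = {
    "song": "example.mp3",
    "bpm": 120,
    "notes": [
        {"time": 0.5, "lane": 0},
        {"time": 1.0, "lane": 2},
        {"time": 1.5, "lane": 1},
    ],
}

with open("example_beatmap.json", "w") as f:
    json.dump(beat_map, f, indent=2)
```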

I also made the slides for and presented the design review, which meant outlining our project in a slideshow and doing a bit of presentation preparation. I'll continue this work by writing up some of the design review's written portion this week.

Yuhe’s Status Report for 2/22

This week, I focused on implementing the Upload UI of our game in Unity, which allows users to upload audio files and generate beat maps. I built the file selector to support MP3, WAV, and OGG formats and designed a progress bar to visually indicate the beat map generation process. Additionally, I implemented a text input field for naming the song and a confirmation message to notify users upon successful upload.

One major challenge was handling file compatibility and format validation, ensuring the system properly recognizes and processes different audio formats. Next week, I plan to improve error handling, refine the UI layout, and connect all the related game scenes for a better user experience.
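
The heart of that validation is simple to sketch; the snippet below shows the idea in Python for brevity (the real implementation lives in our Unity C# code, and the function name is made up):

```python
from pathlib import Path

SUPPORTED_FORMATS = {".mp3", ".wav", ".ogg"}

def validate_upload(path_str: str) -> Path:
    """Accept a file only if it exists and has a supported audio extension."""
    path = Path(path_str)
    if path.suffix.lower() not in SUPPORTED_FORMATS:
        raise ValueError(f"Unsupported audio format: {path.suffix or '(none)'}")
    if not path.is_file():
        raise FileNotFoundError(path)
    return path
```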

Michelle’s Status Report for 2/22

This week, I continued working on validation of fixed-tempo audio analysis. The verification method I created last week, playing metronome clicks at the beat timestamps while the song plays, was not ideal: multi-threading introduced timing issues, and removing the threading and starting the song manually introduced human error.

This week, I created an alternate version that uses matplotlib to animate a blinking circle at the beat timestamps while playing the song using multi-threading. The visual alignment will also be more accurate to the gameplay. I used 30 FPS since that is the planned frame rate of the game. Here is a short video of a test as an example: https://youtu.be/54ToPpPSpGs
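
Roughly, the approach looks like this (a simplified sketch: the filename and flash window are placeholders, and sounddevice's non-blocking playback stands in for the threaded playback in the actual script):

```python
import time

import librosa
import matplotlib.pyplot as plt
import numpy as np
import sounddevice as sd
from matplotlib.animation import FuncAnimation

y, sr = librosa.load("test_song.wav")  # placeholder filename
_tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

fig, ax = plt.subplots()
ax.set_xlim(-1, 1)
ax.set_ylim(-1, 1)
ax.axis("off")
circle = plt.Circle((0, 0), 0.5, color="red", visible=False)
ax.add_patch(circle)

FLASH = 0.1  # seconds the circle stays lit after each beat (placeholder)

def update(_frame):
    t = time.perf_counter() - start
    # Light the circle if we are within FLASH seconds after any beat.
    on = np.any((beat_times <= t) & (t < beat_times + FLASH))
    circle.set_visible(bool(on))
    return (circle,)

sd.play(y, sr)  # non-blocking playback starts here
start = time.perf_counter()
ani = FuncAnimation(fig, update, interval=1000 / 30, blit=True,
                    cache_frame_data=False)  # ~30 FPS
plt.show()
```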

When testing on the self-composed audio library, where we know the ground truth of tempo and beat timestamps, faster songs of 120 BPM or greater had a beat alignment error of about 21 ms, just outside our tolerance of 20 ms. When I tested fast songs with the visual animation verification method, the error was barely perceptible to me. Thus, I think fixing this marginal error is not a high priority, and it might be justified to relax the beat alignment error tolerance slightly, at least for the MVP. Further user testing after integration will be needed to confirm this.
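
For reference, the error can be measured along these lines (one plausible metric; the actual test code may differ in its details):

```python
import numpy as np

def beat_alignment_error_ms(detected, truth):
    """Mean absolute gap (ms) from each ground-truth beat to the nearest
    detected beat; both arguments are arrays of timestamps in seconds."""
    detected = np.asarray(detected)
    gaps = [np.min(np.abs(detected - t)) for t in np.asarray(truth)]
    return 1000.0 * float(np.mean(gaps))

# e.g. a 120 BPM test song has ground-truth beats every 0.5 s
truth_times = np.arange(0.0, 30.0, 0.5)
```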

My progress is on track with our schedule. Next week I plan to wrap up fixed-tempo beat analysis and move on to basic intensity analysis, which will be used to determine how many notes should be generated per beat. This is a higher priority than varying-tempo beat analysis. Testing with a wide variety of songs will be needed to fine-tune our algorithm for calculating the number of notes per level for the most satisfying gaming experience.
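
A rough sketch of the intensity idea, using librosa's onset strength envelope (the 1-4 notes-per-beat mapping is a placeholder, not a tuned rule):

```python
import librosa
import numpy as np

y, sr = librosa.load("song.wav")  # placeholder filename
_tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
onset_env = librosa.onset.onset_strength(y=y, sr=sr)

# Average onset strength over each inter-beat interval gives an
# "intensity" score for that stretch of the song.
segments = np.split(onset_env, beat_frames)[1:-1]
intensity = np.array([seg.mean() for seg in segments])

# Placeholder mapping: scale normalized intensity to 1-4 notes per beat.
notes_per_beat = 1 + np.round(3 * intensity / intensity.max()).astype(int)
```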

Team Status Report for 2/15

This week, we began implementing the Rhythm Genesis game in Unity, including the User Interface and the core game loop, and also continued work on tempo and beat tracking analysis, calculating the current beat alignment error.

  1. Yuhe worked on the User Interface Layer of the game, implementing the main menu and song selection.
  2. Lucas focused on making the core game loop, implementing the logic for the falling tiles.
  3. Michelle worked on verification methods for beat alignment error in audio analysis.

One challenge we are currently facing is figuring out the best method of version control. We initially tried using GitHub, but this did not work out since Unity projects are so large. We are now using Unity's built-in Plastic SCM, which is not super easy to use. Another challenge is that songs with faster tempos are showing beat alignment error outside of our acceptance criteria, so we will need to spend more time fine-tuning how we detect beat timestamps, especially for fast songs. As of now there are no schedule changes, as the team is on track with our milestones.

A. Written by Yuhe Ma

Although video games may not directly affect public health or safety, Rhythm Genesis may benefit its users' mental well-being and cognitive health. Rhythm games are known to help improve hand-eye coordination, reaction time, and focus. Our game offers an engaging, music-driven experience that enhances players' dexterity and rhythm skills, which can be useful for both entertainment and rehabilitation. Music itself is known to reduce stress and boost mood, and by letting users upload their own songs, Rhythm Genesis creates a personalized, immersive experience that promotes relaxation and enjoyment. From a welfare standpoint, Rhythm Genesis makes rhythm gaming more accessible by offering a free, customizable alternative to mainstream games that lock users into pre-set tracks or costly DLCs. This lowers the barrier to entry, allowing more people to enjoy rhythm-based gameplay. By supporting user-generated content, our game encourages creativity and community interaction, helping players develop musical skills and express themselves. In this way, Rhythm Genesis is not only a game but also a tool for cognitive engagement, stress relief, and self-expression.

B. Written by Lucas Storm

While at the end of the day Rhythm Genesis is just a video game, there are certainly things to consider pertaining to social factors. Video games provide people across cultural and social backgrounds a place to connect – whether that be via the game itself or just finding common ground thanks to sharing an interest – which in my opinion is a very valuable thing. Rhythm Genesis, though not a game that will likely incorporate online multiplayer, will still allow those who are passionate about music and rhythm games to connect with each other through their common interest and connect with their favorite songs and artists through the gameplay.

C. Written by Michelle Bryson

As a stretch goal, our team plans to publish Rhythm Genesis on Steam where users will be able to download and play the game for free. Steam is a widely used game distribution service, so the game will be accessible to a wide range of users globally. Additionally, the game will be designed so that it can be played with only a laptop keyboard. We may consider adding functionality for game controllers, but we will still maintain the full experience with only a keyboard, allowing the game to be as accessible and affordable as possible.

Lucas’ Status Report for 2/15

Made progress on the core game loop. Wrote some scripts controlling note blocks and note block destruction/timing; currently, note blocks are only randomly generated. I'm on track and want to continue progressing on the game by having the generated beats be controlled by a JSON file instead of random generation, incorporating more accurate timing feedback, and improving the UI.

Yuhe’s Status Report for 2/15

Progress This Week
This week, I worked on the User Interface Layer for our game, focusing on implementing the main menu and song selection UI in Unity. Using Unity’s UI Toolkit and Canvas system, I created:

  1. A scrolling song list UI, allowing users to browse available beat maps. This was implemented using Scroll View with a vertical layout and dynamically instantiated UI elements for song entries.
  2. Main menu buttons, including Upload Music, Beat Map Editor, Settings, and Quit, which were placed inside a UI Canvas with Button components and linked to appropriate scene transitions and logic hooks.

Challenges Faced
One major challenge was version control. Unity’s built-in Plastic SCM requires using Unity Cloud for team collaboration, which made syncing my work with Lucas difficult. Granting privileges and managing team members in Unity Cloud was unnecessarily complex, and we ran into access issues when trying to pull/push changes.

Next Steps
  • Improve UI styling and transitions for a smoother user experience.
  • Work with Lucas to establish a more reliable version control workflow, possibly moving to Git with LFS for large assets.
  • Implement beat map metadata display in the song list.

Michelle’s Status Report for 2/15

This week I worked on building a verification process for determining beat alignment error. I composed some test songs so that, with some math, I know the ground truth of the tempo and beat timestamps for each one. Then, I compared this ground truth with what the beat tracking observed. I focused on fixed-tempo songs. For most tempos, the beat alignment error fell within the acceptance criterion of 20 ms. For tempos of about 120 BPM or greater, the beat alignment error was 21+ milliseconds; further work is needed to fine-tune that.

I also created another method of testing to use for real songs where the exact tempo and beat alignments are unknown. I built a Python program that extracts beat timestamps from an audio file and then plays back the audio while playing a click sound at the observed beat timestamps. I initially used threading to play the song and the metronome clicks at the same time, but this introduced some lagging glitches in the song. I removed the threading and played the generated metronome while separately starting the song myself, attempting to minimize the human error of starting the song at exactly the right time. With this method, the timestamps sounded highly accurate and stayed on beat throughout.
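
For reference, librosa can also render the click track directly, and mixing it into the song offline sidesteps playback timing issues altogether; a minimal sketch (filenames are placeholders):

```python
import librosa
import soundfile as sf

y, sr = librosa.load("song.wav")  # placeholder filename
_tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

# Render a click at each detected beat and mix it into the original audio.
clicks = librosa.clicks(times=beat_times, sr=sr, length=len(y))
sf.write("song_with_clicks.wav", y + clicks, sr)
```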

The audio analysis portion of the project is on schedule. Next week, I want to see if I can find some way to reduce the beat alignment error for songs that are above 120 BPM.

Team Status Report for 2/8

This week, our team focused on getting set up for development, refining the game's high-level architecture design, and making sure we're all comfortable with the tools we'll be using. We also had our recurring Zoom meeting on 2/7 to discuss next steps, including implementing the core game loop and basic UI in Unity, experimenting with note spawning, and, most importantly, preparing our design review slides and presentation!

  1. Yuhe worked on the game architecture, making sure Unity’s UI, game logic, and beat map system all fit together. She also explored Unity’s JSON handling and file I/O to ensure smooth beat map storage and retrieval.
  2. Lucas focused on getting familiar with C# and Unity by working through tutorials and coding simple programs to test key game mechanics. He also researched how to best represent the game’s core data structures.
  3. Michelle explored Python’s librosa library for beat detection, successfully analyzing tempo and beats for simple, fixed-tempo songs (a minimal sketch of the librosa call is below). She also started looking into onset strength as a way to measure intensity, which could influence gameplay elements like note density or animations.
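
A minimal sketch of the beat detection call (the filename is a placeholder):

```python
import librosa

# Load a simple, fixed-tempo test song (placeholder filename).
y, sr = librosa.load("steady_tempo_song.wav")

# Estimate the global tempo (BPM) and beat positions as frame indices,
# then convert the frames to timestamps in seconds.
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)
print(f"Estimated tempo: {float(tempo):.1f} BPM over {len(beat_times)} beats")
```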

One challenge we identified is handling tempo variations in more complex songs, which will require a deeper understanding of audio analysis techniques. For the first month, we'll focus on detecting beats in simple, steady-tempo music. Our next steps include starting game implementation in Unity, testing note spawning based on beat maps, and refining our design choices for the upcoming presentation. Solid progress so far, and we're looking forward to seeing our first playable version take shape next week!

Yuhe’s Status Report for 2/8

This week, I focused on game architecture design and getting things set up for development. I spent a good chunk of time figuring out how to structure the project so everything runs smoothly and making sure Unity’s UI, game logic, and beat map system all fit together.

I went through some Unity 2D game tutorials, mostly on handling input, spawning objects (for the falling notes), and setting up animations. I also read up on how Unity handles JSON files since that’s how we’ll store beat maps. On the C# side, I checked out best practices for handling game logic and timing. I also played around with Unity’s file I/O to make sure we can load and save beat maps without issues.

I also set up a GitHub repo so we can track progress and keep things organized. Next up, I’ll start working on spawning notes based on the beat map data and getting a simple scoring system running. Overall, solid progress this week and I am excited to start developing the game!