Yuhe’s Status Report for 4/19

This week, I tested and stabilized the Beat Map Editor and integrated it with our game’s core system. I improved waveform rendering sync, fixed note input edge cases, and ensured exported beat maps load correctly into the gameplay module. I worked with teammates to align beatmap formats and timestamps. I also stress-tested audio playback, waveform scrolling, and note placement on both Ubuntu and Windows. Due to virtual audio issues on Ubuntu, I migrated the project to Windows, where SFML audio runs reliably using WASAPI. I updated the build system, linked SFML and spdlog, and verified playback stability across formats. The editor now delivers consistent UI latency, note saving/loading, and waveform visualization, meeting our design metrics.

What Did I Learn
To build and debug the Beat Map Editor, I learned C++17 in depth, including class design, memory safety, and STL containers. I switched from Unity to SFML to get better control and performance, and I studied SFML docs and examples to implement UI components, audio playback, waveform rendering with sf::VertexArray, and real-time input handling. I learned to parse sf::SoundBuffer and extract audio samples to draw time-synced waveforms. I also learned to serialize beatmaps using efficient C++ data structures like unordered_set for fast duplicate checking and real-time editing.

I faced many cross-platform dev challenges. Ubuntu’s VM environment caused SFML audio issues due to OpenAL and ALSA incompatibilities. I debugged error logs, searched forums, and simplified my CMake files before deciding to migrate to Windows. There, I set up a native toolchain using Clang++ and statically linked dependencies with vcpkg and built-in CMake modules. Audio playback became reliable under Windows using WASAPI.

Version control is key to our project, and I picked up Git branching, rebasing, and conflict resolution to collaborate smoothly with teammates. I integrated my code with others’ systems, learning how to standardize file formats and maintain compatibility. I utilized YouTube tutorials, GitHub issues, and online forums to solve low-level bugs, and these informal resources were often more helpful than the official docs. I also learned many Linux and Windows commands, C++ syntax details, SFML use cases, Git tricks, and methods for resolving dependency issues from GPT. Overall, I learned engine programming, dependency management, debugging tools, and teamwork throughout this semester.

Yuhe’s Status Report for 4/12

Over the past two weeks, I have made significant progress in both the development and stabilization of the Beat Map Editor, one of the core components of our rhythm game. My primary efforts were focused on implementing interactive editor features, resolving persistent audio playback issues encountered in the Ubuntu virtual environment, and initiating the migration of the project to a native Windows environment to ensure audio stability. This report details the technical progress, debugging process, cross-platform adaptations, and testing strategies aimed at meeting the engineering design specifications and use-case requirements of our project.

Beat Map Editor Development

The Beat Map Editor serves as the interactive interface for creating and modifying rhythm game charts. I designed and implemented this module with modularity and real-time user interaction in mind. The editor is built around a central BeatMapEditor class that integrates a collection of UI components such as buttons (UIButton), text input boxes (UITextBox), and a waveform visualization module (WaveformViewer).
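
A rough structural sketch of this composition (member names are illustrative and bodies are elided; the real classes carry more state):

#include <SFML/Graphics.hpp>

class UIButton { /* ... */ };
class UITextBox { /* ... */ };
class WaveformViewer { /* ... */ };
class UIManager { /* ... */ };

// Central editor class: owns the UI widgets and routes work to them.
class BeatMapEditor {
public:
    void handleEvent(const sf::Event& event); // forward input to uiManager
    void update(float dt);                    // advance playback / playhead
    void render(sf::RenderWindow& window);    // draw waveform, then widgets

private:
    UIButton        playPauseButton;
    UITextBox       timestampBox;
    UITextBox       laneIndexBox;
    WaveformViewer  waveformViewer;
    UIManager       uiManager;
};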

The WaveformViewer class is responsible for rendering a real-time visual representation of the audio waveform using sf::Vertex primitives. It allows users to scroll and zoom through audio data within a fixed time window (defaulted to 10 seconds), providing an intuitive interface for note placement. I implemented the waveform extraction by sampling peaks from the left audio channel stored in an sf::SoundBuffer.
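
A minimal sketch of that extraction step (function name and scaling are illustrative, not the exact project code): bucket the left channel into one min/max pair per pixel column and emit one vertical line per column.

#include <SFML/Audio.hpp>
#include <SFML/Graphics.hpp>
#include <algorithm>
#include <cstddef>

// Build a Lines vertex array with one vertical min/max line per pixel column.
sf::VertexArray buildWaveform(const sf::SoundBuffer& buffer,
                              unsigned width, float height)
{
    const sf::Int16* samples = buffer.getSamples();
    const unsigned channels = buffer.getChannelCount();
    const std::size_t frames =
        static_cast<std::size_t>(buffer.getSampleCount()) / channels;

    sf::VertexArray lines(sf::Lines);
    for (unsigned x = 0; x < width; ++x) {
        // Range of audio frames covered by this pixel column.
        const std::size_t begin = frames * x / width;
        const std::size_t end   = frames * (x + 1) / width;
        sf::Int16 lo = 0, hi = 0;
        for (std::size_t f = begin; f < end; ++f) {
            const sf::Int16 s = samples[f * channels]; // left channel only
            lo = std::min(lo, s);
            hi = std::max(hi, s);
        }
        const float mid = height / 2.f;
        lines.append(sf::Vertex(sf::Vector2f(float(x), mid - hi * mid / 32768.f)));
        lines.append(sf::Vertex(sf::Vector2f(float(x), mid - lo * mid / 32768.f)));
    }
    return lines;
}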

To facilitate note creation, I implemented two synchronized input fields: a timestamp box and a lane index box. When both fields are populated, the editor stores note data into a std::unordered_set of timestamps and serializes them into a beatmap vector. This mechanism ensures constant-time lookups to prevent duplicate entries and simplifies beatmap export functionality.
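
A minimal sketch of this mechanism (the Note layout and class name are illustrative):

#include <unordered_set>
#include <vector>

struct Note {
    float timestamp; // seconds from song start
    int   lane;      // input lane index
};

class BeatMap {
public:
    // Rejects a note whose timestamp is already taken; O(1) on average.
    bool addNote(float timestamp, int lane) {
        if (!timestamps.insert(timestamp).second)
            return false;                   // duplicate timestamp, nothing stored
        notes.push_back({timestamp, lane}); // export order for serialization
        return true;
    }

private:
    std::unordered_set<float> timestamps; // constant-time duplicate check
    std::vector<Note> notes;              // the serialized beatmap vector
};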

I also integrated a minimal but effective UI manager class (UIManager) to streamline event routing and rendering. This modular structure lays the foundation for extensibility and paves the way for implementing future features such as drag-and-drop note placement or real-time preview playback.
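
A minimal sketch of that routing pattern (the UIComponent base class and method names are illustrative, not the project’s exact interface):

#include <SFML/Graphics.hpp>
#include <memory>
#include <vector>

class UIComponent {
public:
    virtual ~UIComponent() = default;
    virtual void handleEvent(const sf::Event& event) = 0;
    virtual void draw(sf::RenderWindow& window) = 0;
};

class UIManager {
public:
    void add(std::unique_ptr<UIComponent> c) { components.push_back(std::move(c)); }

    // Route one SFML event to every registered component.
    void handleEvent(const sf::Event& event) {
        for (auto& c : components) c->handleEvent(event);
    }

    // Draw all components in registration order.
    void draw(sf::RenderWindow& window) {
        for (auto& c : components) c->draw(window);
    }

private:
    std::vector<std::unique_ptr<UIComponent>> components;
};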

Audio Subsystem Debugging (Ubuntu)

During implementation and testing, I encountered recurring failures in the audio subsystem when running the editor in an Ubuntu 22.04 virtual environment. These issues included the inability to initialize or play audio using SFML’s sf::Sound, sporadic freezing during playback, and cryptic runtime errors from the underlying OpenAL backend such as:

AL lib: (EE) alc_open_device: No such audio device
AL lib: (EE) ALCplaybackAlsa_open: Could not open playback device

These errors appeared to stem from the mismatch between the VM’s virtualized audio hardware and OpenAL Soft’s hardware requirements. Even with correct installation and runtime dependencies, the virtual sound card often failed to interface properly with SFML’s audio engine, especially after seek operations or audio buffer reloads.

To isolate the issue, I simplified the CMake configuration to exclude SFML’s audio components initially, later reintroducing them with explicit linkage (the executable target name below is illustrative):

find_package(SFML 2.5 COMPONENTS graphics window system audio REQUIRED)
target_link_libraries(editor PRIVATE sfml-graphics sfml-window sfml-system sfml-audio)

Despite these adjustments, the playback remained inconsistent. I conducted multiple tests with .wav, .ogg, and .mp3 files and monitored buffer behavior. Logging showed that buffer assignment and sound object lifetimes were valid, but device-level playback would intermittently fail, especially during real-time seeking.

Windows Migration for Audio Stability

Given the limitations of virtualized audio under Ubuntu, I initiated a platform migration to a native Windows environment. On Windows, OpenAL Soft drives the physical sound card through its WASAPI backend, so SFML’s audio subsystem avoids the virtualized ALSA device problems encountered in the VM altogether.

To support this transition, I rebuilt the project’s CMake toolchain using Clang++ and reconfigured dependencies. The following Windows-based setup was finalized:

  • SFML 2.5 (precompiled binaries)

  • spdlog (via vcpkg)

  • tinyfiledialogs (compiled as a static library)

  • Python3 and pybind11 (for future audio analysis integrations)

On Windows, all audio files were loaded and played back reliably. The waveform viewer remained in sync, and no audio crashes or freezes were observed during extended test runs. This confirmed that migrating to Windows would provide the stable execution environment needed for reliable audio interactions during gameplay and editing.

Testing Strategy and Metrics (Verification of My Subsystem)

To ensure the editor meets both engineering design specifications and functional use-case goals, I have implemented and planned a suite of tests that span user interaction, timing accuracy, scoring behavior, and system stability.

Current Testing Status

I manually verified UI responsiveness by logging event triggers and action completions, confirming a latency under 50ms for input actions such as play/pause toggles, seek operations, and waveform interactions. The beat map editor’s file saving and loading functionality was profiled using a stopwatch and consistently completed under 5 seconds for standard-length songs (up to 7 minutes).
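
A minimal sketch of how these event-to-completion timings can be logged with spdlog, which the build already links (togglePlayback() and the other names are illustrative):

#include <SFML/System/Clock.hpp>
#include <spdlog/spdlog.h>

void togglePlayback() { /* play/pause the sf::Sound under test */ }

// Log the elapsed time between receiving an input event and finishing its action.
void onPlayPausePressed()
{
    sf::Clock latency;
    togglePlayback();
    spdlog::info("play/pause handled in {} ms",
                 latency.getElapsedTime().asMilliseconds());
}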

Waveform rendering and scrolling behavior was tested with songs across different sample rates and durations. The system maintained accurate synchronization between the progress bar and the waveform display. This also verified correct use of SFML’s sample indexing and buffer durations.

While most manual editor features passed their respective design thresholds, a subset of functionality—such as input latency and beat map accuracy—will require automated benchmarking to validate against exact numerical thresholds.

Planned Testing and Metrics (Verification of the Entire Game)

In the few weeks before the final demo, we will expand testing as follows:

  1. Beat Map Accuracy
    Using annotated reference beat maps and a Python evaluation script, I will calculate the average temporal deviation between auto-generated and human-placed notes. A root mean squared error below 20ms will be required to meet alignment standards for tempos between 50 and 220 BPM.

  2. Gameplay Input Latency
    I will record key press events and game state transitions using an SDL2-based input logger or high-frame-rate video capture. These measurements will quantify the time from physical input to in-game response, with a target latency below 20ms.

  3. Persistent Storage Validation
    Save and reload operations for beat maps will be tested using a hash-diff script that validates structural integrity and data completeness. Changes to the note set must persist across game sessions with no corruption.

  4. Frame Rate Stability
    Using SFML’s internal sf::Clock, I will measure delta time across multiple frames during stress test scenarios (waveform scrolling + playback + note placement). The editor must maintain at least 30 frames per second without frame drops; a minimal measurement sketch follows this list.

  5. Error Handling
    Simulated tests with malformed audio files and corrupted beat map data will validate error-handling routines. The editor must exit gracefully or display user-friendly error dialogs instead of crashing.
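
For item 4, here is a minimal delta-time probe using sf::Clock (loop structure and names are illustrative; the real stress scenario would update and render the editor inside the loop):

#include <SFML/System/Clock.hpp>
#include <algorithm>
#include <cstdio>

int main()
{
    sf::Clock frameClock;
    float worstDt = 0.f;
    for (int frame = 0; frame < 1000; ++frame) {
        // ... update + render the editor here (stress scenario) ...
        const float dt = frameClock.restart().asSeconds();
        worstDt = std::max(worstDt, dt);
    }
    // 30 FPS target -> no single frame may exceed ~33.3 ms.
    std::printf("worst frame: %.2f ms (%.1f FPS)\n",
                worstDt * 1000.f, 1.f / worstDt);
    return 0;
}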

Team Status Report for 3/29

This week, we made progress toward integrating the audio processing system with the game engine. We implemented a custom JSON parser in C++ to load beatmap data generated by our signal processing pipeline, enabling songs to be played as fully interactive game levels. We also added a result splash screen at the end of gameplay, which currently displays basic performance metrics and will later include more detailed graphs and stats.

In the editor, we refined key systems for waveform visualization and synchronization. We improved timeToPixel() and pixelToTime() mappings to ensure the playhead and note grid align accurately with audio playback. We also advanced snapping logic to quantize note timestamps based on BPM and introduced an alternative input method for placing notes at specific times and lanes, in addition to drag-and-drop.

On the signal processing side, we expanded rhythm extraction testing to voice and bowed instruments and addressed tempo estimation inconsistencies by using a fixed minimum note length of 0.1s. We also updated the JSON output to include lane mapping information for smoother integration.

Next week, we plan to connect the main menu to gameplay, polish the UI, and fully integrate all system components.

Yuhe’s Status Report for 3/29

This week, I focused on implementing and debugging the waveform viewer and its synchronization with the audio playback system. I refined the timeToPixel(float timestamp) and pixelToTime(int x) functions to ensure accurate coordinate transformation between time and screen space, allowing the playhead and notes to sync with real-time audio. I also began implementing the snapping grid logic by quantizing timestamps to the nearest beat subdivision (e.g., 1/4, 1/8) based on BPM and audio offset. For note placement, I encountered difficulties with SFML’s event system when implementing smooth drag-and-drop behavior. As an alternative, I developed an input-based method that allows users to specify the number of notes, their lane indices, and timestamps via a dialog box, which then injects the notes into the grid. These changes aim to improve precision and editor usability. However, it is still challenging to keep the waveform and playhead aligned across zoom levels.
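
For reference, a minimal sketch of these two mappings in free-function form (viewStartTime and pixelsPerSecond are illustrative names for the visible window’s left edge and the zoom level):

struct ViewState {
    float viewStartTime;   // seconds at the left edge of the visible window
    float pixelsPerSecond; // zoom level
};

float timeToPixel(const ViewState& v, float timestamp) {
    return (timestamp - v.viewStartTime) * v.pixelsPerSecond;
}

float pixelToTime(const ViewState& v, int x) {
    return v.viewStartTime + static_cast<float>(x) / v.pixelsPerSecond;
}

A useful regression check is that pixelToTime(timeToPixel(t)) returns approximately t at every zoom level, which is exactly where the alignment problems appear.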

Yuhe’s Status Report for 3/22

This week, I worked on designing and implementing key features of the Beatmap Editor for our game. In the first half of the week, I worked on the layout system, structuring the editor interface to accommodate a waveform display area, time axis, note grid overlay, and playback controls. I integrated JSON-based waveform data preprocessed externally to visualize amplitude over time using SFML’s VertexArray class. In the latter half, I worked on the note grid system that allows users to place notes along multiple lanes. Each lane corresponds to a specific input key, and note placement is quantized based on timestamp intervals to ensure rhythm accuracy. A snapping mechanism aligns notes to the closest beat subdivision, improving usability.
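
A minimal sketch of that snapping step (the function name and explicit offset parameter are illustrative):

#include <cmath>

// Quantize a raw timestamp to the nearest beat subdivision.
// subdivision = 4 snaps to quarter-beats, 8 to eighth-beats, etc.
float snapToGrid(float timestamp, float bpm, float firstBeatOffset, int subdivision)
{
    const float step = 60.f / bpm / subdivision; // seconds per grid line
    const float n = std::round((timestamp - firstBeatOffset) / step);
    return firstBeatOffset + n * step;
}

At 120 BPM with quarter-beat snapping, for example, the grid step is 60 / 120 / 4 = 0.125 s.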

I think the major challenge is synchronizing the visual playhead with the music playback and managing coordinate transforms between waveform time and screen space. My goals for next week include implementing audio playback controls and note data serialization.

Yuhe’s Status Report for 3/15

This week, I finished implementing the game engine and UI components for Rhythm Genesis. I improved our existing UI elements—including buttons, textboxes, sliders, labels, and the UI manager—by refining their appearance and interactivity. In addition, I implemented advanced UI components such as a scrollable list and an upload UI to handle audio file uploads.

I also developed a custom C++ to Python bridge, which allows us to call Python module functions directly from our C++ game engine. This integration enabled me to implement a basic parse_music_file function in Python using the Librosa library. Additionally, I created a function that plays both the original song and a beep sound generated based on the extracted beat timestamps, which is useful for testing our beat detection.
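
A minimal sketch of such a bridge using pybind11’s embedded interpreter (the module name audio_analysis and the return type are assumptions; parse_music_file is the function described above):

#include <pybind11/embed.h>
#include <pybind11/stl.h>
#include <cstdio>
#include <vector>

namespace py = pybind11;

int main()
{
    py::scoped_interpreter guard{}; // start an embedded Python interpreter
    py::module_ analysis = py::module_::import("audio_analysis"); // illustrative module name
    py::object result = analysis.attr("parse_music_file")("song.wav");
    // Assumed: the function returns a list of beat timestamps in seconds.
    const auto beats = result.cast<std::vector<double>>();
    std::printf("detected %zu beats\n", beats.size());
    return 0;
}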

On the UI side, I completed the menu scene and upload scene, and I set up placeholders for the settings and beat map editor UI. These components provide a solid foundation for our user interface and will be further integrated as the project progresses.

On Friday, I conducted our weekly meeting with the team as usual. I demonstrated the current state of the game, showing the UI scenes and explaining my implementation of the game engine. I also helped team members set up their development environments by ensuring all necessary dependencies were installed. During the meeting, I discussed with Lucas how he might leverage our existing UI components—or use the SFML sprite system—to implement the game scenes and scoring system. I also took the opportunity to test Michelle’s audio processing algorithms within the game engine.

Overall, it was a productive week with substantial progress in both UI development and audio processing integration, setting us on a good path toward our project goals. For the next few weeks I will be working on implementing the beatmap editor for our game.

Team Report for 3/8

During the week leading up to spring break, our team worked on both the documentation and development of Rhythm Genesis. We completed the design review report collaboratively. On the development side, we transitioned from Unity to our own custom C++ engine with SFML, optimizing performance for low-power systems. Yuhe implemented sprite rendering, UI elements, and settings management, ensuring smooth interaction and persistent data storage. Lucas worked on refining the core game loop, adding a scoring system, and transitioning from randomly generated notes to a JSON-based beat map system. Michelle worked on creating an algorithm for determining the number of notes to be generated based on the onset strength of the beat.

Next steps include expanding gameplay mechanics, integrating Python’s librosa for beat map generation, and improving UI design. The team will also focus on fully integrating Lucas’ contributions into the new engine to ensure a seamless transition.

Part A: Global Factors (Michelle)

Since our rhythm game does not analyze the lyrics of any songs and just analyzes beats, onset strengths, and pitches, any song in any language will work with our game. The only translation needed to globalize our game is the menu text. Additionally, we plan to publish the game for free on Steam, which is available in 237 countries, so our game will be widely available internationally.

Part B: Cultural Factors (Lucas)

While our game is, at the end of the day, just a game, it will address a couple of cultural factors – namely tradition (specifically music) and language. Our game will allow users to play a game that caters to their specific heritage as it allows them to use the music that they like. Instead of typical rhythm games that cover a small subset of music and therefore likely only represent a limited number of cultural backgrounds, our game will allow users to upload whatever music they’d like, which means that users from any background can enjoy the music that they feel represents them while playing the game.

Part C: Environmental Factors (Yuhe)
Good news—our game doesn’t spew carbon emissions or chop down trees! Rhythm Genesis is just code, happily living on a user’s device, consuming only as much electricity as their computer allows. Unlike some bloated game engines that demand high-end GPUs and turn one’s laptop into a space heater, our lightweight C++ engine keeps things simple and efficient. Plus, since a player can play with their keyboard instead of buying plastic peripherals, we’re technically saving the planet… So, while our game can’t plant trees, it is at least not making things worse. Play guilt-free—unless you miss all the notes, then that’s on you 🙂

Yuhe’s Status Report for 3/8

During the week before spring break, I worked on both the documentation and development of our rhythm game. I wrote the System Implementation, Testing, and Summary sections of our Design Review Report, detailing the architecture, subsystems, and testing methodologies.

Additionally, I focused on implementing our custom game engine in C++ using SFML, replacing Unity due to its high computational demands and the unnecessary packages and features we would not use for our 2D game. I implemented a sprite rendering system and core UI elements, including the settings menu, keybinding input, and volume control, ensuring smooth interaction and persistent data storage. The next steps involve expanding gameplay mechanics, refining beat map processing, improving UI usability, and, most importantly, integrating our game engine with Python’s Librosa audio processing library.
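
As a minimal sketch of the persistence idea (the file format and field names are assumptions, not the engine’s actual code):

#include <fstream>
#include <string>

struct Settings {
    float musicVolume = 100.f;     // 0-100
    std::string keybinds = "DFJK"; // one key per lane (illustrative default)

    bool save(const std::string& path) const {
        std::ofstream out(path);
        if (!out) return false;
        out << musicVolume << '\n' << keybinds << '\n';
        return static_cast<bool>(out);
    }

    bool load(const std::string& path) {
        std::ifstream in(path);
        return static_cast<bool>(in >> musicVolume >> keybinds);
    }
};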

Yuhe’s Status Report for 2/22

This week, I focused on implementing the Upload UI of our game in Unity, which allows users to upload audio files and generate beat maps. I built the file selector to support MP3, WAV, and OGG formats and designed a progress bar to visually indicate the beat map generation process. Additionally, I implemented a text input field for naming the song and a confirmation message to notify users upon successful upload.

One major challenge was handling file compatibility and format validation, ensuring the system properly recognized and processed different audio formats. Next week, I plan to improve error handling, refine the UI layout, and connect all the related game scenes for a better user experience.

Yuhe’s Status Report for 2/15

Progress This Week
This week, I worked on the User Interface Layer for our game, focusing on implementing the main menu and song selection UI in Unity. Using Unity’s UI Toolkit and Canvas system, I created:

  1. A scrolling song list UI, allowing users to browse available beat maps. This was implemented using Scroll View with a vertical layout and dynamically instantiated UI elements for song entries.
  2. Main menu buttons, including Upload Music, Beat Map Editor, Settings, and Quit, which were placed inside a UI Canvas with Button components and linked to appropriate scene transitions and logic hooks.

Challenges Faced
One major challenge was version control. Unity’s built-in Plastic SCM requires using Unity Cloud for team collaboration, which made syncing my work with Lucas difficult. Granting privileges and managing team members in Unity Cloud was unnecessarily complex, and we ran into access issues when trying to pull/push changes.

Next Steps
Improve UI styling and transitions for a smoother user experience.
Work with Lucas to establish a more reliable version control workflow, possibly moving to Git with LFS for large assets.
Implement beat map metadata display in the song list.