This week, I focused on implementing and debugging the waveform viewer and its synchronization with the audio playback system. I refined the timeToPixel(float timestamp) and pixelToTime(int x) functions so that the transformation between time and screen coordinates is accurate, keeping the playhead and placed notes in sync with real-time audio playback. I also began implementing the snapping-grid logic, which quantizes timestamps to the nearest beat subdivision (e.g., 1/4 or 1/8 note) based on the track's BPM and audio offset.

For note placement, I ran into difficulties with SFML's event system when implementing smooth drag-and-drop behavior. As an interim alternative, I built an input-based method: users specify the number of notes, their lane indices, and timestamps in a dialog box, and the editor injects those notes into the grid. These changes are intended to improve placement precision and overall editor usability. The main open problem remains keeping the waveform and playhead aligned across different zoom levels.