Team Status for 4/30

This week, the team focused largely on the presentation and getting the testing done in time. From previous playtesting, we knew there were a few playability issues we needed to solve, and we addressed these in our presentation as well. However, some of the feedback we received was that the game needed more visual feedback, the beat maps were not well synced with the audio files, and the drums were not recognizing hits.

We have been invited to present at TechSpark, which means we need to make all of our aesthetic improvements by Thursday. With finals coming up, one of our challenges will be balancing that workload with the final touches on the project, along with preparing the presentation, the poster, the video, and the demos.

Overall, this next week will be dedicated to finishing the project (the individual tasks are listed in our individual posts) and completing the course documentation by Thursday or Friday.

Shreya’s Update for 4/30

This week, I focused mostly on testing the beat map generator. I had two main goals for the beat map generator at the beginning of this project. The first was for the generator to take less than the length of the audio file to analyze the audio and output a beat map. To test that, I made about 50 maps and recorded how long each one took to generate. I put the resulting graph on our team’s slides, and that goal looks like it is being met.
The second goal, synchronization, has been trickier. It’s fairly simple to bring the number of beats down to make the game playable for our audience (one piece of feedback we received), but I’m still not entirely sure what is happening with the synchronization. One idea I have is to only keep the most apparent beats instead of all of them, and that is something I will be testing (a rough sketch of the idea is below). Unfortunately, I have a few papers and exams due, so I will probably dedicate Monday to working on this problem. This all needs to be done by Thursday for the TechSpark demo. I also have a final Thursday evening, so this will likely be done between Monday and Thursday.
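To make the idea concrete, here is a minimal sketch of what I mean by keeping only the most apparent beats, assuming the generator already produces a list of beat timestamps and the strength (energy) it measured at each one. The function and variable names here are illustrative, not the actual module.

```python
import numpy as np

def keep_most_apparent_beats(beat_times, beat_strengths, percentile=75):
    """Keep only beats whose measured strength is above the given percentile,
    discarding weaker detections that may be contributing to the desync."""
    beat_times = np.asarray(beat_times)
    beat_strengths = np.asarray(beat_strengths)
    threshold = np.percentile(beat_strengths, percentile)
    return beat_times[beat_strengths >= threshold]

# Example: with these (time, strength) pairs, only the strongest
# detections survive the 75th-percentile cutoff.
times = [0.50, 0.98, 1.52, 2.01, 2.49]
strengths = [0.9, 0.3, 0.8, 0.2, 0.95]
print(keep_most_apparent_beats(times, strengths))
```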

Shreya’s Update for 4/23

This week, I focused primarily on testing for next week’s presentation and report. I am currently running a script for the timing portion of my evaluation, which measures the average time my model takes to run on audio files of various lengths (one of my user requirements).
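The timing script itself is conceptually simple. A minimal sketch, assuming a hypothetical generate_beatmap(path) entry point and a folder of test wav files (both names are placeholders, not the real code):

```python
import time
from pathlib import Path

from beatmap import generate_beatmap  # hypothetical entry point for the generator

def time_generator(audio_dir):
    """Run the generator on every .wav file in a directory and record how
    long each run took, for comparison against the audio length."""
    results = {}
    for wav in Path(audio_dir).glob("*.wav"):
        start = time.perf_counter()
        generate_beatmap(str(wav))
        results[wav.name] = time.perf_counter() - start
    return results

if __name__ == "__main__":
    for name, seconds in time_generator("test_audio").items():
        print(f"{name}: {seconds:.1f} s")
```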

The second user requirement is harder to evaluate. One metric I am considering is a comparison to a commonly known beat tracker, without accounting for false positives or negatives, because my approach includes melody tracking and the commonly used package does not. I am also considering submitting graphs of the locations in an audio file where the algorithm would place a beat, to see whether we can draw some comparisons from those. I already have the wav files created, so generating the intensity analysis graphs should be fairly fast.
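A rough sketch of what I have in mind for those graphs, assuming I already have a wav file and a list of candidate beat times (the file names and window size are placeholders):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile

def plot_intensity_with_beats(wav_path, beat_times, window=1024):
    """Plot short-window signal energy over time and overlay the
    timestamps where the algorithm would place a beat."""
    sr, data = wavfile.read(wav_path)
    if data.ndim > 1:                        # mix stereo down to mono
        data = data.mean(axis=1)
    data = data.astype(np.float64)
    n_windows = len(data) // window
    energy = (data[: n_windows * window].reshape(n_windows, window) ** 2).mean(axis=1)
    t = np.arange(n_windows) * window / sr

    plt.plot(t, energy, label="windowed energy")
    for b in beat_times:
        plt.axvline(b, color="r", alpha=0.4)
    plt.xlabel("time (s)")
    plt.ylabel("energy")
    plt.legend()
    plt.show()
```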

Lastly, I think we can incorporate the user feedback and the results we received from play testing.

I do currently have a bug in my code that suggests it is outputting the wrong timestamps. I am working on fixing it, but I am unsure whether I can get to it this week. I may have to look into it more next week and focus on the testing analysis this weekend.

Shreya’s Update 4/16

Unfortunately, this week, I was pretty sick so I was absent from class.

However, I was able to make significant progress on the accuracy of the beat map creator. I changed the energy calculation by making the window smaller than before: I was calculating instantaneous energy in bins of 43 (as recommended by the source I was using), but I switched to bins of length one instead. This also ended up being significantly faster. Currently, I am analyzing two frequency bands and splitting them across four buttons. I analyze the frequency bands separately, and for a three-and-a-half-minute audio file it takes about 112 seconds per frequency band. While that is slightly above my user requirement, tomorrow I plan to condense the algorithm into one loop, so it should hopefully take about 200 seconds total for a single audio file.
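For context, here is a rough sketch of the kind of energy comparison I am describing: per-sample (bin length one) energy compared against the average energy of the preceding window. The one-second history and the scalar value of 1.4 here are illustrative assumptions, not the tuned values.

```python
import numpy as np

def detect_beats(band_signal, sr, history_seconds=1.0, c=1.4):
    """Return timestamps where the per-sample energy exceeds c times the
    average energy of the preceding history window."""
    energy = band_signal.astype(np.float64) ** 2          # bin length one: per-sample energy
    history = int(history_seconds * sr)

    # Rolling average of the previous `history` samples via a cumulative sum.
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    local_avg = (csum[history:-1] - csum[:-history - 1]) / history
    instantaneous = energy[history:]

    hits = np.nonzero(instantaneous > c * local_avg)[0] + history
    return hits / sr                                       # timestamps in seconds
```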

Also, as I mentioned last week (George’s idea), I added a sound to indicate when my tracker finds a beat. I have attached a few sound samples below. I have also been creating a few additional beat maps for the testing phase of our game, which we will be going through in a few weeks.
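For reference, the sound overlay is conceptually something like the sketch below: a short sine "click" mixed into the original audio at each detected beat so the map can be checked by ear. The click shape, amplitude, and file names are placeholders.

```python
import numpy as np
from scipy.io import wavfile

def add_clicks(wav_in, wav_out, beat_times, click_freq=1000.0, click_len=0.03):
    """Overlay a short sine click at each detected beat timestamp and
    write the result as a new wav file for listening tests."""
    sr, data = wavfile.read(wav_in)
    if data.ndim > 1:
        data = data.mean(axis=1)               # mix to mono
    data = data.astype(np.float64)

    n = int(click_len * sr)
    click = 0.5 * np.max(np.abs(data)) * np.sin(
        2 * np.pi * click_freq * np.arange(n) / sr)

    for t in beat_times:
        start = int(t * sr)
        end = min(start + n, len(data))
        data[start:end] += click[: end - start]

    # Normalize and write as float wav to avoid clipping.
    wavfile.write(wav_out, sr, (data / np.max(np.abs(data))).astype(np.float32))
```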

For next week, I need to create many more files for testing as well as for verifying that my user requirements are met (or where they fall short). Furthermore, I need to create a main function to call everything instead of invoking each module by hand. This should only take a few hours because the code is pretty modular. Hopefully, I can have all of this done by Wednesday.

For scheduling, I should be on track. Everything is pretty much done, but I am working on some final touches.

NOTE: I’m having trouble uploading the files to the website because it says the maximum size of the files has been exceeded. I can show them later.

Team Update 4/10

This week, we mainly focused on the interim demo. In preparing for the demo, we noticed that our Gantt chart needed to be updated. We have attached the new version here. It adds more specificity to the tasks we are working on and accounts for some of the delays we have had in reaching our goals.

One setback we had the morning of the demo was the integration between the game code itself and the drum module. Because this error first appeared on the morning of the demonstration, we are still debugging it and working to ensure it does not happen again. Furthermore, because the sound library we wanted to use for the game is difficult to move between Windows and Mac, we are picking and implementing a new package for it. Finally, on the beat map creation side, the generated beats roughly follow the song, but more testing needs to occur. This is explained further in the individual post.

A large portion of our work is on aesthetic changes. The gameplay code (while mostly playable) had to be updated to create a more engaging user experience; the specific changes can be found in the individual posts. Furthermore, the drum apparatus will now have lights to further improve the user experience, so this is also being worked on.
Schedule-wise, we are on track to follow the new Gantt chart. We do not foresee any reason why we won’t be able to follow this schedule, so we should have a playable game at the end of the semester!

Shreya’s Update 4/10

This week, I was mostly focused on the interim demo. Last week, I had the idea of using the STFT to separate the audio file into frequency bins and then running the intensity analysis on each bin. I thought this would be more efficient because the number of bins can be varied. However, I was having some trouble implementing it.

I spoke to Professor Sullivan regarding the approach, and he mentioned that it would be better to use bandpass filters and then run an intensity analysis on that. I ended up implementing that with scipy and then running the intensity analysis on the filtered wav files.
On Wednesday, I had an odd bug where there would be no output for long stretches of the audio file. I was able to fix it by scaling the array before analyzing it, so that issue is resolved.
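Roughly, the filtering and scaling step looks like the sketch below. This is the scipy approach Professor Sullivan suggested as I understand it; the filter order and band edges shown are placeholder values, not the tuned ones.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfiltfilt

def bandpass_and_scale(wav_path, low_hz, high_hz, order=4):
    """Bandpass-filter a wav file to one frequency band and scale the
    result to [-1, 1] before running the intensity analysis on it."""
    sr, data = wavfile.read(wav_path)
    if data.ndim > 1:
        data = data.mean(axis=1)               # mix to mono
    sos = butter(order, [low_hz, high_hz], btype="bandpass", fs=sr, output="sos")
    filtered = sosfiltfilt(sos, data.astype(np.float64))
    filtered /= np.max(np.abs(filtered))       # the scaling fix mentioned above
    return sr, filtered

# e.g. a low band for kick/bass (band edges are illustrative):
sr, low_band = bandpass_and_scale("song.wav", 60, 250)
```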

Timing-wise, while the algorithm takes about six minutes to analyze a three-minute audio file, I should be able to get this down to three minutes (as per the user requirements) by streamlining the code. I will not be doing this this week, because it will likely be a finishing touch after tuning and testing the algorithm.

Currently, the beat map seems to have an offset relative to the song. George suggested adding a sound to the audio file whenever my algorithm detects a beat, so I should have that done by Wednesday so I can better understand whether there is an offset, or what exactly is happening.
I am fine on schedule. We have had to change the Gantt chart to reflect our scheduling changes; the updated chart will be in this week’s team update.


Shreya’s Update 4/2/2022

This week, I was working on getting the gameplay code installed on my computer. That worked around Wednesday, so I was able to begin visualizing the beat maps.
For the second half of the week, I started looking at melody tracking algorithms. I figured out how to map the frequency bands to the timing with an STFT (graph shown below).
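For reference, a minimal sketch of the STFT step that produces that time–frequency view (the file name and window length are placeholders):

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import stft

sr, data = wavfile.read("song.wav")            # placeholder file name
if data.ndim > 1:
    data = data.mean(axis=1)                   # mix to mono

# f: frequency bins (Hz), t: frame times (s), Zxx: complex STFT matrix
f, t, Zxx = stft(data.astype(np.float64), fs=sr, nperseg=2048)
magnitude = np.abs(Zxx)                        # what gets plotted as the spectrogram
```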

I’m working on extracting the melody itself and getting the timestamps. According to this paper, the algorithm is to determine the frequency of the melody line (the orange line on the bottom of the graph) and then extract the timestamps from those frequencies. I’m currently looking at using an external package to get these frequencies, but I should have a weak melody tracker by Monday.
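As a sketch of the idea (not the final approach, and not necessarily the package I will end up using), one way to get a per-frame melody frequency is a pitch tracker such as librosa’s pYIN, and then keep the times where the detected pitch moves to a new value. The pitch range and the change tolerance here are crude placeholders.

```python
import librosa

y, sr = librosa.load("song.wav")               # placeholder file name

# Estimate a fundamental frequency per frame; NaN where no pitch is voiced.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr)
times = librosa.times_like(f0, sr=sr)

# Keep timestamps where the melody moves to a new (voiced) pitch.
note_times = []
prev = None
for t, freq, voiced in zip(times, f0, voiced_flag):
    if voiced and (prev is None or abs(freq - prev) > 1.0):   # crude 1 Hz tolerance
        note_times.append(t)
    prev = freq if voiced else prev
```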

For scheduling, I should be fine. I’m working on the automatic part of the beat tracker and the melody tracker simultaneously, so I’m pretty much on schedule.

Shreya’s Status Report for 3/26

This week, I worked on testing and verifying the accuracy of the beat map. I mentioned before that I did not fully trust its accuracy, and when I looked at the beat tracking in the game itself, there was a slight offset along with some missed or extra beats. I think this is primarily due to the energy calculation I had been using.
Originally, I was using a scalar multiplier of c = (-0.0025714 * energy) + 1.5142857.

This worked fine on the primarily instrumental audio file. However, on popular songs this formula broke down, because the shifts in intensity around the beats are much more varied. Through testing, I need to find a new function for c.
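For context, here is a rough sketch of how that scalar is applied in an energy-based detector: a beat is flagged when the instantaneous energy exceeds c times the average energy of the recent history. The statistic fed into the c formula below (the spread of the energy history) is my assumption about what the formula is driven by, not a confirmed detail.

```python
import numpy as np

def scalar_c(v):
    """Linear scalar from the formula above; v is the energy statistic
    that drives it (assumed here to be the history's variance)."""
    return -0.0025714 * v + 1.5142857

def is_beat(instant_energy, energy_history):
    """Beat when the instantaneous energy exceeds c times the average
    energy of the recent history window."""
    history = np.asarray(energy_history, dtype=np.float64)
    c = scalar_c(history.var())        # assumption: c driven by the history's spread
    return instant_energy > c * history.mean()
```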
One issue we ran into was that it’s difficult to visualize the beat maps without the game code itself. The code unfortunately does not work on Windows, so we are having some trouble testing it. That is also reflected in the new Gantt chart in our team status report.
Unfortunately, this does put us a few days behind schedule, and I may have to implement a simpler melody tracker to compensate, but that is still to be determined. I created a side testing platform with the standard Python audio processing tools, but the in-game visualizer is still more accurate. I can continue working on the scalar c function without it for now.
We are slightly behind schedule, but we should be able to complete the project in time.

Team Status Report for 3/19

This week, we worked on finishing up the individual parts of our assignments that needed to be completed before we could reach MVP. For George, this meant getting the game to read the hits from the drum apparatus and the JSON from the beat mapper. For Stephen, this meant getting the drum module prototype into a functional state and sending its hit information to the game code in a readable format. For Shreya, this meant finishing up the beat tracker and working on sending the beat information to the game code for integration.

Ideally, we should be able to hit MVP in the coming week. These components are mostly finished, and we should finish the integration this weekend as well.

We have several challenges/concerns over the next week:

  1. Scheduling – if integration is not finished this weekend, we may push our MVP to Friday of this week instead of Monday. This is not a large issue, because reaching MVP would mean most of our game is done. The Gantt chart does not need to be updated yet, but it may need to be next week if we do not meet our deadline.
  2. Accuracy of the beat tracker – because Librosa (the Python standard) is not necessarily as reliable as we originally believed, the accuracy of our beat tracker is still unknown. Shreya’s blog post has more details on the specific problem. However, if it is not as accurate as we need, we will have to dedicate more time to testing and tweaking to ensure we get the needed results.

Shreya’s Status Update 3/19/2022

This week and at the tail end of Spring Break, I continued work on the beat tracker. I was able to output a set of timestamps from the beat tracker for a given audio file. I also created the JSON file that holds this information.

Currently, the JSON assumes that every beat maps to the same button. However, this is planned to change post-MVP.
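For reference, the output is roughly shaped like the example below (the field names are illustrative, not the exact schema): one entry per detected beat with its timestamp and the button it maps to.

```python
import json

# Pre-MVP, every beat uses the same button.
beatmap = {
    "song": "example.wav",
    "beats": [
        {"time": 0.52, "button": 0},
        {"time": 1.04, "button": 0},
        {"time": 1.55, "button": 0},
    ],
}

with open("example_beatmap.json", "w") as f:
    json.dump(beatmap, f, indent=2)
```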

At first glance, the tracker seems to have reasonable accuracy; compared to Librosa’s beat tracker, its beats usually land within 0.2 seconds. However, after further analysis of the Librosa beat tracker, I noticed that it stops detecting beats about thirty seconds before the end of the audio file, so I am unsure whether we can use it as an accurate count of beats.
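The comparison itself is straightforward; a rough sketch of how agreement against Librosa can be checked (file name and 0.2 s tolerance are the only inputs, and the helper name is my own):

```python
import numpy as np
import librosa

def fraction_within_tolerance(my_beats, ref_beats, tol=0.2):
    """Fraction of reference beats that have one of my beats within tol seconds."""
    my_beats = np.sort(np.asarray(my_beats))
    hits = 0
    for b in ref_beats:
        idx = np.searchsorted(my_beats, b)
        nearest = min(
            abs(my_beats[i] - b)
            for i in (idx - 1, idx) if 0 <= i < len(my_beats))
        hits += nearest <= tol
    return hits / len(ref_beats)

# Reference beats (in seconds) from Librosa's tracker:
y, sr = librosa.load("song.wav")               # placeholder file name
_, ref_beats = librosa.beat.beat_track(y=y, sr=sr, units="time")
```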

The plan for tomorrow and the beginning of next week is to visualize the beat tracking to get a better sense of how well it matches the song. It probably does need tuning, and there are a few parameters I can change for that.

So far, we seem to be mostly on track for finishing the project. Fine-tuning the beat tracker with the visualization will need to happen next week, and then research on melody tracking will likely follow.

I feel a few days behind schedule because I don’t have final verification that the beat tracker works, but it should not be too difficult to catch back up.