Angel’s Status Report for 4/2

I was able to accomplish a lot this week. First, I updated the Python code to decode the bit mapping sent from the Arduino Nano and print the played note (C, C#, etc.). Second, I wrote code in Sonic Pi to receive messages over OSC and determine when, and which, flute sample to play. I also figured out how to stop the current sample when a new note needs to be played: I kill the object in the thread that contains the currently playing sample, which works around Sonic Pi's requirement that a sound be given a duration when played. This entailed sending a list of data [note, newNote, stop] to Sonic Pi from the Python script. As a result, I can now play notes in response to fingerings being pressed on the flute controller. The latency is quite low (definitely less than half a second), but I still need to figure out how to measure it and get concrete numbers.
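As a rough illustration of the Python side of this pipeline, here is a minimal sketch of decoding a fingering byte from the Nano and forwarding the [note, newNote, stop] message over OSC with the python-osc library. The fingering-to-note table, the OSC address /flute/note, the serial port and baud rate, and the message values are placeholders rather than the exact ones in my script, and Sonic Pi's OSC listener is assumed to be on its default port 4560.

```python
# Sketch only: decode a fingering byte and forward [note, newNote, stop] to Sonic Pi.
import serial                               # pyserial, reads the Nano's output
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical fingering-byte -> note table (one bit per key); real mapping differs.
FINGERING_TO_NOTE = {
    0b1111111: "C",
    0b1111110: "C#",
    0b1111100: "D",
    # ... remaining fingerings ...
}

client = SimpleUDPClient("127.0.0.1", 4560)  # Sonic Pi's OSC listener (assumed port)
ser = serial.Serial("/dev/ttyUSB0", 9600)    # serial port/baud are assumptions

current_note = None
while True:
    fingering = ser.read(1)[0]               # one byte from the Nano, one bit per key
    note = FINGERING_TO_NOTE.get(fingering)
    if note is None or note == current_note:
        continue                              # unknown fingering or no change
    print(f"Played note: {note}")
    # [note, newNote, stop]: newNote signals Sonic Pi to kill the thread holding
    # the current sample before starting the new one; stop would silence playback.
    client.send_message("/flute/note", [note, 1, 0])
    current_note = note
```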

I also worked with Vivian this week to build the flute controller in TechSpark; we cut the PVC and drilled the holes for the buttons.

Tomorrow, I will work on integration with Vivian and Judenique so we can prepare for our upcoming interim demo.

Besides integration, I plan to start writing code to incorporate breath control into the project, so that notes only play when the user is blowing into the controller. I also plan to start working with Judenique, since I am now able to send her fingering data. I am still on schedule and nearing completion of my portion, so I will most likely work more closely with Judenique and Vivian in the upcoming weeks.


