Nora’s Status Report for 3/11

I spent the majority of the week working on the design report with my team members, so I was not able to conduct thorough testing of the scheduling code. I did implement additional code to handle rests in the music, as well as to set the GPIO pins low when a note finishes playing. This handles cases where the same note is played two or more times in a row.

Since I was not able to test the timing accuracy of the scheduling, I am slightly behind schedule. However, I plan to catch up by allocating time tomorrow to fully test out the new code and verify the timing accuracy of the current code using a metronome.

Next week, I plan on modularizing the GPIO pin mapping and working with Aden to integrate with the actuators. I will need to check whether the timing still holds once the hardware is involved, since one bottleneck we encountered earlier on was that the time it took to activate the solenoids limited our maximum frequency.

Nora’s Status Report for 2/25

This week, I completed the first pass of the note scheduling on the Raspberry Pi using the music21 library to parse the notes, and the changes are now pushed to our new GitHub repository. The parse function is able to successfully convert XML files into Stream objects, which hold hierarchical information on the Parts, Measures, and Notes (also music21 object types). Based on this information, I was able to extract values for the tempo and beatTypes, which I used to calculate the number of milliseconds each quarter note is worth. The dimensional analysis of the calculation is shown below:


From there, we can then scale it by our smallest note value to get the time delay that we will increment between each “timestamp” (a slice in time at which we determine which notes are active, and therefore which keys are pressed). For simplicity, the current code assumes the smallest-valued note we can play is a 16th note; this can easily be turned into a variable later on.
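To make the timing math concrete, here is a small sketch of the calculation described above. The function names and the 16th-note default are illustrative, not the actual project code.

```python
# Sketch of the timing math: 60,000 ms per minute divided by quarter notes
# per minute gives ms per quarter note; scaling by the smallest note value
# (assumed here to be a 16th note) gives the delay between timestamps.
def quarter_note_ms(tempo_bpm):
    """Milliseconds each quarter note is worth at the given tempo."""
    return 60_000 / tempo_bpm

def timestamp_delay_ms(tempo_bpm, smallest_note=16):
    """Delay between timestamps; a 16th note splits a quarter into 4 slices."""
    return quarter_note_ms(tempo_bpm) / (smallest_note / 4)

print(quarter_note_ms(60))     # 1000.0 ms per quarter note at 60 BPM
print(timestamp_delay_ms(60))  # 250.0 ms between 16th-note timestamps
```

At 60 BPM, each quarter note lasts exactly one second, so 16th-note timestamps land every 250 ms.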

The function that does the main chunk of work is called schedule, which iterates through the notes stored in the Stream, sets the bits in a bitmask based on the note being played (using mappings expressed through two changeable dictionaries), and then updates the notesToPlay dictionary that matches the bitmask to different timestamps.
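A minimal sketch of that idea follows. The dictionary, function, and note representation here (NOTE_TO_PIN, schedule, and the tuple format) are hypothetical stand-ins for the real code, just to show how a bitmask-per-timestamp dictionary can be built.

```python
# Illustrative sketch: map note names to GPIO pins via a changeable dictionary,
# then build a {timestamp: bitmask} dictionary where bit n = GPIO pin n.
NOTE_TO_PIN = {"C4": 2, "D4": 3, "E4": 4}  # hypothetical note -> pin mapping

def schedule(notes):
    """notes: list of (note_name, start_ts, duration_ts) in timestamp units.
    Returns {timestamp: bitmask} of the pins that should be high."""
    notes_to_play = {}
    for name, start, dur in notes:
        pin = NOTE_TO_PIN[name]
        for ts in range(start, start + dur):
            notes_to_play[ts] = notes_to_play.get(ts, 0) | (1 << pin)
    return notes_to_play

masks = schedule([("C4", 0, 2), ("E4", 1, 1)])
# timestamp 0: only C4 (bit 2); timestamp 1: C4 and E4 (bits 2 and 4)
```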

As mentioned in the design review presentation, I utilized the pigpio library for controlling the GPIO pins simultaneously by setting and clearing a bank of bits. To visually see the results of the scheduling, I attached the GPIO outputs to LEDs arranged in a keyboard-like fashion. Pictured below is the setup, as well as the measured BPM from playing a simple quarter note C scale at 60 bpm. The code seemed to work pretty well!

Additionally, here is a video of it playing the following chromatic scale. It appears to capture the different note values pretty well.

The task I was scheduled to complete for this week was to “convert XML to scheduled GPIO inputs using music21.” As discussed above, and with some help from Aden, I have met this goal satisfactorily, although I will still need to iron out a few extra details, such as playing successive notes and accounting for rests. Therefore, overall I am on schedule.

For the next week, I plan to continue improving the scheduling algorithm and conduct more thorough testing beyond using the built-in GuitarTuna metronome. However, I do anticipate spending a decent amount of time dedicated to writing up the Design Review Report with my teammates, so I may not be able to spend as much time on these tasks.

Nora’s Status Report for 2/18

This week I received the Raspberry Pi 4 from ECE Receiving. After picking up the RPi, I worked on setting it up so that we could control and program it. Since we didn’t have a wired keyboard, I installed the VNC Viewer application to remotely access the RPi using its IP address. This allowed us to open the RPi’s desktop and type inputs to it. After seeing the full capabilities of the RPi, we considered migrating all of the code, including the UI and the OMR Python application, onto the RPi to reduce latency when starting and stopping. However, given our current toolchain, we require a Windows device to run Audiveris, so we decided that the microseconds to milliseconds we would save from integrating everything together were not worth the added effort of setting up a new environment and OS on the RPi.

On Saturday, I worked with Aden on testing the exploratory solenoids and transistors that we bought. I wrote a simple Python file using a basic GPIO library to set pins high so that we could test the power switching. The output voltage from the RPi was above the necessary threshold voltage of the MOSFET, so the power switching was quite seamless.

One challenge we encountered when testing the solenoids was that our initial code, which used the sleep function from the time library, could only get the solenoids to depress and retract about four times a second (video linked here), which is below the six-times-a-second target frequency in our use-case requirements. Although sleep accepts fractional seconds, its effective resolution is limited by the OS scheduler, so one possibility for the bottleneck is that it cannot reliably delay for short intervals. I will be working on installing and using the pigpio library so that we can have microsecond delays instead of the limited sleep function. However, if the bottleneck on the timing ends up being due to the hardware itself, then we will need to rescope that requirement and change the code that limits the max tempo/smallest note value to account for this frequency cap.
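One quick way to separate a software bottleneck from a hardware one is to measure how long time.sleep actually takes for short requested delays. This sketch (names are mine, not project code) times repeated 1 ms sleeps; any consistent overshoot would cap our toggle rate before the solenoids even enter the picture.

```python
# Measure the real resolution of time.sleep: request a short delay many times
# and report the average actual delay. On a stock Linux kernel the overshoot
# for millisecond-scale sleeps can be noticeable.
import time

def measure_sleep(requested_s, trials=50):
    """Average measured duration of time.sleep(requested_s) over `trials` runs."""
    total = 0.0
    for _ in range(trials):
        start = time.perf_counter()
        time.sleep(requested_s)
        total += time.perf_counter() - start
    return total / trials

avg = measure_sleep(0.001)
print(f"requested 1.000 ms, got {avg * 1000:.3f} ms on average")
```

If the average is well above 1 ms, the sleep-based loop is part of the problem; if it is close to 1 ms, the four-presses-per-second ceiling is more likely mechanical.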

One big change we addressed in our team status report was the switch to the music21 library for parsing rather than a custom parsing process. This resulted in me being behind schedule, but the new Gantt chart developed for our Design Review Presentation accounts for this change.

Next week I will be looking more into using music21 to extract the notes from the data and converting them into GPIO high/low signals. I was able to convert the test XML file that Rahul generated into a Stream object, as shown in the image below. However, as you can see, there are several nested streams before the list of notes is accessible, so I will need to work on un-nesting the object if I want to be able to iterate through the notes correctly.

As for the actual process for scheduling, I will try to explain the vision here. We can have a count that keeps track of the current “time” unit that we are at in the piece, where an increment of 1 is 1 beat. This corresponds nicely to the offsets from music21 (i.e. the values in the {curly braces}). We can also calculate the duration of each beat (in milliseconds) by calculating 60,000/tempo. So we will iterate through the notes and at a given time, if a note is being played, we’ll set the GPIO pin associated with it to high and we will set the rest of the pins to low (which is easily accomplished with batch setting from the pigpio library). Thus we will also need a mapping function that connects the note pitch to a specific GPIO pin.
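The beat-to-milliseconds conversion and the pitch-to-pin mapping from that plan can be sketched as follows. The dictionary contents and helper names are assumptions for illustration; with pigpio, the complementary pins in our bank would be cleared in the same batch operation.

```python
# Sketch of the playback timing described above. music21 offsets are in beats;
# one beat lasts 60000/tempo milliseconds. PITCH_TO_PIN is a hypothetical
# stand-in for the mapping function connecting note pitches to GPIO pins.
PITCH_TO_PIN = {"C4": 2, "D4": 3}

def beat_ms(tempo_bpm):
    """Duration of one beat in milliseconds."""
    return 60_000 / tempo_bpm

def offset_to_ms(offset_beats, tempo_bpm):
    """Wall-clock time (ms from the start of the piece) of a music21 offset."""
    return offset_beats * beat_ms(tempo_bpm)

def bank_mask(active_pitches):
    """Bits to set high for the notes sounding right now; the remaining pins
    in the bank would be set low in the same batch update."""
    mask = 0
    for p in active_pitches:
        mask |= 1 << PITCH_TO_PIN[p]
    return mask

# A note at offset 2.0 beats at 120 BPM starts 1000 ms into the piece.
print(offset_to_ms(2.0, 120))  # 1000.0
```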

Overall, the classes I have taken that have helped me this week are 18-220 and 18-349. Knowledge of transistors and inductors, as well as experience with the lab power supplies that I gained from 18-220, helped me when working on the circuitry. Embedded-systems skills from 18-349 were very helpful when reading the documentation and datasheets for the microcontroller and circuit components, respectively.

Nora’s Status Report for 2/11

At the start of this past week I helped polish the Proposal Presentation slides in preparation for the week’s presentations. In particular, I created the Physical Implementation Model mockup as well as the block diagram. I also helped Aden prepare for the oral presentation.

Since my main responsibility area for the project is handling the microcontroller component, I have been brainstorming ways to organize the information from the XML so that the microcontroller can translate each individual note into scheduled signals that can be sent to the transistors to switch our solenoids on or off. Currently, I am envisioning a dictionary-like structure whose keys are certain “time” points (where the length of a time unit is determined by the tempo from the parser) and whose values are lists of tasks to execute at the corresponding time (i.e., turn a specific solenoid on or off). When the microcontroller is playing a song, it will increment a counter for the time and retrieve the corresponding tasks at that time.
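A minimal sketch of that dictionary-like structure might look like the following; the solenoid labels and the print-like stand-in for actually switching a transistor are purely illustrative.

```python
# Sketch of the envisioned schedule: keys are time units (beats), values are
# lists of (solenoid, action) tasks to execute at that time. Names are
# hypothetical, not the real data model.
from collections import defaultdict

tasks = defaultdict(list)
tasks[0].append(("solenoid_C", "on"))
tasks[1].append(("solenoid_C", "off"))
tasks[1].append(("solenoid_D", "on"))

def play_step(t):
    """At time t, retrieve the tasks to execute; in the real system each task
    would switch the transistor driving the named solenoid."""
    return tasks.get(t, [])
```

During playback, a counter would increment once per time unit and call something like play_step at each tick.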

Since the XML file contains a lot of extraneous text, I will be working on providing Rahul with specifications for what should be kept in the file that is sent to the microcontroller. At a minimum, I will need the tempo value (or text description) of the piece and a sequence of notes with their pitches and durations.

We are planning on using a 30V 5A DC power supply that I have on hand to power our solenoids. Based on some preliminary power calculations using data from last semester’s Talking Piano group (since we don’t have access to the physical components to test yet), we determined that we would need a max power of (12V)(1.5A)(5 solenoids) = 90W for 5 solenoids turned on at one time. If the solenoids we end up buying draw comparable current, we may have to reduce the number of active solenoids to three at a time so that we are below the 5A limitation of our power supply. To lower the overall power so we meet the 60W average power requirement, I looked into the possibility of using a PWM servo driver for switching the transistors so that we can lower the average power. However, since PWM inputs for solenoids are used for position control, we would have to make sure it is still able to press a key down fully.
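The power budget above reduces to a couple of lines of arithmetic; this sketch just restates the numbers from the paragraph (12 V and 1.5 A per solenoid, from last semester's Talking Piano data) so the limits are easy to re-check if the solenoids we buy draw different current.

```python
# Back-of-the-envelope power budget using the figures quoted above.
SOLENOID_V = 12.0       # volts per solenoid (Talking Piano data)
SOLENOID_A = 1.5        # amps per solenoid (Talking Piano data)
SUPPLY_A_LIMIT = 5.0    # current limit of our 30V 5A DC supply

def peak_power(n_solenoids):
    """Max power with n solenoids energized at once."""
    return SOLENOID_V * SOLENOID_A * n_solenoids

def max_simultaneous():
    """How many solenoids can be on at once under the supply's current limit."""
    return int(SUPPLY_A_LIMIT // SOLENOID_A)

print(peak_power(5))       # 90.0 W, matching the calculation above
print(max_simultaneous())  # 3 solenoids at a time
```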

I requested a Raspberry Pi 4 from the 18-500 parts inventory, so I am hoping to set it up and familiarize myself with the environment in the coming week.

While I have not written any actual code yet, I am ready to start implementing the data structures and initial algorithm for the scheduler. Thus, I believe I am on schedule and should have a first-pass implementation completed by the next status report.