Team Status Report for April 29th

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

HW: I am a little scared that, with all of the testing we have been doing, our solenoids will burn out (from "too much" use). However, we have some extra solenoids, and we have been using MIDI in place of the solenoids for testing at home.

SW (CV): I think the biggest risk at this point is the integration between my part and Lance's, but it should be fine. We still have a working MVP either way.

SW (Music): Serial pipeline failure. The pipeline is robust and can handle errors accurately, but during recent testing with a robotics capstone project it has had some strange failures where it randomly disconnects (from a Raspberry Pi). A disconnect during our demo wouldn't be the end of the world, but it would be annoying, especially since both programs would need to be restarted. Integration isn't much of a concern in my opinion: I just need to know what values to expect on the Python side, and I just need to know which pins to use on the Arduino side.
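As an aside, one thing that makes a serial pipeline recoverable after a random disconnect is framing each message so the receiver can resynchronize mid-stream. The sketch below is purely illustrative (the start byte, layout, and function names are hypothetical, not our actual protocol):

```python
# Hypothetical framing sketch: each note message is [START, pitch, velocity,
# checksum], so the Arduino can find frame boundaries again after dropped bytes.

START = 0xAA  # illustrative start-of-frame marker

def encode_note(pitch: int, velocity: int) -> bytes:
    """Frame one note as [START, pitch, velocity, checksum]."""
    checksum = (pitch + velocity) & 0xFF
    return bytes([START, pitch & 0x7F, velocity & 0x7F, checksum])

def decode_stream(data: bytes):
    """Scan a byte stream, yielding (pitch, velocity) for valid frames and
    skipping garbage so a corrupted byte doesn't wedge the reader."""
    i = 0
    while i + 3 < len(data):
        if data[i] != START:
            i += 1  # not a frame start; resynchronize
            continue
        pitch, velocity, checksum = data[i + 1], data[i + 2], data[i + 3]
        if (pitch + velocity) & 0xFF == checksum:
            yield pitch, velocity
            i += 4
        else:
            i += 1  # bad checksum; treat as noise and keep scanning
```

With framing like this, a restart after a disconnect only costs the partial frame in flight rather than desynchronizing both programs.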

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

HW: N/A for now; if the solenoids were to burn out, we would use MIDI keys as a replacement.

SW (CV): I'm working on symbol detection, as mentioned before, but I'm thinking of combining it with color detection as described in my post.

SW (Music): Not really a huge design change, but I'm no longer representing music in 32nd notes. Instead, I've jumped down to 8th notes so that we can play at higher tempos and move through chord progressions more quickly. This should also lead to a smoother user experience in generative mode.
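The timing math behind that resolution change can be sketched in a few lines (the function is illustrative; only the step counts come from the change above):

```python
# At a given quarter-note tempo, the scheduler ticks once per grid step, so a
# coarser grid means fewer, longer ticks and more headroom at high tempos.

def seconds_per_step(bpm: float, steps_per_quarter: int) -> float:
    """Duration of one grid step at the given quarter-note tempo."""
    return 60.0 / bpm / steps_per_quarter

# 32nd-note grid: 8 steps per quarter note; 8th-note grid: 2 steps.
# At 180 BPM, the old grid leaves about 42 ms per tick, the new one about 167 ms.
```

That roughly 4x longer tick is what buys the headroom for higher tempos and faster chord changes.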

Provide an updated schedule if changes have occurred.

HW: N/A

SW: N/A

SW (Music): N/A

This is also the place to put some photos of your progress or to brag about a component you got working.

HW: N/A

(Extra question for this week) List all unit tests and overall system test carried out for experimentation of the system. List any findings and design changes made from your analysis of test results and other data obtained from the experimentation.

HW: When building the circuit, I tested individual solenoids to ensure that they were connected correctly, and then tested them with the rest of the system to ensure that they were integrated correctly with the entire system. Some of the unit tests I conducted were turning on one solenoid at a time, and randomizing the Arduino keys while playing at most 4 at a time. For "old" solenoids, I noticed that when more than one key was playing concurrently there was a bit of lag, which I roughly timed at under 100 ms; this latency was purely due to the age of the solenoids. As for the general latency between the Arduino code and the solenoids, I observed about 40 ms, which is not significant in our system. This week I am hoping to measure the latency from the user playing a key via our object-detection system to the actuators pressing the keyboard. In our informal testing, that latency was approximately 150 ms, measured from the moment a print statement showed that a certain key was detected (so there is certainly some additional latency in reading/detecting the object itself).
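A rough sketch of how that end-to-end latency measurement could be instrumented in Python (the helper below is generic; the detection and actuation calls it would wrap are placeholders, not our real functions):

```python
import time

# Generic timing helper: run a callable and report how long it took, using
# perf_counter for a monotonic, high-resolution clock.

def timed_call(fn, *args):
    """Run fn(*args) and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - start
    return result, elapsed
```

Wrapping the detection-to-actuation path in something like this, and averaging over many key presses, would give a more defensible number than eyeballing print statements.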

SW (CV): One test I needed to run was the time from when the user places the color into the right box to when the code recognizes it. I expect this number to change when I finish the final implementation combining symbol detection and color detection, but I doubt it will change very much. I measured about 600 ms on average, which is not bad, though I do think it will change by demo time. The other test I ran was the detection rate, which can be hurt by other objects with a red hue that the camera picks up. I measured it to be accurate about 92% of the time. However, I do expect a better value once I fully integrate the symbol detection.
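For illustration only (this is not the actual CV pipeline, just a minimal stdlib sketch), hue-based red detection looks something like the following, which also shows why stray red-hued objects cause false positives: any saturated pixel whose hue lands near the red wraparound passes the test, marker or not. The thresholds here are made-up values:

```python
import colorsys

# Classify a pixel as "red" in HSV space. Red hue wraps around 0, so we
# accept hues near 0.0 or near 1.0. All thresholds are illustrative.

def is_red(r: int, g: int, b: int,
           hue_tol: float = 0.05, min_sat: float = 0.5,
           min_val: float = 0.3) -> bool:
    """Return True if an RGB pixel (0-255 channels) reads as red."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    near_red = h <= hue_tol or h >= 1.0 - hue_tol
    return near_red and s >= min_sat and v >= min_val
```

Combining a shape/symbol check with the color check, as planned, filters out pixels that pass a test like this but don't belong to the marker.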

SW (Music): I've done a lot of unit testing for both the music and serial code. For music, my main concern was the generative mode. There's only one thing to test there, and that's the generation output. I use normal distributions to pick notes and rhythms, so I generated a bunch of melodies and checked whether they made sense over the progressions they were played against. This was mostly opinion based, so I just vibe checked all of them. I also checked that the rhythms were reasonable. I wasn't really satisfied with those results, but since I've switched to a slower model, I have a lot of chances to tweak the generation probabilities. I did some brief tests on generating rests, but honestly it just wasn't worth it. Notes naturally die out anyway, so there's not much point in actively "playing" rests (even though good musicians should do that!) since our project is geared towards simpler material.

For serial/Arduino, the vast majority of my tests were done in the heat of debugging, so I don't remember them exactly. They were things like "stepping through each and every byte I'm supposed to send using the Serial Monitor" and "sending a single byte and tracking every location it goes to." This evolved into tests with the Python interface, like "sending a sequence of data and having the Arduino report success back," which inevitably evolved into "sending the same string of data and then reading back as much data as humanly possible from the Arduino so that I can figure out where my bytes are going." Once the pipeline worked (or was at least half-functional), I started sending melodies off to the Arduino, then chords. For the majority of the semester I haven't had any solenoids to test with, so at first I just had the Arduino report what notes it was playing.
Then, after Professor Sullivan recommended trying out MIDI output as well, I remembered a MIDIUSB library I briefly used a few years ago when trying to build an electric vibraphone. I shook the dust off of that and set it up so the Arduino could interface with FL Studio, and I started testing notes and melodies that way. FL Studio doesn't really ensure that circuit integration will go well, though, so I also set up a simple LED circuit to test melodies on. It's currently 7 LEDs; I was going to extend it to 14, but since we only have about 10 solenoids, I'll most likely stick with the 7 and pass the extra notes into FL Studio. The tests here covered sending notes, chords, and melodies. I also did some simple timing tests by flashing the LEDs at various tempos. As long as the Arduino is in time, aged solenoids will sound rubato at best and drunk at worst, which still isn't terrible at lower tempos. I plan on continuing these tests throughout integration and up until the demo.
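The normal-distribution note picking described above can be sketched roughly like this (the scale, spread, and function names are placeholders, not the tuned values from the actual generator):

```python
import random

# Sample scale degrees from a normal distribution centered on a chord tone,
# so generated melodies hover near the chord rather than wandering randomly.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI pitches, one octave

def generate_melody(length: int, center_degree: int = 0,
                    spread: float = 2.0) -> list:
    """Pick `length` notes by drawing scale degrees from a normal
    distribution and clamping them to the scale."""
    notes = []
    for _ in range(length):
        degree = round(random.gauss(center_degree, spread))
        degree = max(0, min(len(C_MAJOR) - 1, degree))  # clamp to scale
        notes.append(C_MAJOR[degree])
    return notes
```

Shifting `center_degree` per chord in the progression is one way to make each generated phrase track the harmony, which is roughly what the vibe checks above were evaluating.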
