Team Status Report for Feb 18th

We decided to slightly simplify our control method in favor of usability. Instead of controlling single notes with one hand and chords with the other, we have opted to control note pitch with one hand and various other parameters (volume, subdivision, rests, etc.) with the other. This should give the user far greater control over what they play. To compensate, we will need to choose which chords to play while accompanying the user. This issue is discussed in more detail in Lance's post. There is a slight risk of accidentally selecting chords that do not mesh with what the user wishes to play, but in all honesty, because the key is always going to be C major (or another key in a different scale mode that uses only the white keys), this can be handwaved with the justification of interesting harmonic intervals. We don't plan on placing any chord notes above the melody note, so there should also be no issues there.
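
To make the "no chord notes above the melody" rule concrete, here is a minimal Python sketch (an illustration, not our actual accompaniment algorithm) that voices a diatonic C-major triad so every chord tone lands in the highest octave that stays below the melody note. The triad table, MIDI numbering, and the low_limit cutoff are assumptions made just for this example.

```python
# Illustrative only: voice a C-major diatonic triad entirely below the melody note.
# Notes are MIDI numbers (middle C = 60); low_limit keeps voicings from getting muddy.

TRIADS = {  # diatonic triads in C major, as pitch classes (white keys only)
    "C": [0, 4, 7], "Dm": [2, 5, 9], "Em": [4, 7, 11],
    "F": [5, 9, 0], "G": [7, 11, 2], "Am": [9, 0, 4], "Bdim": [11, 2, 5],
}

def voice_chord_below(chord_name, melody_midi, low_limit=48):
    """Place each chord tone in the highest octave strictly below the melody note."""
    voiced = []
    for pc in TRIADS[chord_name]:
        note = pc + 12 * ((melody_midi - 1 - pc) // 12)  # highest copy below the melody
        if note >= low_limit:
            voiced.append(note)
    return voiced

# Example: melody E5 (MIDI 76) accompanied by an F chord
print(voice_chord_below("F", 76))  # -> [65, 69, 72]  (F4, A4, C5)
```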

We also created the grids needed for the CV recognition in both note-playing and generative mode, as well as the actual generation of notes in the simple note-playing mode. The generative mode was defined to use a grid along with patterns that we store, and we added a block that the user can place their hand in to switch modes. (More information is in Katherine's post.)

As for the hardware side of the project, I finished ordering the parts to create a prototype version of the actuator system. I ordered only one solenoid from Adafruit for now, to see whether I need to resort to a $20+ solenoid for the project; the small Adafruit solenoid costs us only $8 per piece, and we need 14 solenoids in total, so the price difference adds up. I drew up the circuit diagram for the prototype, and I also created a block diagram for the presentation and got it approved by the team. I will be presenting next week, so I have been practicing for that as well!

No update was needed for the schedule because we are all on track 🙂

 

Katherine’s Status Report for 02/18

This week, I worked on generating the on-screen grid for the playing mode and the generative mode, generating notes in playing mode, and finalizing how each of these will work. For note mode, I decided to implement a delay to determine whether someone is actually pressing a key or chord: a counter starts as soon as the tracked color enters one of the note/chord regions, and if it stays there for a set amount of time, the note is generated. There is a “button” at the bottom, shown in purple in the attached photo, that switches to generative mode if either hand stays inside the box for 2 seconds. Generative mode is shown as a grid, and we are going to store patterns defined by the order of grid boxes passed through. By constantly recording which grid boxes the person moves their hand through, we can determine whether any patterns have been matched and use that to generate a sequence of keys.

Next, I am going to try to hook up sound to test how the playing will actually sound, and to work more on the generative mode. I am currently a bit off schedule since we didn't account for initial work in the schedule, but the schedule needs to be reworked anyway after all the changes to how we are going to approach the project. There was a lot of open time in my section before, so it will be easy to shift everything down a week and add a bit more to the schedule.
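
As a concrete sketch of the timing and pattern ideas above (illustrative Python, not the code I am actually running), the hold counter fires a note only after the tracked color has stayed in a region for enough consecutive frames, and generative mode keeps a short history of visited grid cells to compare against stored patterns. The frame rate, thresholds, and example patterns below are placeholders rather than tuned values.

```python
# Sketch of the hold-to-trigger counter and grid-pattern check; values are placeholders.
from collections import deque

FPS = 30
NOTE_HOLD_FRAMES = int(0.25 * FPS)   # how long a key region must stay occupied
MODE_HOLD_FRAMES = 2 * FPS           # 2 seconds in the purple box switches modes

class HoldCounter:
    """Counts consecutive occupied frames and fires exactly once per hold."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.count = 0
        self.fired = False

    def update(self, occupied):
        if not occupied:
            self.count, self.fired = 0, False
            return False
        self.count += 1
        if self.count >= self.threshold and not self.fired:
            self.fired = True
            return True                # trigger the note (or the mode switch) once
        return False

# Generative mode: remember the last few grid cells the hand passed through and
# check whether the most recent ones match any stored pattern of cell indices.
PATTERNS = {(0, 1, 2, 5): "ascending run", (3, 1, 0): "descending hook"}
recent_cells = deque(maxlen=6)

def matched_pattern():
    history = tuple(recent_cells)
    for pattern, name in PATTERNS.items():
        if history[-len(pattern):] == pattern:
            return name
    return None
```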

Shown in the images are note mode and generative mode. I have my finger over the camera so that the graphic can be seen more easily.

Team Status Report for Feb 11th

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

  • Hardware (Sun A): my biggest concern is designing an actuator system that will be safe to use and functional. For instance, because I am using 14 solenoids (though not all 14 at a time), the amount of current required for the system will be close to 5A; a back-of-the-envelope check is sketched after this list. I think that the 18100 power supply may not be suitable for the system. This also means that I can't use a breadboard to build the system, which means I would have to solder onto a PCB to create this actuator system. A little intimidating… But I am hoping that my experience as a 220 TA and in 474 will be helpful 🙂 However, if this design does not work (fingers crossed), I will have to call for SOS (and hopefully Prof. Budnik is available to help). If that doesn't work out either, we'll just have to build a smaller system with fewer solenoids.
  • Software (Katherine): my concern is gesture recognition. Recognizing certain gestures using computer vision may be difficult, but I am confident that we could at least recognize where a hand wearing a colored glove is on the screen, so that would be a perfectly good contingency plan.
  • Music Software (Lance): Although I am slightly worried about accidentally designing the music generation algorithm to be out of scope, I am simultaneously excited to begin working on it and creating music. Personally, I'd love to implement all of the ideas discussed in my own post (and more!), but I will focus on gradually building the algorithm from the ground up.
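
For the hardware risk above, this is the kind of back-of-the-envelope current check I have in mind; the per-solenoid draw and the concurrency cap are assumptions I still need to confirm against the solenoid datasheet and against how many keys the music side will press at once.

```python
# Rough current budget for the actuator system (all numbers are assumptions to verify).
SOLENOID_CURRENT_A = 1.1   # assumed draw per small 12 V solenoid; check the datasheet
MAX_CONCURRENT = 4         # assumed cap on solenoids energized at the same time

worst_case_a = SOLENOID_CURRENT_A * MAX_CONCURRENT
print(f"Worst-case draw: {worst_case_a:.1f} A")   # -> 4.4 A, i.e. close to 5 A,
# which the power supply (and the MOSFET and wiring choices) must be able to handle.
```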

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

  • Hardware: we secured a keyboard from the ECE receiving, which means we have 50 dollars more flexibility to buy additional MOSFETs in case I burn some of them. But, other than that, no changes were made so far.
  • CV: Not as of now.
  • Music Component: N/A

Provide an updated schedule if changes have occurred — N/A

This is also the place to put some photos of your progress or to brag about a component you got working.

Recognizing color!

Our project includes considerations for welfare and societal implications because we hope that our product could allow people who do not otherwise have the ability to play music to do so.

Katherine’s Status Report for 02/11

This week, I worked on learning how to use OpenCV and looking for ways to track color on the screen. First, I worked on getting a screen to pop up using OpenCV so that the user can see themselves move, and therefore where their movements land on the screen, which will be important later. Next, I worked on figuring out how to track colors in the video, since we want to use colors to see where someone's hands are on the screen. The photo shows the code correctly identifying my jacket as red in multiple places. I am able to identify the colors red, green, and blue in a picture, and the next step will be correlating those to where they are on the screen. I don't predict this will be very difficult, and I am hoping to get it done tomorrow. My progress is on schedule so far, and for next week I want to work on making sure gestures are recognized by where they are on the screen.
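
For reference, the color-tracking step looks roughly like the sketch below (a simplified stand-in for my actual code): threshold the frame in HSV space and take the centroid of the largest blob of that color. The HSV ranges are rough starting points rather than tuned values, and red is left out here because its hue wraps around 0 and needs two ranges.

```python
# Simplified color tracking with OpenCV: mask a color in HSV, then find the
# centroid of the largest matching blob so we know where the hand is on screen.
import cv2
import numpy as np

COLOR_RANGES = {  # (lower, upper) HSV bounds; rough starting points, not tuned
    "blue":  ((100, 120, 70), (130, 255, 255)),
    "green": ((40, 80, 70), (80, 255, 255)),
}

def find_color_center(frame_bgr, color):
    """Return the (x, y) centroid of the largest blob of the given color, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower, upper = COLOR_RANGES[color]
    mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    center = find_color_center(frame, "blue")
    if center:
        cv2.circle(frame, center, 10, (0, 0, 255), 2)   # mark the tracked hand
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```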

Sun A’s Status Report for February 11th

This week, my team and I worked on preparing for the presentation on Monday and Wednesday. I worked on creating the slides for use case requirements and work distribution. I also reviewed the slides before submitting them, so that it wasn't just Katherine (the presenter for the first presentation) carrying the burden of the presentation. Once I was finished with the presentation, I reached out to my ECE professors, Sullivan and Budnik.

I reached out to Prof. Budnik to ask him about the logistics of using 14 solenoids, what type of power source I would require, and whether I should create a PCB (not just a prototype version, but a fully printed one). Then, I reached out to Prof. Sullivan to ask when I can start ordering items for the project. Once he gave me a yes on ordering items, I started organizing the supply list to build the external actuator system. I also submitted a form to reserve a keyboard from the ECE receiving, which I will pick up on Monday. The supply list is found here: https://docs.google.com/spreadsheets/d/1KT2KdjSYv742DpTE78x0QGn6WlaC9QxSE4kucoY2tVU/edit?usp=sharing

As of this week, I would say my progress is within the time frame that the team and I have set out for me. I hope to really start ordering items from Amazon and Adafruit (I just got a notification that their MOSFETs are in stock!).