Team Status Report for Feb 11th

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

  • Hardware (Sun A): my biggest concern is designing an actuator system that is both safe to use and functional. For instance, because I am using 14 solenoids (though not all 14 at once), the current required for the system will be close to 5 A. I suspect that the 18100 power supply may not be suitable for the system. This also means that I can’t prototype on a breadboard, so I will have to solder the actuator system onto a PCB. A little intimidating… but I am hoping that my experience as a 220 TA and in 474 will be helpful 🙂 If this design does not work (fingers crossed), I will call for SOS (and hopefully Prof. Budnik is available to help). If that doesn’t work out either, we’ll build a smaller system with fewer solenoids.
  • Software (Katherine): my concern is gesture recognition. Recognizing specific gestures with computer vision may be difficult, but I am confident that we could at least recognize where a hand wearing a colored glove is on the screen, so that would be a perfectly good contingency plan.
  • Music Software (Lance): Although I am slightly worried about accidentally designing the music generation algorithm to be out of scope, I am simultaneously excited to begin working on it and creating music. Personally, I’d love to implement all of the ideas discussed in my own post (and more!), but I will focus on gradually building the algorithm from the ground up.
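The power-supply concern above comes down to a current budget. A minimal sketch of that check is below; the per-solenoid draw, the number of simultaneously energized solenoids, and the supply rating are illustrative assumptions, not measured values from the design.

```python
# Rough current-budget check for the actuator system. All numbers below are
# placeholder assumptions for illustration, not measured or datasheet values.

SOLENOIDS_TOTAL = 14          # from the design: 14 solenoids overall
MAX_SIMULTANEOUS = 5          # assumption: at most 5 solenoids energized at once
CURRENT_PER_SOLENOID_A = 1.0  # assumption: ~1 A per energized solenoid
SUPPLY_RATING_A = 3.0         # assumption: rating of the supply being evaluated

peak_draw_a = MAX_SIMULTANEOUS * CURRENT_PER_SOLENOID_A

def supply_is_adequate(peak_a, rating_a, margin=0.8):
    """Require the peak draw to stay under a derated fraction of the rating."""
    return peak_a <= rating_a * margin

print(f"peak draw: {peak_draw_a:.1f} A")
print(f"supply adequate: {supply_is_adequate(peak_draw_a, SUPPLY_RATING_A)}")
```

With these placeholder numbers the peak draw (5 A) exceeds the derated supply rating, which is exactly the situation that would force either a bigger supply or the smaller-system contingency.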

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

  • Hardware: we secured a keyboard from ECE Receiving, which gives us $50 more budget flexibility to buy additional MOSFETs in case I burn some of them. Other than that, no changes were made so far.
  • CV: Not as of now.
  • Music Component: N/A

Provide an updated schedule if changes have occurred — N/A

This is also the place to put some photos of your progress or to brag about a component you got working.

Recognizing color!

Our project includes considerations for welfare and societal implications: we hope that our product could allow people who are otherwise unable to play an instrument to make music.

Katherine’s Status Report for 02/11

This week, I worked on learning how to use OpenCV and looking for ways to track color on the screen. First, I got a window to pop up using OpenCV so that users can see themselves move and therefore where their movements land on the screen, which will be important later. Next, I worked on tracking colors in the video, since we want to use colors to locate someone’s hands on the screen. The photo shows the code correctly identifying my jacket as red in multiple places. I am able to identify the colors red, green, and blue in a picture, and the next step will be correlating those detections to their positions on the screen. I don’t predict this will be very difficult, and I am hoping to get it done tomorrow. My progress is on schedule so far, and next week I want to work on making sure gestures are recognized by where they are on the screen.
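The color-tracking step described above can be sketched as masking pixels in a target color range and taking the centroid of the mask. In the OpenCV pipeline this corresponds to `cv2.inRange` (masking) and `cv2.moments` (centroid); the NumPy version below shows the same logic, and the thresholds are illustrative assumptions rather than the project's tuned values.

```python
import numpy as np

def red_mask(frame):
    """Boolean mask of strongly red pixels in an (H, W, 3) uint8 RGB frame."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    # Assumed thresholds: bright red channel, well above green and blue.
    return (r > 150) & (r - g > 60) & (r - b > 60)

def centroid(mask):
    """(x, y) center of the masked pixels, or None if nothing matched."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Tiny synthetic frame: a red block near the top-left of a black image.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[10:20, 30:40, 0] = 200  # red channel only
print(centroid(red_mask(frame)))  # → (34.5, 14.5)
```

The centroid gives exactly the "where on the screen" answer needed for the next step of correlating a detected color with a position.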

Sun A’s Status Report for February 11th

This week, my team and I prepared for the presentations on Monday and Wednesday. I created the slides for the use case requirements and work distribution, and I reviewed the deck before submission so the burden of the presentation didn’t fall entirely on Katherine (the presenter for the first presentation). Once I was finished with the presentation, I reached out to my ECE professors, Sullivan and Budnik.

I reached out to Prof. Budnik to ask about the logistics of driving 14 solenoids, what type of power source I would need, and whether I should create a PCB (a prototype version or the fully printed version). Then I reached out to Prof. Sullivan to ask when I could start ordering items for the project. Once he gave the go-ahead, I started organizing the supply list for building the external actuator system. I also submitted a form to reserve a keyboard from ECE Receiving, which I will pick up on Monday. The supply list is found here: https://docs.google.com/spreadsheets/d/1KT2KdjSYv742DpTE78x0QGn6WlaC9QxSE4kucoY2tVU/edit?usp=sharing

As of this week, my progress is within the time frame that the team and I set out for me. I hope to really start ordering items from Amazon and Adafruit (I just got a notification that their MOSFETs are in stock!).

Introducing Keynetic

Welcome to Carnegie Mellon University: ECE Capstone Projects.

We plan to create a mechanically actuated keyboard managed by a microcontroller. The software controller will allow us to actuate a variable number of keys, more than a human could, if necessary. The user will be able to play simple notes and chords on the piano, but not an intricate or high-level piece, due to limits on how fast someone can signal the camera. We are limiting our scope by bounding the playing range to two octaves of white keys on the piano keyboard and by mitigating as many unexpected real-world problems as possible when analyzing camera input. For instance, we hope to create an environment where the user can be distinctly recognized by the camera using CV. Currently, there are no intuitive and widespread solutions for playing the piano without pressing keys or generating sound directly from a computer. Our solution would allow users to play without physically touching the piano, and indeed without even sitting down at the bench.
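One way the camera input could drive the actuators is to map a detected hand position to one of the white keys in the bounded range. The sketch below is a hypothetical mapping, not the team's actual design: it assumes 14 white keys (matching the 14 solenoids, i.e. two octaves without the final C) and a normalized horizontal position from the CV stage.

```python
# Hypothetical mapping from a normalized hand x-position (0.0 = left edge of
# the camera frame, 1.0 = right edge) to one of 14 white keys. The key count
# and note labels are assumptions for illustration.

NUM_WHITE_KEYS = 14
WHITE_NOTES = ["C", "D", "E", "F", "G", "A", "B"]

def key_for_position(x_norm):
    """Pick a white-key index (0-13) for a hand at normalized x position."""
    x_norm = min(max(x_norm, 0.0), 1.0)  # clamp to the frame
    return min(int(x_norm * NUM_WHITE_KEYS), NUM_WHITE_KEYS - 1)

def note_name(index):
    """Readable note name for a key index; octave numbering is assumed."""
    octave = 4 + index // 7
    return f"{WHITE_NOTES[index % 7]}{octave}"

print(note_name(key_for_position(0.0)))   # leftmost key  → C4
print(note_name(key_for_position(0.99)))  # rightmost key → B5
```

Dividing the frame into equal-width zones like this keeps the interaction simple enough for the "simple notes and chords" scope while leaving gesture recognition free to decide *when* a key fires.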