Yuqi’s Status Report for Feb 25

1, What did you personally accomplish this week on the project?

List of items:

    1. Arduino Nano: Amazon $23.99
    2. Tilt Sensor: DigiKey $17.95*2
    3. Radio Transmitter:  Amazon $11.99*4
    4. Light Sensor: DigiKey $2.50
    5. Battery: Amazon $13.99*4
    6. LED: Amazon 100Pcs $5.95

All DigiKey items ship via FedEx Ground ($6.99, arrives in 2 days).

2, Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule? What deliverables do you hope to complete in the next week?

It is on schedule. I will learn to use the items (especially the sensors) above after they arrive.

Team Status Report for Feb 25

  • What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

After resolving the risks with our software last week, we believe that our most significant risks lie within our hardware. Because the quality of the ICs (especially the tilt sensor and the radio chip) is not guaranteed, we will start component testing immediately after the chips arrive. If a part turns out to be defective, we will order a replacement (possibly from another seller) within one day.

 

  • Were any changes made to the existing design of the system? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

We did change the battery for our gloves from alkaline batteries to rechargeable LiPo batteries. LiPo batteries are lighter and have a higher power density, which will in turn make our system lighter and last longer. It will also cut down our development costs because we won’t have to keep purchasing batteries. To address the possible (although unlikely) overheating of the battery, we will monitor the power consumption and temperature of the LiPo batteries as we develop the system, and we will add an insulating layer to the glove if necessary.

 

  • Provide an updated schedule if changes have occurred.

There was no change to our schedule.

 

  • Component that we got to work.

Oscar continued working on the synthesizer; it now supports real-time pitch and volume control as well as timbre selection. Here is a short demo video.

 

  • Adjustments to teamwork to fill in gaps related to new design challenges

Although Oscar was in charge of developing the synthesizer, he encountered a major sound card issue that couldn’t be resolved when developing on Ubuntu. He then switched to Windows and rewrote everything from scratch using new libraries. To make sure the resulting product is still macOS compatible, Karen and Yuqi (who both own MacBooks) jumped in and conducted a preliminary setup experiment. After changing the code a bit and installing the correct versions of the libraries, our synthesizer is now compatible with both Windows and macOS.

 

 

Karen’s Status Report for Feb 25

  • What did you personally accomplish this week on the project? Give files or
    photos that demonstrate your progress.

This week I created a Python file that opens a pop-up window showing the possible commands and music effects that gestures can control. No actual mapping is done yet because we have not finalized the gestures we are using. I ended up using pygame because it supports a fancier UI than PySimpleGUI and should be more pleasant for users.
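
To give an idea of the structure, below is a minimal sketch of the kind of pygame pop-up described above. The command and effect names are placeholders (we have not chosen the gestures yet), and this is not the actual file.

    # Minimal pygame pop-up sketch (placeholder commands/effects, not the final UI).
    import pygame

    COMMANDS = ["select timbre", "play mode", "volume up"]   # hypothetical entries
    EFFECTS = ["pitch", "volume", "vibrato"]                 # hypothetical entries

    def main():
        pygame.init()
        screen = pygame.display.set_mode((480, 320))
        pygame.display.set_caption("Gesture mapping (draft)")
        font = pygame.font.SysFont(None, 28)

        running = True
        while running:
            for event in pygame.event.get():
                if event.type == pygame.QUIT:
                    running = False

            screen.fill((30, 30, 30))
            # Two columns: commands on the left, music effects on the right.
            for i, text in enumerate(COMMANDS):
                screen.blit(font.render(text, True, (255, 255, 255)), (20, 20 + 40 * i))
            for i, text in enumerate(EFFECTS):
                screen.blit(font.render(text, True, (255, 255, 255)), (260, 20 + 40 * i))
            pygame.display.flip()

        pygame.quit()

    if __name__ == "__main__":
        main()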

  • Is your progress on schedule or behind? If you are behind, what actions will be
    taken to catch up to the project schedule.

I am a little behind schedule because I did not get the data-saving part to work: every time the program restarts, the mapping resets, which is not good for usability. I plan to read some Python documentation and tutorials to see whether this is possible, and switch to a website if it is not.
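
One possible approach (a sketch of the idea, not what is currently implemented) is to persist the mapping to a JSON file and reload it on startup. The file name and the example mapping entry below are made up for illustration.

    # Sketch: persist a gesture-to-effect mapping across runs with a JSON file.
    import json
    import os

    MAPPING_FILE = "gesture_mapping.json"   # hypothetical file name

    def load_mapping():
        """Return the saved mapping, or an empty dict if none exists yet."""
        if os.path.exists(MAPPING_FILE):
            with open(MAPPING_FILE, "r") as f:
                return json.load(f)
        return {}

    def save_mapping(mapping):
        """Write the current mapping to disk so it survives a restart."""
        with open(MAPPING_FILE, "w") as f:
            json.dump(mapping, f, indent=2)

    mapping = load_mapping()
    mapping["fist"] = {"effect": "volume", "value_db": -6}   # hypothetical entry
    save_mapping(mapping)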

  • What deliverables do you hope to complete in the next week?

Next week I hope to get the scroll bar and data saving to work in the interactive part, and to confirm the gestures we are going to use after looking at the models and their datasets.

 

Oscar’s Status Report for Feb 25

  • What did you personally accomplish this week on the project? Give files or
    photos that demonstrate your progress.

This week I continued to improve the oscillator program I wrote last week to support pitch and volume control and timbre selection. Moving the mouse left and right plays one of the eight notes between C4 and C5, and the volume is controlled by the vertical position of the mouse. The program also produces a different sound when the oscillator is initialized with a different function. Here is a demo video of the functionalities above.
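
For illustration, here is a rough sketch of the cursor-to-note mapping described above, assuming pyautogui for reading the cursor; the note table (a C major scale from C4 to C5) and the synthesis back end are stand-ins, not my actual code.

    # Sketch: map cursor x to one of eight notes (C4..C5) and cursor y to volume.
    # Only the mapping is shown; the synthesis back end is omitted.
    import pyautogui

    # C major scale from C4 to C5 (Hz) — an assumed note set for this sketch.
    NOTES = [261.63, 293.66, 329.63, 349.23, 392.00, 440.00, 493.88, 523.25]

    def cursor_to_note_and_volume():
        screen_w, screen_h = pyautogui.size()
        x, y = pyautogui.position()
        # Left-to-right position picks one of the eight notes.
        index = min(int(x / screen_w * len(NOTES)), len(NOTES) - 1)
        # Top of the screen is loudest, bottom is silent.
        volume = 1.0 - (y / screen_h)
        return NOTES[index], volume

    freq, vol = cursor_to_note_and_volume()
    print(f"play {freq:.2f} Hz at volume {vol:.2f}")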

I also spent quite a lot of time making my system compatible with both Windows and macOS. Since I do not own a MacBook, Karen helped me make sure that my program is indeed macOS compatible. Previously I had some issues using the simpleaudio library on Windows; this was resolved by switching to the pyaudio library instead. However, switching libraries meant I had to rewrite the core part of my synthesizer from scratch.

  • Is your progress on schedule or behind? If you are behind, what actions will be
    taken to catch up with the project schedule.

I think my progress is on schedule. However, using existing synthesizer software is always a backup plan. I researched open-source synthesizers a bit this week and decided that Surge is currently the best available option.

  • What deliverables do you hope to complete in the next week?

I plan to upgrade the linear volume control into a vibrato. I will also try to add a gliding mechanism to my synthesizer so that it smooths out the note transitions.
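
As a rough illustration of the gliding idea (not the planned implementation), the playing frequency can be eased a little toward the new target on every audio block; the smoothing factor below is arbitrary.

    # Sketch: simple portamento/glide by easing the current frequency toward the target.
    def glide(current_freq, target_freq, smoothing=0.05):
        """Move a fraction of the remaining distance toward the target each call."""
        return current_freq + (target_freq - current_freq) * smoothing

    freq = 261.63          # currently playing C4
    target = 329.63        # new target note E4
    for _ in range(10):    # imagine one call per audio block
        freq = glide(freq, target)
        print(round(freq, 2))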

Karen’s Status Report for Feb 18

  • What did you personally accomplish this week on the project? Give files or
    photos that demonstrate your progress.

Personally, I did not accomplish a lot this week. I looked at tutorials on how to develop an app for different operating systems and how to set up an interactive website that stores user input with cookies. I also created a list of backup models for our recognition function in case our first choices do not work as intended.

  • Is your progress on schedule or behind? If you are behind, what actions will be
    taken to catch up to the project schedule.

I think I am very behind schedule this week because of personal health issues. To make up for the time missed, I will spend the entire weekend this week and next week working on capstone-related topics 🙂

  • What deliverables do you hope to complete in the next week?

Next week I hope to get the interactive app/website set up so we can pick a gesture from an existing list and link it to a sound defined by its pitch in Hz, and pick a tilt angle from an existing list and link it to a specific volume level in dB. It will not yet support actually producing the sound, but it should allow gesture mapping and selection, and it will restore the saved selections the next time the app/website opens.

  • ECE courses (if any) that covered the engineering science and mathematics principles your team used to develop your design

Not really. Most of the basic engineering design concepts I use come from my first-year CEE course and my ECE coverage course, 17-313. Our group has also been searching these terms on Google over the past weeks.

 

Yuqi’s Status Report for Feb 18

1, What did you personally accomplish this week on the project? 

I decided on the tilt sensor that we are going to use in our wearable circuits. It is #1528-1011-ND from Digi-Key. The link is: https://www.digikey.com/en/products/detail/adafruit-industries-llc/1018/4990760?utm_adgroup=Evaluation%20Boards%20-%20Sensors&utm_source=google&utm_medium=cpc&utm_campaign=Shopping_Product_Development%20Boards%2C%20Kits%2C%20Programmers_NEW&utm_term=&utm_content=Evaluation%20Boards%20-%20Sensors&gclid=Cj0KCQiAi8KfBhCuARIsADp-A54GIzLMGCKdVFVORmIE_QT5jlwPzKf2_jBcrjMH_rA2ZLjRv3hZEWkaAlCIEALw_wcB

I also met with my teammates on Monday to discuss our project design. We decided on the gestures our product will need to recognize, and we decided to put wearable circuits on both hands and use an Arduino to send data to the computer.
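
On the computer side, one plausible way to receive that Arduino data is over a serial port. The sketch below is only an illustration: the port name, baud rate, and comma-separated message format are assumptions, not decisions we have made.

    # Sketch: read sensor data sent by the Arduino over a serial port (pyserial).
    # Port name, baud rate, and the message format are assumed for illustration.
    import serial

    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
        while True:   # runs until interrupted; fine for a quick test
            line = port.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            # Example assumed format: "tilt_x,tilt_y,light"
            fields = line.split(",")
            print(fields)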

2,  Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule? What deliverables do you hope to complete in the next week?

It is on schedule. After the design presentation next week, once our design is approved, I will buy the sensors on our item list and figure out how they work when they arrive.

3, Please list the particular ECE courses (if any) that covered the engineering science and mathematics principles your team used to develop your design? If there are no courses, please tell us how you learned those principles over the last week.

I focus on the circuit part of our project. The ECE courses that are relevant and helpful to the design are 18-220 (Electronic Devices and Analog Circuits) and 18-320 (Microelectronic Circuits).

Team Status Report for Feb 18


  • What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

We still believe the most significant risk is the hand gesture + movement detection system. As a result, we decided to use a demux structure so that our program will either perform gesture recognition or hand tracking. The video feed will be fed into two time-multiplexed models, one for gesture recognition and one for hand tracking. Splitting the functionality allows us to access more pretrained models and hand tracking programs, which greatly reduces the development time. An abstract block diagram is shown below.
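
In addition to the block diagram, here is a rough Python-style sketch of the demux idea: each video frame is dispatched to exactly one of the two models depending on the current mode. The function and mode names are placeholders, not our actual implementation.

    # Sketch: time-multiplexed dispatch of each video frame to one of two models.
    MODE_GESTURE = "gesture"     # sound/command selection mode
    MODE_TRACKING = "tracking"   # play mode

    def recognize_gesture(frame):
        """Placeholder for the pretrained gesture-recognition model."""
        return "unknown_gesture"

    def track_hand(frame):
        """Placeholder for the hand-tracking program."""
        return (0.0, 0.0)

    def process_frame(frame, mode):
        """Send the frame to exactly one model, depending on the current mode."""
        if mode == MODE_GESTURE:
            return recognize_gesture(frame)
        return track_hand(frame)

    # Elsewhere in the program, a user action flips `mode` between the two values.
    print(process_frame(frame=None, mode=MODE_GESTURE))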

  • Were any changes made to the existing design of the system? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

Instead of training our own gesture recognition model, we decided to use a pre-existing one because of time constraints. Since we are not at that step yet, this design change did not create any extra cost for us. We also included backup options for most of the parts that we plan to write on our own in case something goes wrong. We believe that risk mitigation is necessary and that it is always better to have a plan B for everything.

 

  • Provide an updated schedule if changes have occurred.

There was no change to our schedule. After factoring in spring break and other outside events, the schedule is still reasonable for everyone.

 

  • Component that we got to work.

Oscar wrote a basic, cursor-controlled synthesizer that supports real-time playing of 2 notes. The frequency values of those notes are generated lazily using iterator functions and OOP.
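
As an illustration of the lazy-generation idea (a sketch, not Oscar’s actual code), an oscillator object can expose its waveform as a Python generator that yields samples only when they are requested:

    # Sketch: an oscillator whose samples are produced lazily by a generator.
    # Class and parameter names are illustrative only.
    import itertools
    import math

    class SineOscillator:
        def __init__(self, freq, sample_rate=44100, amplitude=0.5):
            self.freq = freq
            self.sample_rate = sample_rate
            self.amplitude = amplitude

        def samples(self):
            """Yield one sample at a time; nothing is computed until it is asked for."""
            step = 2 * math.pi * self.freq / self.sample_rate
            for n in itertools.count():
                yield self.amplitude * math.sin(step * n)

    # Pull only as many samples as needed, e.g. the first five samples of a C4 tone.
    osc = SineOscillator(261.63)
    print(list(itertools.islice(osc.samples(), 5)))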

 

  • The “keep it simple” principle

The major decision we made last week was to split the gesture control part and the hand tracking part into two time-multiplexed programs. That way, we can easily develop the two parts concurrently and make the best use of any pretrained/existing programs. Our instrument has two modes: a sound selection mode (where the user can use gestures to select a particular timbre) and a play mode (where the user plays the instrument with hand positions). We will also include a software demux that allows users to freely switch between these two modes.

Oscar’s Status Report for Feb 18

  • What did you personally accomplish this week on the project? Give files or
    photos that demonstrate your progress.

This week I continued to improve the oscillator program I wrote last week so that it supports mouse-controlled real-time playing. When run, pyautogui first gets the cursor position. If the cursor is on the left side of the screen, the program plays a C4 indefinitely; if the cursor is on the right, the program plays an E4 indefinitely. Here is a short demonstration video of it working.

I spent quite a lot of time fixing the “snaps” when looping clips of a frequency. At first I thought it could be fixed by trimming the generated frequency array so that the difference between the first element and the last is relatively small. However, the snap only got a little bit better. I then realized I had to reduce the software overhead (the actual gap in time) of repeating a frequency array, so I tried to find a Python library that supports looping with the lowest overhead. I tried packages like pyaudio, simpleaudio, and pymusiclooper. However, since most of these libraries loop a WAV file in an infinite loop, the gap between repeats was still noticeable. In the end, I used a library called sounddevice that implements looping using callbacks, and since there is virtually no gap between repeats, the looping is now unnoticeable.
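
For reference, here is a minimal sketch of the callback-based approach with sounddevice (a generic continuous sine tone, not my actual synthesizer). Because the callback keeps refilling the output buffer, there is no per-loop gap.

    # Sketch: gap-free continuous playback with a sounddevice output callback.
    # Plays a plain 440 Hz sine tone for two seconds.
    import numpy as np
    import sounddevice as sd

    SAMPLE_RATE = 44100
    FREQ = 440.0
    phase = 0

    def callback(outdata, frames, time, status):
        """Fill each output block with the next chunk of the sine wave."""
        global phase
        t = (phase + np.arange(frames)) / SAMPLE_RATE
        outdata[:, 0] = 0.3 * np.sin(2 * np.pi * FREQ * t)
        phase += frames

    with sd.OutputStream(samplerate=SAMPLE_RATE, channels=1, callback=callback):
        sd.sleep(2000)  # keep the stream open for two seconds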

  • Is your progress on schedule or behind? If you are behind, what actions will be
    taken to catch up to the project schedule.

I think my progress is on schedule. However, using existing synthesizer software is always a backup plan. In that case, I would only need to figure out how to change its parameters from my own program instead of through its GUI.

  • What deliverables do you hope to complete in the next week?

I plan to add two more features to my oscillator program: real-time cursor-based volume control and real-time vibrato. I plan to implement the vibrato using an envelope that performs amplitude modulation on the generated frequency array.

  • Particular ECE courses that covered the “keep it simple” principle

The most important thing I learned last week was to keep everything simple. After experimenting with looping frequency arrays, I found that the libraries that work best are the ones that require the fewest lines of code to write. I don’t remember classes that explicitly covered this principle, but it is a theme in most programming courses like 18-213 (i.e., keep your code short and simple).

Team Status Report for Feb 11

The most significant risk is the hand gesture/movement detection system. Since we plan to use our own data set and train our own network, it could be more time-consuming than expected. As a contingency plan, we found two pre-trained gesture recognition models online: Real-time-GesRec and hand-gesture-recognition-mediapipe. However, these models support more gestures than we need, so we will only use a subset that maximizes the detection accuracy.
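
As a quick feasibility check of the MediaPipe route, a minimal hand-landmark loop looks roughly like the sketch below. The webcam index and frame count are assumptions, and the gesture classifier that would sit on top of the landmarks is not shown.

    # Sketch: pull hand landmarks from a webcam with MediaPipe Hands.
    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.5)
    cap = cv2.VideoCapture(0)  # assumed default webcam

    for _ in range(100):  # process a short burst of frames for this check
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # Each detected hand gives 21 normalized (x, y, z) landmarks.
            print(len(results.multi_hand_landmarks), "hand(s) detected")

    cap.release()
    hands.close()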

So far there are no changes to the existing general design of the system. However, we did introduce a plan B for some of the subparts of the system. For example, we are considering using a pre-trained model in case we are not able to collect enough data and train our own in a short period of time. This is a necessary precaution to ensure that a problem in one part of the process does not prevent the overall hand gesture recognition component of the system from functioning properly.

No updates to the current schedule have been made. We are still in the process of designing our implementation and finalizing design details.

Although we are still in the research and design stage, we did come across a DIY Python synthesizer tutorial that actually mimics the theremin quite well.

Our project includes environmental and economic considerations. We added an autonomous lighting system so that our instrument can function in a very dim environment. We also want our end product to be significantly cheaper than theremins on the market, which limits our budget to under $400.

Karen’s Status Report for Feb 11

  • This week I spent most of my time researching existing hand gesture recognition models (and part of their corresponding datasets) to observe the kinds of items they consist of and to reevaluate the practicality of collecting our own data. In case we are not able to collect our own data and train on it, I want to find a list of practical existing models that we can pick from.
  • This week I think I was pretty much on schedule in terms of research for the project implementation and design during the first half of the week; then I got COVID, so there was not as much progress as I would have liked in the latter half of the week.
  • Next week (Feb 19) I am planning on finishing our design proposal presentation slides with my group. I will be preparing and delivering the design proposal presentation (presentation 2) for my group.