Karen’s Status Report for Feb 18

  • What did you personally accomplish this week on the project? Give files or
    photos that demonstrate your progress.

Personally I did not accomplish a lot this week. I looked at tutorials on how to develop an app for different operating systems, and how to set up an interactive website that stores user input with cookies. I also created a list of backup models for our recognition function in case our first choices do not work as intended.

  • Is your progress on schedule or behind? If you are behind, what actions will be
    taken to catch up to the project schedule.

I think I am very behind schedule this week because of personal health issues. To make up for the time missed, I will be spending this entire weekend and the next working on capstone-related topics 🙂

  • What deliverables do you hope to complete in the next week?

Next week I hope to get the interactive app/website set up so we can pick a gesture from an existing list and link it to a sound defined by its pitch in Hz, and pick a tilt angle from an existing list and link it to a specific volume level in dB. It will not yet support actually producing the sound, but it should allow gesture mapping and selection, and it will save the selected mappings so they are restored the next time the app/website opens (see the sketch below).
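
One simple way to persist those mappings between sessions is a local JSON file. This is only a hedged sketch under that assumption (the file name, gesture names, and values are all illustrative, and a cookie-based web version would look different):

```python
import json
from pathlib import Path

MAPPING_FILE = Path("mappings.json")  # illustrative location

def load_mappings():
    """Restore saved gesture->pitch (Hz) and tilt->volume (dB) mappings."""
    if MAPPING_FILE.exists():
        return json.loads(MAPPING_FILE.read_text())
    return {"gesture_to_hz": {}, "tilt_to_db": {}}

def save_mappings(mappings):
    MAPPING_FILE.write_text(json.dumps(mappings, indent=2))

mappings = load_mappings()
mappings["gesture_to_hz"]["fist"] = 440.0  # hypothetical gesture -> A4
mappings["tilt_to_db"]["30"] = -12.0       # hypothetical tilt angle -> volume
save_mappings(mappings)
```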

  • ECE courses (if any) that covered the engineering science and mathematics principles your team used to develop your design

Not really? Most of the basic engineering design concepts I use come from my first-year CEE course and my ECE coverage course, 17-313. In recent weeks, our group has also been looking these terms up on Google.

 

Yuqi’s Status Report for Feb 18

1, What did you personally accomplish this week on the project? 

I decided on the tilt sensor that we are going to use in our wearable circuits. It is part #1528-1011-ND from Digi-Key; the link is: https://www.digikey.com/en/products/detail/adafruit-industries-llc/1018/4990760

I also met with my teammates on Monday to discuss our project design. We decided on the gestures our product will need to recognize, and we decided to put wearable circuits on both of our hands and use an Arduino to send the sensor data to the computer (a rough sketch of the computer side is below).
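
A minimal sketch of the computer side of that Arduino link, assuming the pyserial package; the port name, baud rate, and data format are illustrative and depend on the actual setup:

```python
import serial  # the pyserial package

# Port name and baud rate are illustrative; they depend on the machine
# and on what the Arduino sketch is configured to use.
with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as arduino:
    while True:
        # Read one line of sensor data, e.g. comma-separated axis values.
        line = arduino.readline().decode("ascii", errors="ignore").strip()
        if line:
            print("sensor reading:", line)
```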

2,  Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule? What deliverables do you hope to complete in the next week?

It is on schedule. After the design presentation next week, once our design is approved, I will buy the sensors on our parts list and figure out how they work when they arrive.

3, Please list the particular ECE courses (if any) that covered the engineering science and mathematics principles your team used to develop your design. If there are no courses, please tell us how you learned those principles over the last week.

I focus on the circuit part of our project. The ECE courses relevant and helpful to the design are 18-220: Electronic Devices and Analog Circuits and 18-320: Microelectronic Circuits.

Team Status Report for Feb 18


  • What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

We still believe the most significant risk is the hand gesture and movement detection system. As a result, we decided to use a demux structure so that our program performs either gesture recognition or hand tracking at any given time. The video feed is fed into two time-multiplexed models, one for gesture recognition and one for hand tracking. Splitting the functionality gives us access to more pretrained models and hand-tracking programs, which greatly reduces development time. An abstract block diagram is shown below:
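
Alongside the block diagram, here is a rough code-level illustration of the same demux idea (the function and mode names are placeholders, not our actual modules): each video frame is routed to exactly one of the two models.

```python
def run_gesture_recognition(frame):
    ...  # placeholder: the pre-trained gesture recognition model

def run_hand_tracking(frame):
    ...  # placeholder: the hand tracking program

def process_frame(frame, mode):
    """Software demux: every frame goes to exactly one model."""
    if mode == "select":  # sound selection mode
        return run_gesture_recognition(frame)
    return run_hand_tracking(frame)  # play mode
```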

  • Were any changes made to the existing design of the system? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

Instead of training our own gesture recognition model, we decided to use a pre-existing one because of time constraints. Since we have not reached that step yet, this design change did not create any extra cost for us. We also included backup options for most of the parts that we plan to write ourselves, in case something goes wrong. We believe this risk mitigation is necessary; it is always better to have a plan B for everything.

 

  • Provide an updated schedule if changes have occurred.

There was no change to our schedule. After factoring in spring break and other outside events, the schedule is still reasonable for everyone.

 

  • A component that we got to work

Oscar wrote a basic, cursor-controlled synthesizer that supports real-time playing of two notes. The frequency values of those notes are generated lazily using iterator functions and OOP.
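
A hedged sketch of what such a lazily generated oscillator could look like (the class name and parameters are illustrative, not the actual project code):

```python
import itertools
import math

SAMPLE_RATE = 44100

class Oscillator:
    """Lazily yields sine samples one at a time via the iterator protocol."""

    def __init__(self, freq, amp=0.2):
        self.freq = freq
        self.amp = amp

    def __iter__(self):
        step = 2 * math.pi * self.freq / SAMPLE_RATE
        for i in itertools.count():  # infinite, but computed on demand
            yield self.amp * math.sin(step * i)

# Samples are only computed when consumed, so nothing is precomputed.
first_five = list(itertools.islice(iter(Oscillator(261.63)), 5))  # C4
```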

 

  • The “keep it simple” principle

The major decision we made last week was to split the gesture control part and the hand tracking part into two time-multiplexed programs. That way, we can easily develop the two parts concurrently and make the best use of any pretrained/existing programs. Our instrument has two modes: a sound selection mode (where the user uses gestures to select a particular timbre) and a play mode (where the user plays the instrument with hand positions). We will also include a software demux that allows users to switch freely between these two modes.

Oscar’s Status Report for Feb 18

  • What did you personally accomplish this week on the project? Give files or
    photos that demonstrate your progress.

This week I continued to improve the oscillator program I wrote last week so that it supports mouse-controlled real-time playing. When run, pyautogui first gets the cursor position. If the cursor is on the left side of the screen, the program plays a C4 indefinitely; if the cursor is on the right, it plays an E4 indefinitely. Here is a short demonstration video of it working.

I spent quite a lot of time fixing the “snaps” heard when looping clips of a frequency. At first I thought they could be fixed by trimming the generated frequency array so that the difference between the first and last elements was relatively small, but the snap only got a little better. I then realized I had to reduce the software overhead (the actual gap in time) of repeating a frequency array, so I looked for a Python library that supports looping with the lowest overhead. I tried packages like pyaudio, simpleaudio, and pymusiclooper; however, since most of these libraries loop a WAV file in an infinite loop, the gap between repeats was still noticeable. In the end, I used a library called sounddevice that implements looping using callbacks, and since there is virtually no gap between repeats, the looping is now unnoticeable.
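
A minimal sketch of this callback-based approach (the note frequencies, amplitude, and cursor-based switch are illustrative, not the exact project code). Because the phase counter is carried across callbacks, consecutive blocks join seamlessly and there is no snap while a note is held:

```python
import numpy as np
import pyautogui
import sounddevice as sd

SAMPLE_RATE = 44100
C4, E4 = 261.63, 329.63
phase = 0  # sample counter carried across callbacks

def callback(outdata, frames, time, status):
    """Fill each audio block with a continuous sine wave."""
    global phase
    x, _ = pyautogui.position()  # polling here is fine for a sketch
    freq = C4 if x < pyautogui.size().width // 2 else E4
    t = (np.arange(frames) + phase) / SAMPLE_RATE
    outdata[:] = 0.2 * np.sin(2 * np.pi * freq * t).reshape(-1, 1)
    phase += frames  # continuous phase -> no gap between blocks

with sd.OutputStream(channels=1, samplerate=SAMPLE_RATE, callback=callback):
    sd.sleep(5000)  # play for five seconds

# Note: switching notes mid-stream can still click; a short crossfade
# would smooth that out.
```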

  • Is your progress on schedule or behind? If you are behind, what actions will be
    taken to catch up to the project schedule.

I think my progress is on schedule. However, using existing synthesizer software is always a backup plan; in that case, I would only need to figure out how to change its parameters through my own program instead of through its GUI.

  • What deliverables do you hope to complete in the next week?

I plan to add two more functionalities to my oscillator program: real-time cursor-based volume control and real-time vibrato. I plan to implement the vibrato using an envelope that performs amplitude modulation on the generated frequency array (strictly speaking, amplitude modulation produces tremolo, while true vibrato modulates the frequency; a sketch of both is below).
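
A hedged numpy sketch of both effects, with illustrative rates and depths:

```python
import numpy as np

SAMPLE_RATE = 44100

def vibrato(freq, duration, rate=5.0, depth=0.01):
    """Sine wave whose pitch wobbles around `freq` (true vibrato).

    `rate` is the wobble speed in Hz; `depth` is the relative pitch
    deviation (0.01 = +/- 1%).
    """
    t = np.arange(int(duration * SAMPLE_RATE)) / SAMPLE_RATE
    inst_freq = freq * (1 + depth * np.sin(2 * np.pi * rate * t))
    # Integrate the instantaneous frequency to keep the phase continuous.
    phase = 2 * np.pi * np.cumsum(inst_freq) / SAMPLE_RATE
    return 0.2 * np.sin(phase)

def tremolo(samples, rate=5.0, depth=0.3):
    """Amplitude-modulation envelope (what the plan above describes)."""
    t = np.arange(len(samples)) / SAMPLE_RATE
    return samples * (1 - depth * (0.5 + 0.5 * np.sin(2 * np.pi * rate * t)))
```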

  • Particular ECE courses that covered the “keep it simple” principle

The most important thing I learned last week was to keep everything simple. After experimenting with looping frequency arrays, I found that the libraries that work best are the ones that require the fewest lines of code. I don’t remember any classes that explicitly covered this principle, but it is a theme in most programming courses, like 18-213 (i.e., keep your code short and simple).

Team Status Report for Feb 11

The most significant risk is the hand gesture/movement detection system. Since we plan to build our own data set and train our own network, it could be more time-consuming than expected. As a contingency plan, we found two pre-trained gesture recognition models online: Real-time-GesRec and hand-gesture-recognition-mediapipe. These models support more gestures than we need, so we will only use the subset that maximizes detection accuracy.
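
For the MediaPipe-based option, here is a hedged sketch of the hand-landmark pipeline that the pre-trained project builds on (the camera index and confidence threshold are illustrative; the actual project adds a gesture classifier on top of the landmarks):

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
cap = cv2.VideoCapture(0)  # default webcam

with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB frames; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # 21 (x, y, z) landmarks per detected hand; a gesture
            # classifier would take these as its features.
            for hand in results.multi_hand_landmarks:
                print(hand.landmark[0])  # wrist landmark, for example

cap.release()
```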

So far there are no changes to the existing general design of the system. However, we did introduce a plan B for some of the subparts of the system. For example, we are considering using a pre-trained model in case we are not able to collect enough data and train our own in a short period of time. This is a necessary precaution to ensure that a problem in one part of the process does not keep the overall hand gesture recognition component from functioning properly.

No updates have been made to the current schedule. We are still in the process of designing our implementation and finalizing design details.

Although we are still in the research and design stage, we did come across a DIY Python synthesizer tutorial that actually mimics the theremin quite well.

Our project includes environmental and economic considerations. We added an autonomous lighting system so that our instrument can function in very dim environments. We also want our end product to be significantly cheaper than theremins on the market, which limits our budget to under $400.

Karen’s Status Report for Feb 11

  • This week I spent most of my time researching existing hand gesture recognition models (and parts of their corresponding databases) to see what kinds of items they consist of and to reevaluate the practicality of collecting our own data. In case we are not able to collect and train on our own data, I want a list of practical existing models we can pick from.
  • I was pretty much on schedule in terms of researching the project implementation and design for the first half of the week; then I got COVID, so there was not as much progress as I would have liked in the second half.
  • Next week (Feb 19) I am planning to finish our design proposal presentation slides with my group. I will be preparing and delivering the design proposal presentation (presentation 2) for my group.

Yuqi’s Status Report for Feb 11

1, What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress.

I researched the tilt sensor that will be placed in the wearable circuit and learned how these sensors work: tilt sensors measure the tilt angle by comparing the device’s orientation to the direction of gravity. I chose some tilt sensors that we could use in our project (a sketch of the gravity-based angle calculation follows the list):

  • https://www.robotshop.com/collections/tilt-sensors
  • https://www.amazon.com/Accelerometer-Acceleration-Gyroscope-Electronic-Magnetometer/dp/B07GBRTB5K
  • https://vetco.net/products/wearable-sewable-tilt-switch
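
As a hedged illustration of the gravity comparison (assuming a three-axis accelerometer-style tilt sensor reporting readings in g; the function and variable names are mine, not from any of the products above):

```python
import math

def tilt_angles(ax, ay, az):
    """Pitch and roll in degrees from accelerometer readings in g.

    At rest, the sensor measures only the gravity vector, so the tilt
    of each axis can be recovered from the ratios of its components.
    """
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    return pitch, roll

print(tilt_angles(0.0, 0.0, 1.0))  # flat and level -> (0.0, 0.0)
```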

2,  Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

It is on schedule. My concentration is circuits, and I will focus on the circuit design in the following week.

3, What deliverables do you hope to complete in the next week?

We will do further research on these sensors and choose one. After choosing the sensor, I will design the circuit and learn how to use the sensor (how to save the data and upload it to the Arduino).

Oscar’s Status Report for Feb 11

Besides preparing for the presentation on Wednesday, I was researching potential software/code for our synthesizer. Specifically, I was looking for the following functionalities:

  • Supports playing sounds of different frequencies and volumes.
  • Can apply at least one adjustable musical effect.
  • Supports at least two different tone colors/instrument sounds.

I tried building a DIY synthesizer using Python libraries like pyo and synthesizer, and C++ libraries like STK. After wrestling with import errors and dependency issues, I decided to build a synthesizer based on this post. It is a step-by-step tutorial that teaches you how to build an oscillator, a modulator, and a controller in Python. I modified the code provided in the post and wrote a simple Python program that can generate numpy arrays representing sounds of arbitrary frequencies. Currently, the program plays a “C4-E4-G4” sequence on repeat.
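
A hedged sketch of that kind of array generation (the note durations and amplitude are illustrative, not the exact program):

```python
import numpy as np

SAMPLE_RATE = 44100
NOTES = {"C4": 261.63, "E4": 329.63, "G4": 392.00}

def tone(freq, duration=0.5, amp=0.3):
    """Return a numpy array of sine-wave samples at `freq` Hz."""
    t = np.arange(int(duration * SAMPLE_RATE)) / SAMPLE_RATE
    return amp * np.sin(2 * np.pi * freq * t)

# One cycle of the repeating C4-E4-G4 sequence as a single array.
sequence = np.concatenate([tone(NOTES[n]) for n in ("C4", "E4", "G4")])
```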

My progress is on schedule, and I am planning to upgrade the synthesizer program so that it responds to real-time inputs (for example, mouse movement). I also need to fix the audio “snaps” problem: when looping a frequency array, the difference between the first and last elements causes the synthesizer to play a snapping sound. I need to find a way to smooth out the snaps and make the looping unnoticeable.