Jordan’s Status Report for 4/6

This week, I finished assembling the fingering collection system. All sensors are installed and tested, so the remaining work is software. Performance testing is also complete, and the results are within the use-case requirements.

For next week, I will start working on integration with the rest of the system. I will need to map all sensor data to actual fingering data for the web server, as well as develop a way to coordinate the recording of fingering and audio together for the system to analyze. The biggest issue is synchronization: the audio and fingering are collected on different devices, so the timing will be important. All of this is based on the discussion I had with my team earlier this week, in which we finalized the specific format of the outputs.
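
As a rough sketch of the synchronization approach I have in mind (the event format, field names, and function below are placeholders, not the finalized output format), the fingering timestamps could be shifted onto the audio recording's clock using the start times captured on each device:

    # Hypothetical sketch: shift fingering timestamps onto the audio clock.
    def align_fingering_to_audio(fingering_events, fingering_start, audio_start):
        """fingering_events: list of dicts like
             {"keys": [0, 1, 0, 1, 1, 0], "start": 12.31, "end": 12.58}
           with times in seconds on the fingering device's clock.
           fingering_start / audio_start: wall-clock times captured on each
           device when its recording began (e.g. from time.time())."""
        offset = fingering_start - audio_start
        return [
            {**event, "start": event["start"] + offset, "end": event["end"] + offset}
            for event in fingering_events
        ]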

I am still on track according to last week's schedule. My individual part is mostly complete; what remains is integration.

Junrui’s Status Report for 4/6

This week our team discussed the modifications we need to make to each individual part to move toward integration. I changed the logic of my app so that the user plays after clicking Start and then clicks Replay to display the diagrams and feedback generated from the fingering and pitch data in the practice session. Since the expected outputs from our pitch detection and fingering systems contain start and end timestamps for each fingering/note, I was able to tolerate a mismatch of up to 0.5 s between the synchronized reference data and user data, so users can play in a way that is not perfectly aligned with the reference.
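
As a rough sketch of the tolerance check (the dictionary fields below are only my assumed shape for the pitch and fingering outputs), a user note counts as matching a reference note when the pitches agree and both timestamps differ by at most 0.5 s:

    TOLERANCE = 0.5  # seconds of allowed timing mismatch

    def matches_reference(user_note, ref_note, tolerance=TOLERANCE):
        # Both arguments are assumed to look like {"note": "A4", "start": 1.02, "end": 1.48}.
        return (
            user_note["note"] == ref_note["note"]
            and abs(user_note["start"] - ref_note["start"]) <= tolerance
            and abs(user_note["end"] - ref_note["end"]) <= tolerance
        )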

I am on track this week. Next week I will test with the actual results from the other parts to see whether the web app functions as expected, and also check whether the 0.5 s threshold is suitable for real-world data.

Jordan’s Status Report for 3/30

This week, I mainly focused on a design change. I changed the cables used to connect the sensors to the mainboard from plain wires to DuPont-style connectors, which are easier to work with and allow easier disassembly. I also measured the distances between the keys and the board and estimated the magnet locations to ensure proper detection. Assembly onto the saxophone will begin once the new wires arrive, and this will be my task for next week.

The software is also coming along, and as a team we will decide the transmission format by next week. Right now, the plan is to send both the raw fingering data and the note information to the web app, but this will be finalized next week.
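
As a placeholder illustration of what one transmission payload could look like (all field names here are hypothetical until we finalize the format next week):

    # Hypothetical payload combining raw fingering data and decoded notes.
    example_payload = {
        "session_id": "example-session",
        "fingerings": [
            # one entry per fingering: raw key states plus timing in seconds
            {"keys": [1, 0, 0, 1, 1, 0], "start": 0.00, "end": 0.42},
        ],
        "notes": [
            # note name decoded from the fingering, with the same timing
            {"note": "G4", "start": 0.00, "end": 0.42},
        ],
    }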

Schedule changes are reflected in the team report.

Team Status Report for 3/30

Our major concern right now is that the subsystems are not yet finalized but the demo is next Monday. Lin is still working on code for pitch detection, and Junrui is still finishing the implementation of the web app practice page. We'll do what we can to polish our parts before then. Another concern is that we might not have enough time to integrate the subsystems together. We're still discussing what platform to use for web app deployment. After we finish the demo, we'll start working on integration immediately.

We have updated our schedule as shown in the graph. There is not much change compared to the original, but we have postponed the integration process by one week.

Lin’s Status Report for 3/30

This week I mainly worked to finalize the audio processing part in preparation for the coming demo. I kept working on the integration of rhythm and pitch. I tested an input audio recording made by Jordan and noticed some issues. The major issue is that environmental noise causes some pitch detection inaccuracy, and the current band-pass filter I applied does not seem to filter out the noise. I am still revising the code to reduce the effect of the noise.

I am on track of schedule. I’ll keep working on my subpart to finalize it before the demo and start integrating the system with webapp frontend once we finish the demo.

Junrui’s Status Report for 3/30

This week I continued to work on my web app part. As I mentioned in the meeting, I was struggling with the misplacement of diagram parts and the strange display of the user interface. Since I had exams on Wednesday and interviews after that, I didn't have much time to fix the problem. The problem still persists, and I am considering researching some graphics libraries to solve it.

Since our group has decided on a new schedule, I am almost on track this week. Next week is the interim demo week, and after the interim demo, I plan to work with my teammates to start the integration.

Team Status Report for 3/23

The most significant risk we face now is that integration may start later than we planned in the schedule, so if problems occur during integration, we may not have enough time to fix them. The contingency plan is to ensure the system functions after integration but lower the metrics we need to reach. We are also trying to refine each component to make sure it can work independently, which we believe will greatly reduce the burden of debugging mistakes during integration.

No changes were made to our existing design this week.

Junrui's Status Report for 3/23

This week I managed to display different text and diagrams based on different inputs. I hardcoded the user input as a vector, where 0 indicates a key is unpressed and 1 indicates a key is pressed, and the diagram and the feedback for correcting the fingering change based on this vector. The practice page is now almost finished.
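
As a simplified sketch of how the vector drives the feedback (the key ordering and the fingering table below are made-up placeholders, not the real fingering chart):

    # Placeholder fingering table: note name -> expected 0/1 key vector.
    CORRECT_FINGERINGS = {
        "C4": [1, 1, 1, 1, 1, 1],
        "G4": [1, 1, 1, 0, 0, 0],
    }

    def feedback_for(target_note, user_vector):
        # Compare the user's key vector against the expected fingering.
        expected = CORRECT_FINGERINGS[target_note]
        wrong_keys = [i for i, (e, u) in enumerate(zip(expected, user_vector)) if e != u]
        if not wrong_keys:
            return "Correct fingering!"
        return f"Check keys {wrong_keys} for {target_note}."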

I am on track this week. Next week I am planning to link the database and implement the history statistics, and also check local hosting methods. If time permits, I will work with Lin to start integration with the pitch detection system.

Lin’s Status Report for 3/23

This week I worked on the integration of rhythm processing and pitch detection. The integration takes the rhythm output, which is a list of 0s and 1s, and the pitch output, which is a list of musical notes, then pairs each note with its rhythm and stores the result in a dictionary. I also worked on pre-processing the input audio by adding a band-pass filter to keep the signal in the range of 300-2500 Hz (these values are still under testing and may change).
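
As a minimal sketch of the band-pass pre-processing step (using SciPy; the 300-2500 Hz cutoffs are the values mentioned above and are still subject to tuning):

    from scipy.signal import butter, sosfiltfilt

    def bandpass(audio, sample_rate, low_hz=300.0, high_hz=2500.0, order=4):
        # Keep only the 300-2500 Hz band of the input signal.
        sos = butter(order, [low_hz, high_hz], btype="bandpass",
                     fs=sample_rate, output="sos")
        return sosfiltfilt(sos, audio)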

I am on schedule. Next week, I'll start testing the system with real saxophone input played by Jordan. I will check whether it meets the accuracy requirement we set in the proposal and adjust my code based on the results.

Jordan's Status Report for 3/23

This week, I completed more soldering work for the saxophone fingering system. Soldering the sensors took longer than expected due to their small size, and fitting the sensor PCBs into their small form factor also took longer and caused a minor injury. Because of this, completion of the system will be pushed back one week, to two weeks from now. However, I have also completed testing of all the work done so far, which contributed to the delay but means less testing down the line. Specifically, I made sure that each sensor worked and that the response time was within spec. The response time is not noticeable in the typical use case, and whatever delay there is will not impact the user experience.

For next week, I aim to finish installing the magnets and sensors onto the saxophone.