Team Status Report for 10/26/24

This week was mostly spent on individual work. Fiona worked on the UI responses that update the MIDI file in the system, located a MIDI-to-sheet-music conversion program, and began mapping eye coordinates to responses in the code. Shravya worked on the MIDI-to-firmware conversion code and devised small-scale testing plans to verify the functionality of the components that will arrive this week. Peter worked on the 3D model for the box that will hold the components of our product.

More specifics are in the individual progress reports.

Our current major risk is that we are behind schedule, but we allocated slack time at the beginning of the semester for this scenario.

Next week the group will work on the ethics assignment together by meeting to discuss our responses. 

Peter’s Status Report for 10/26/24

This Week

This week was mainly spent coming up with measurements for, and modeling, the 3D case for our product (see Figure 1). Part of this involved modeling the Adafruit 412 solenoids in SolidWorks (see Figure 2), since models could not be found online. The solenoid's travel length was initially a concern, since it is only 4mm while a keyboard's keys have a travel length of 10mm. However, by positioning the solenoids on the keyboard such that the keys are always depressed by 5mm, the solenoids are able to activate the piano keys with only 4mm of travel. In short, the ability of the Adafruit 412 solenoids to depress the piano keys far enough to activate them is no longer a concern.
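As a quick sanity check on the travel budget (the key's actuation point is an assumption on my part, not a measured value):

```latex
% Maximum key depression with each key pre-depressed 5 mm by mounting position:
d_{\max} = d_{\text{rest}} + d_{\text{stroke}} = 5\,\text{mm} + 4\,\text{mm} = 9\,\text{mm} \leq 10\,\text{mm}
```

So as long as a key registers a press at or before 9mm of its 10mm travel, the 4mm stroke is sufficient.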

 

Figure 1: Solenoid case

 

Figure 2: Adafruit 412 3D model

 

Next Week

Next week, if the rest of the materials arrive, I will test our solenoid circuitry with Shravya. Otherwise, I will focus solely on implementing the eye-tracking software.

Fiona’s Status Report for 10/26/2024

This Week

This week, I worked on the assigned reading and write-up for the ethics assignment.

MIDI File Responses

I also started working on the backend for the MIDI file saving and editing. I first tested with the Mido Python library [1], which has functions for MIDI file editing. Using the frontend framework I made with Tkinter a few weeks ago, I was able to write a sequence of note pitches and lengths into a MIDI file by button press. In order to do so, I found a MIDI file spec that details the way the files are formatted and how to designate pitch and tempo [2].

One issue I am struggling with right now for the MIDI file saving backend is determining the best way to allow the user to edit the MIDI file name and location, while still keeping our UI simple and easy to use. I will continue to work on that problem.

Secondary UI

This week, I also identified some code we may be able to use in order to convert the MIDI file data into sheet music [3]. This code is a simplified version of another project [4], but I might try to simplify it further for our project.

User Interface

I made some minor edits to the user interface framework this week. For the time being, I am continuing to code the UI using the Python library Tkinter, but if I run into trouble later on, I plan to try using Django.

Mapping Coordinates to the UI

The last thing I started working on this week was the code that maps eye coordinates to command responses on the UI. Previously, I had been testing UI responses with button presses, since we have not finished implementing eye tracking yet, but once we do, I want to have code ready to connect the eye tracking to the UI responses.
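A sketch of the kind of mapping I have in mind (the button rectangles and command names here are placeholders, since the real UI layout is not finalized):

```python
# Map a gaze coordinate to a UI command by hit-testing button rectangles.
# Rectangles are (x, y, width, height) in screen pixels; all geometry below
# is placeholder layout, not our actual UI.
BUTTONS = {
    "add_note":    (50,  50, 200, 80),
    "delete_note": (50, 150, 200, 80),
    "save_file":   (50, 250, 200, 80),
}

def command_at(x, y):
    """Return the command whose button contains (x, y), or None."""
    for name, (bx, by, bw, bh) in BUTTONS.items():
        if bx <= x < bx + bw and by <= y < by + bh:
            return name
    return None
```

The point is that the same function works whether (x, y) comes from a mouse click during testing or from the eye tracker later.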

Next Week

The MIDI file backend is not complete: I still need to add file saving/opening functionality, as well as the ability to move the cursor backward and forward in the piece.

Next week, I will also begin working on the UI responses (e.g., something to indicate that the user has pressed a button, error messages, etc.), which means I will need to do more work to finalize the UI.

I also plan to begin integrating the MIDI to sheet music conversion code with ours next week and continue working on mapping coordinates to responses.

I am still behind schedule this week, but I have made considerable progress in the areas that I wanted to.

References

[1] Overview. Mido – MIDI Objects for Python. https://mido.readthedocs.io/en/stable/index.html

[2] Standard MIDI-File Format Spec 1.1 (updated). McGill University. http://www.music.mcgill.ca/~ich/classes/mumt306/StandardMIDIfileformat.html

[3] BYVoid. (2013, May 9) MidiToSheetMusic. GitHub. https://github.com/BYVoid/MidiToSheetMusic

[4] Vaidyanathan, M. Convert MIDI Files to Sheet Music. Midi Sheet Music. http://midisheetmusic.com/

 

Shravya’s status report for 10/26/2024

This week I worked on my ethics assignment, spent some time with my teammates processing the feedback we got from staff regarding our design report, worked on my MIDI to firmware conversion code, and devised some small-scale testing plans to ensure the components we ordered will work correctly.

Initial components testing

I am waiting for some more components (for the solenoid control circuitry) to arrive this week. Once they do, I will probably spend an evening running tests such as the following:

 

Test the Solenoid

  • Activation Test: Connect the solenoid directly to the power supply and confirm it activates within the specified current range.
  • Power Consumption: Measure the actual current draw and compare it with specifications.
  • Thermal Testing: Run the solenoid for an extended period and monitor for any significant heating to ensure it can sustain operation without overheating.
  • Reaction Time: Measure any delay between sending the activation signal and the solenoid's physical response.

 

Test flyback diode reliability

  • With the diode installed, use an oscilloscope to measure the voltage at the MOSFET drain (or across the solenoid) when the circuit is deactivated. Record the voltage spike when power to the solenoid is switched off, and note how effectively the diode suppresses the back EMF.
  • One piece of feedback I received on a section I wrote for the design report was that the equation I used for energy dissipation was not essential, and that I should instead have included a Vspike equation to quantify the back-voltage. I agree with this feedback; when I run physical tests, I will record Vspike, note how it may affect other quantities, and include this in the final report.
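As a rough sketch of the quantity in question (an idealized relation, not a measurement): without the flyback diode, interrupting the solenoid current over a switch-off time produces a spike on the order of

```latex
V_{\text{spike}} \approx L\,\frac{dI}{dt} \approx L\,\frac{I_{\text{on}}}{\Delta t_{\text{off}}}
```

With the diode installed, the voltage across the switch is instead clamped near the supply voltage plus the diode's forward drop, which is what the oscilloscope measurement should confirm.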

 

Test the MOSFET

  • Switching Test: Use a low-frequency signal from the function generator to control the MOSFET, checking that it switches the solenoid on and off without issues.
  • Thermal Stability: Measure temperature over a few switching cycles. Confirm that the MOSFET consistently stays within a safe temperature range.

 

My progress on MIDI to firmware conversion code

I have gotten a grasp on how to parse MIDI files with Python's Mido library to extract note events. I am also able to generate C-compatible data by outputting an array of structs in a C header file. I will keep fine-tuning this as necessary, and this week I will also work on writing the final "integration" code, i.e., using the C array to send commands from the STM32 to actuate the solenoids. I have attached a screenshot of some of my code.
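The header-generation step can be sketched roughly as follows (the struct name, field names, and sample events are placeholders; in the real code, the event list would come from Mido parsing rather than being hard-coded):

```python
# Turn a list of note events into a C header containing an array of structs,
# ready to compile into the STM32 firmware.
# Each event is (midi_note, start_ms, duration_ms); the values are placeholders.
EVENTS = [(60, 0, 500), (64, 500, 500), (67, 1000, 1000)]

def to_c_header(events, array_name="note_events"):
    lines = [
        "// Auto-generated from a MIDI file; do not edit by hand.",
        "typedef struct { unsigned char note; unsigned int start_ms; unsigned int dur_ms; } NoteEvent;",
        f"const NoteEvent {array_name}[] = {{",
    ]
    for note, start, dur in events:
        lines.append(f"    {{ {note}, {start}, {dur} }},")
    lines.append("};")
    lines.append(f"const unsigned int {array_name}_len = {len(events)};")
    return "\n".join(lines)

print(to_c_header(EVENTS))
```

Generating a plain constant array like this keeps all the parsing on the laptop, so the STM32 only walks the array and toggles solenoids at the scheduled times.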

I am on schedule. I hope to have the bulk of my component testing finished this week, as well as bare-bones functioning MIDI-to-firmware conversion code (I want it to function, save for edge cases and minor bugs). I will also be working more on the ethics assignment this week after having the pod discussion in lecture.

Peter’s Status Report for 10/20/2024

This Week

This week was spent working on the design report. Beyond the design report, I have begun working on the 3D printed solenoid case.

Fall Break

I started doing further research into how to implement eye-tracking. I was planning on completing more over Fall Break, but I was occupied with other work.

Next Week

Currently my progress is behind schedule. To catch up, I plan on focusing on eye-tracking implementation next week and, if the solenoids have arrived, testing the solenoids to ensure they work with our design. If they do, we will order enough for the full octave. I hope to have a rough eye-tracking implementation completed by the end of next week. I will pause work on the 3D printed solenoid case until after the solenoids have been confirmed to work for our design.

Team Status Report for 10/20/2024

As a team, we spent most of Week 7 working on our design report. This was a time-intensive process: we had to do a lot of research, discuss the facets of our design, testing, and purpose within our group, and work out how to convey all of that information efficiently and understandably in the report.

Because of this, our main risk right now is that we have fallen behind schedule. We had planned to have made further progress with the eye-tracking software and application at this point in the semester. However, we did allot weeks 12-14 and finals week for slack time, so we are hopeful that we have ample time to catch up and complete the tasks we have set for ourselves.

While working on the design report, we updated our Gantt chart, see below.

Global Considerations, Written by Shravya

Our solution goes beyond local or technological communities. Globally, there are millions of people living with disabilities (particularly disabilities relating to hand/arm mobility) who may not have access to specialized instruments or music-creation tools tailored to their specific needs. These people exist across various socio-economic and geographical (urban vs. remote) contexts. Our solution offers a low-cost, technologically accessible means of composing and playing music, making it a viable option not just in academic or well-funded environments but also in regions with limited access to specialized tools. By providing compatibility with widely used MIDI files, minimal physical set-up, and an eye-tracking interface that we aim to make as intuitive as possible, we hope users around the world will be able to express themselves musically without extensive training or high-end technology.

Cultural Considerations, Written by Fiona

Music is an important part of many cultural traditions. It is a tool for communication, and allows people to share stories and art across generations and between cultures. For example, many countries have national anthems that can be used to communicate what is important to that country. Broader access to music is thus important to many people, because it would allow them to participate in their culture or others, if they wished. Recognizing the importance of music for many individuals and groups, we hope that our project can be a stepping stone for more accessibility to musical cultural traditions.

Environmental Considerations, Written by Peter

By utilizing laptop hardware that users are likely to already own, we reduce the amount of electronic waste, which can be toxic and non-biodegradable [1], that our product could create. Along with this, by working to minimize our product's power consumption, we are minimizing its contribution to the pollution that results from non-renewable energy sources.

References

[1] Geneva Environment Network. (2024, October 9). The Growing Environmental Risks of E-Waste. Geneva Environment Network. https://www.genevaenvironmentnetwork.org/resources/updates/the-growing-environmental-risks-of-e-waste/ 

Shravya’s Status Report for 10/20/2024

Week before fall break

All of my time this week was focused on completing the design report. We met on four of the days, and there was plenty of individual work on top of that.

Fall break

Over fall break, I was extremely occupied with a project and homework for my 18421 class.

Next week

I am waiting on a preliminary order of solenoid circuit components; as soon as they arrive, I expect to spend a few hours testing them with a power supply and oscilloscope. I have come to realise that testing them physically will prove more fruitful than Cadence simulations: Cadence is too ideal, and it takes time to introduce parasitics that make components behave more realistically. That is simply not a good use of my time when I could be working with the real components, and Joshna and Professor Bain agreed with me on this in our most recent meeting.

This week I will also start writing code for MIDI-to-firmware conversion using the Python Mido library.

I have indeed fallen slightly behind schedule, but I am confident I can get back up to speed this week. Besides, we allocated about two weeks of slack time at the end of the semester in our Gantt chart, some of which can absorb this delay.

Fiona’s Status Report for 10/20/2024

This Week

All of my time this week was spent working on the design report. I researched eye tracking in general and its use for musical composition. I was able to find quite a few other projects with goals and/or functionalities similar to ours that can inform our project and its goals [1], such as EyeHarp [2], E-Scape [3][4], and an academic project called EyeMusic [5]. I described all of these, and their relation to our project, in more depth in the design report.

Other than the research and work I did for the introduction and related works, I also worked on several other sections of the report, including the use case and design requirements, (software) system implementation, testing, risk mitigation, and summary sections.

Fall Break

I planned to catch up on some of my previous tasks during Fall break. Specifically, I wanted to work on the frontend and determine how to map eye-coordinates to commands. However, I was unable to due to other commitments.

Next Week

Since I have fallen behind our Gantt chart schedule, my main goal this week will be catching up on my work. To do so, I will continue working on the frontend, which will include deciding whether to continue coding with Tkinter or to switch to Django, as one of our advisors suggested. I will also begin work on mapping eye coordinates to commands.

There are also a couple of other tasks I need to start on this week: creating responses to commands that update the MIDI file, and integrating an existing MIDI-to-sheet-music program with our system. Further, the ethics assignment is due this week, so I will have to work on that.

This is a fair amount of work to do in one week, so I don’t expect to finish all of these tasks completely, but I will start on all of them so I have an idea of how long they will take overall, and perhaps adjust the Gantt chart from there.

References

[1] O’Keeffe, K. (2020, Feb 20). Eyegaze for Musical Expression. Assistive Technology and Me. https://www.atandme.com/eyegaze-for-musical-expression/

[2] FAQs. EyeHarp. https://eyeharp.org/faq/

[3] Anderson, T. Eyegaze music. Inclusive Music. https://www.inclusivemusic.org.uk/using-e-scape-with-eyegaze/

[4] Drake Music. (2014, Nov 24). EyeGaze composing with E-Scape. Vimeo. https://vimeo.com/112689731

[5] Hornof, A. J., & Sato, L. (2004). EyeMusic: Making Music with the Eyes. Proceedings of the International Conference on New Interfaces for Musical Expression, 185–188. https://doi.org/10.5281/zenodo.1176613

Peter’s Status Report for 10/05/2024

This Week

The majority of this week was spent preparing for the design review presentation. As part of this preparation, Shravya and I went over our designs for the solenoid control circuit and concluded that her design, using an NMOS for low-side switching, would be best. If we used a PMOS, the design could be shifted to work as a high-side switch, but we decided to go with circuitry more closely resembling the low-side switch presented in the Adafruit 412 design document [1].

Shravya and I also reviewed our designs for the power regulation. We decided to use a 12V DC power supply adaptor that plugs into a wall outlet to be the main source of power to our solenoids and the Nucleo32 board. We will also use a 3.3V power regulator to decrease the voltage from 12V to 3.3V for an input to the Nucleo32 board.

Currently, my progress is behind schedule. The eye-tracking implementation that identifies which major section of the screen a user is looking at (imprecise eye tracking) has not been developed yet. Additionally, the design for the 3D-printed solenoid case, which should have been done to make up for the solenoids not being testable this week, has not been made yet either. To make up for this, the design for the 3D-printed solenoid case will be completed on Sunday (October 6th, 2024), and the imprecise eye tracking will be worked on this upcoming week. If the imprecise eye tracking is not completed this upcoming week, I will continue working on it over Fall Break (October 13th to October 19th, 2024).

 

Next Week

This upcoming week, I will complete the first design for the 3D-printed solenoid case and begin implementing the eye-tracking software. My main goal for the eye-tracking software next week is to identify major sections of the screen, such as breaking the screen into four equally sized quadrants and determining which one the user is looking at. In later weeks, I will make the eye tracking precise enough to identify the user looking at the different commands displayed on the screen. Additionally, if the Adafruit 412 solenoids arrive, I will begin testing them with the team to ensure our design works and that the solenoids are capable of pressing the keys of a piano.
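The quadrant step can be sketched in a few lines (the screen resolution here is a placeholder; the real value would be queried from the OS):

```python
# Classify a gaze point into one of four equal screen quadrants.
# 1920x1080 is a placeholder resolution, not a measured value.
SCREEN_W, SCREEN_H = 1920, 1080

def quadrant(x, y):
    """Return 'top-left', 'top-right', 'bottom-left', or 'bottom-right'."""
    vert = "top" if y < SCREEN_H / 2 else "bottom"
    horiz = "left" if x < SCREEN_W / 2 else "right"
    return f"{vert}-{horiz}"
```

Starting with coarse regions like this lets us validate the eye tracker's accuracy before the UI's exact button layout exists.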

Next week, the Design Report will also be completed.

 

[1] “412_Web.” Adafruit Industries, 12 Oct. 2018. https://mm.digikey.com/Volume0/opasdata/d220001/medias/docus/21/412_Web.pdf

Team Status Report for 10/05/2024

A lot of time this week was spent preparing for our design review presentation. This meant refining and formalizing our ideas for what the software and hardware would both look like. This is a good starting point for us as we write our design report document next week.

One of our main risks right now is the interdependency of different parts of the project. Since there are many subsystems that rely on one another in different ways, it would be difficult to develop some of them without considering how, or whether, the others are working. To combat this risk, we have been finding ways to divide the project into more explicit sections, at least for the initial design. For example, we expect to do initial testing of the eye-tracking accuracy via sub-sections of the screen, which does not require the UI to be completely finalized, and to do initial programming of the UI with button presses, which creates backend functionality without needing eye tracking to be perfectly accurate (see Fiona's report for more on this). We will continue to consider this in our design going forward.

We’ve always known that we need to convert the MIDI file output from the eye-tracking system into a format suitable for the STM32, but we only fleshed out the details while preparing for our design presentation this week. This step (the very first step in the “processing path” diagram below) isn’t just a trivial file reformatting or conversion into a different language. We need to process and extract key information, like the scheduling of solenoid on/off events and the duration of each note. We will also need to filter out anything we won’t be implementing with our solenoid actuation, like note dynamics, which may have been assigned default values in the MIDI file. For efficiency, we want the STM32 to handle execution, not computation, meaning it should only receive the most essential data. So, instead of parsing the raw MIDI file on the STM32, a task that would be somewhat computationally heavy, we will use Python’s Mido library on our local machine.

Last week we placed an order for two push-pull solenoids (Adafruit 412) for testing. We expected them to arrive this week, but they have been delayed. When they do arrive, hopefully sometime this week, we will continue with the plan identified in our last weekly report: determining whether they are sufficient for our project requirements and ordering more if they are.