Fiona’s Status Report for 11/02/2024

This Week

Ethics

On Sunday, I met with my group to discuss the ethical considerations of our project, and discussed again with our classmates on Monday for an outside perspective.

Secondary UI (Sheet Music Updates)

I downloaded the code I identified last week as being a candidate for MIDI to sheet music conversion [1][2], and also Mono, the framework the author used [3]. I had to make one simple edit to the makefile in order for the program to run, but otherwise the code was compatible with the most current version of Mono, despite being 11 years old.

From there, it was fairly straightforward to implement the functionality to convert the MIDI file to sheet music on button press. To finish this task, I added some code so that the image updates in the system each time the user makes an edit. For file saving and running the executable, I used the os module in Python [4].
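As a rough illustration, a minimal sketch of that button handler is below; the file names, the converter’s command-line arguments, and the widget layout are assumptions for illustration, not our exact code.

    import os
    import tkinter as tk

    # Hypothetical file names and converter invocation; the real paths and
    # arguments depend on our project layout and the build of [1].
    MIDI_PATH = "composition.mid"
    SHEET_IMAGE_PATH = "composition.png"
    CONVERTER = "MidiToSheetMusic.exe"  # executable built from [1], run under Mono

    root = tk.Tk()
    sheet_label = tk.Label(root)
    sheet_label.pack()

    def refresh_sheet_music():
        # Re-run the converter on the saved MIDI file, then reload the image.
        os.system(f"mono {CONVERTER} {MIDI_PATH} -o {SHEET_IMAGE_PATH}")
        if os.path.exists(SHEET_IMAGE_PATH):
            photo = tk.PhotoImage(file=SHEET_IMAGE_PATH)
            sheet_label.configure(image=photo)
            sheet_label.image = photo  # keep a reference so Tkinter does not discard it

    tk.Button(root, text="Update sheet music", command=refresh_sheet_music).pack()
    root.mainloop()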

Below is an example of what the sheet music output might look like for a simple sequence of notes.

Seeing the sheet music in front of me made me realize that my program had been saving the MIDI notes in the wrong way. The note pitches appeared to be correct, but the lengths were not, and rests appeared that I had not placed.

MIDI File Updates [9]

Because of this, I had to go back into my code from last week and identify the issue. I examined the Mido messages from one of the example MIDI files in BYVoid’s repository [1] against the sheet music it generated, and discovered I had misunderstood the Mido “time” parameter; I thought it was an absolute time relative to the entire piece, but it is actually a delta time relative to the previous message (so a note starts at time 0 unless a rest precedes it). After fixing that error in my code, it appears that time and pitch are both correct.
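To illustrate the corrected understanding, here is a small, self-contained example of Mido’s delta-time semantics; 480 ticks per beat and the particular pitches are assumptions for the example, not our actual settings.

    import mido

    track = mido.MidiTrack()
    # "time" is the delay in ticks since the previous message, not an absolute position.
    track.append(mido.Message('note_on', note=60, velocity=64, time=0))     # starts immediately
    track.append(mido.Message('note_off', note=60, velocity=64, time=480))  # held for one beat
    # A rest is expressed as a delay on the next message, not as its own event:
    track.append(mido.Message('note_on', note=62, velocity=64, time=480))   # one-beat rest first
    track.append(mido.Message('note_off', note=62, velocity=64, time=480))

    mid = mido.MidiFile(ticks_per_beat=480)
    mid.tracks.append(track)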

I also added the functionality to create a new MIDI file from within the UI [4], which will allow the user to create multiple compositions without opening and closing the application. Additionally, I coded a function that allows the user to open an existing MIDI file from the UI, using a Tkinter module, filedialog [5].
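A minimal sketch of those two handlers is below, assuming the composition is held in memory as a Mido MidiFile; the function names and tick resolution are placeholders.

    import mido
    from tkinter import filedialog

    def new_midi_file():
        # Start a fresh, empty composition: one track, default tempo.
        mid = mido.MidiFile(ticks_per_beat=480)
        mid.tracks.append(mido.MidiTrack())
        return mid

    def open_midi_file():
        # Ask the user for an existing MIDI file and load it with Mido.
        path = filedialog.askopenfilename(filetypes=[("MIDI files", "*.mid")])
        return mido.MidiFile(path) if path else None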

Finally, I added the code to move the cursor backwards and forwards in the MIDI file, inserting and deleting notes at those locations. This marks the completion of the MIDI file responses task on the Gantt chart.
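For reference, the cursor logic amounts to maintaining an index into the note list. The sketch below shows the idea with an assumed (pitch, length) representation; the actual data structure in my code may differ.

    class Composition:
        # Assumed representation: a list of (pitch, length_in_ticks) tuples plus a cursor index.
        def __init__(self):
            self.notes = []
            self.cursor = 0

        def move_left(self):
            self.cursor = max(0, self.cursor - 1)

        def move_right(self):
            self.cursor = min(len(self.notes), self.cursor + 1)

        def insert_note(self, pitch, length):
            self.notes.insert(self.cursor, (pitch, length))
            self.cursor += 1

        def delete_note(self):
            if self.cursor < len(self.notes):
                self.notes.pop(self.cursor)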

Frontend

Next, I started working on finalizing the frontend code for the project. Previously, I had been using a UI for testing responses with buttons, but we will also need a final UI for the eye-tracking functionality, so I started writing that; see below [6][7].

Among other changes, I added progress bars, a widget offered by Tkinter’s themed widget (ttk) module [8], above each command, to make it easier to add the functionality that shows the user how long they have to look at a command.
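The layout is roughly as follows; the command names are placeholders, and the dwell-time logic that fills the bars is not implemented yet.

    import tkinter as tk
    from tkinter import ttk

    root = tk.Tk()
    commands = ["Note up", "Note down", "Insert note", "Delete note"]  # placeholder names

    for name in commands:
        bar = ttk.Progressbar(root, maximum=100, length=120)  # will fill while the user dwells
        bar.pack()
        tk.Button(root, text=name).pack(pady=(0, 10))

    root.mainloop()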

Right now the UI is pretty simple; I will ask my group whether they think we should incorporate any colors or other design facets into it. I also would like to test the UI on other machines to ensure that the scaling is not off on different screens.

UI Responses

In the UI, I set up some preliminary functionality for UI responses, like a variable string input to the message box, and each of the progress bars. I did not make other progress on the UI responses.
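The variable string input is just a Tkinter StringVar bound to a label; a tiny sketch of the idea is below, with placeholder names and messages.

    import tkinter as tk

    root = tk.Tk()
    status_text = tk.StringVar(value="Ready")
    tk.Label(root, textvariable=status_text).pack()  # the message box
    status_text.set("Note added at cursor")          # later calls update the displayed message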

Next Week

I am still a little behind in the Gantt chart, but making good progress. From the previous tasks, I still need to complete:

  • the integration between the MIDI to sheet music code and our system, such that the current cursor location is marked in the sheet music. This will require me to update the MIDI to sheet music code [1].
  • the identification of the coordinate range of the UI commands, which can be done now that I have a final idea of the UI.
  • the coordinates to commands mapping program, and loading bars. I’ve outlined some basic code already, but I will need to start integrating with the eye-tracking first to make sure I’ve got the right idea of it.
  • error messages and alerts to the user.

In the Gantt chart, my task next week is to integrate the primary and secondary UI, so I will work on that.

Another thing I want to do next week which is not on the Gantt chart is organize my current code for readability and style, and upload it to a GitHub repository, since there are more and longer files now.

References

[1] BYVoid. (2013, May 9). MidiToSheetMusic. GitHub. https://github.com/BYVoid/MidiToSheetMusic

[2] Vaidyanathan, M. Convert MIDI Files to Sheet Music. Midi Sheet Music. (n.d.). http://midisheetmusic.com/

[3] Download. Mono. (2024). https://www.mono-project.com/download/stable/

[4] os – Miscellaneous operating system interfaces. Python documentation. (n.d.). https://docs.python.org/3/library/os.html

[5] Tkinter dialogs. Python documentation. (n.d.). https://docs.python.org/3.13/library/dialog.html

[6] Shipman, J.W. (2013, Dec 31). Tkinter 8.5 reference: a GUI for Python. TkDocs. https://tkdocs.com/shipman/tkinter.pdf

[7] Graphical User Interfaces with Tk. Python documentation. (n.d.). https://docs.python.org/3.13/library/tk.html

[8] tkinter.ttk – Tk themed widgets. Python documentation. (n.d.). https://docs.python.org/3.13/library/tkinter.ttk.html

[9] Overview. Mido – MIDI Objects for Python. (n.d.). https://mido.readthedocs.io/en/stable/index.html

Team Status Report for 10/26/2024

This week was mostly spent on individual work. Fiona worked on the UI responses that update the MIDI file in the system, located MIDI-to-sheet-music conversion software, and mapped coordinates to responses in the code. Shravya worked on the MIDI-to-firmware conversion code, and devised small-scale testing plans to ensure functionality of components that will arrive this week. Peter worked on the 3D model for the box which will hold the components of our product.

More specifics are in the individual progress reports.

Our current major risk is that we are behind schedule, but we allocated slack time at the beginning of the semester for this scenario.

Next week the group will work on the ethics assignment together by meeting to discuss our responses. 

Peter’s Status Report for 10/26/24

This Week

This week was mainly spent coming up with measurements for and modeling the 3D case for our product, see Figure 1. Part of this involved modeling the Adafruit 412 solenoids in SolidWorks, see Figure 2, since models could not be found online. The solenoid’s length of travel was initially a concern, since it is only 4mm while a keyboard’s keys have a travel length of 10mm. However, by positioning the solenoids on the keyboard such that the keys are always depressed by 5mm, the solenoids only need their 4mm of travel to push the keys past the actuation point. In short, the ability of the Adafruit 412 solenoids to depress the piano keys far enough to activate them is not a concern.

 

Figure 1: Solenoid case

 

Figure 2: Adafruit 412 3D model

 

Next Week

Next week, if the rest of the materials arrive, I will test our solenoid circuitry with Shravya. Otherwise, I will be solely focused on working on implementing the eye tracking software.

Fiona’s Status Report for 10/26/2024

This Week

This week, I worked on the assigned reading and write-up for the ethics assignment.

MIDI File Responses

I also started working on the backend for the MIDI file saving and editing. I first tested with the Mido Python library [1], which has functions for MIDI file editing. Using the frontend framework I made with Tkinter a few weeks ago, I was able to write a sequence of note pitches and lengths into a MIDI file by button press. In order to do so, I found a MIDI file spec that details the way the files are formatted and how to designate pitch and tempo [2].
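As a minimal sketch of the kind of Mido usage involved (the tick resolution, velocity, and tempo values here are placeholder assumptions, not our exact settings):

    import mido

    def save_notes_to_midi(notes, path="output.mid", ticks_per_beat=480):
        # notes is a list of (pitch, length_in_ticks) pairs.
        mid = mido.MidiFile(ticks_per_beat=ticks_per_beat)
        track = mido.MidiTrack()
        mid.tracks.append(track)
        track.append(mido.MetaMessage('set_tempo', tempo=mido.bpm2tempo(120), time=0))
        for pitch, length in notes:
            track.append(mido.Message('note_on', note=pitch, velocity=64, time=0))
            track.append(mido.Message('note_off', note=pitch, velocity=64, time=length))
        mid.save(path)

    save_notes_to_midi([(60, 480), (64, 480), (67, 960)])  # C, E, G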

One issue I am struggling with right now for the MIDI file saving backend is determining the best way to allow the user to edit the MIDI file name and location, while still keeping our UI simple and easy to use. I will continue to work on that problem.

Secondary UI

This week, I also identified some code we may be able to use in order to convert the MIDI file data into sheet music [3]. This code is a simplified version of another project [4], but I might try to simplify it further for our project.

User Interface

I made some minor edits to the user interface framework this week. For the time being, I am continuing to code the UI using the Python library Tkinter, but if I run into trouble later on, I plan to try using Django.

Mapping Coordinates to the UI

The last thing I started working on this week was the code that maps eye-coordinates to command responses on the UI. Previously, I had been testing UI responses with a button press, since we have not finished implementing eye tracking yet, but once we do, I want to have some code finished to connect the eye-tracking to the UI responses.
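The core of that mapping is a lookup from gaze coordinates to rectangular command regions, sketched below; the region boundaries and command names are hypothetical until the final UI layout is fixed.

    # Hypothetical regions: command name -> (x_min, y_min, x_max, y_max) in screen pixels.
    COMMAND_REGIONS = {
        "note_up":   (0,   0,   640,  360),
        "note_down": (640, 0,   1280, 360),
        "insert":    (0,   360, 640,  720),
        "delete":    (640, 360, 1280, 720),
    }

    def command_at(x, y):
        # Return the command whose region contains the gaze point, or None.
        for name, (x0, y0, x1, y1) in COMMAND_REGIONS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                return name
        return None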

Next Week

The MIDI file saving is not complete: I still need to add file saving/opening functionality, and the functionality to move the cursor backwards and forwards in the piece.

Next week, I will also begin working on the UI responses (e.g., something to indicate that the user has pressed a button, error messages, etc.), which means I will need to do more work to finalize the UI.

I also plan to begin integrating the MIDI to sheet music conversion code with ours next week and continue working on mapping coordinates to responses.

I am still behind schedule this week, but I have made considerable progress in the areas that I wanted to.

References

[1] Overview. Mido – MIDI Objects for Python. https://mido.readthedocs.io/en/stable/index.html

[2] Standard MIDI-File Format Spec. 1.1, updated. McGill. http://www.music.mcgill.ca/~ich/classes/mumt306/StandardMIDIfileformat.html

[3] BYVoid. (2013, May 9). MidiToSheetMusic. GitHub. https://github.com/BYVoid/MidiToSheetMusic

[4] Vaidyanathan, M. Convert MIDI Files to Sheet Music. Midi Sheet Music. http://midisheetmusic.com/

 

Shravya’s Status Report for 10/26/2024

This week I worked on my ethics assignment, spent some time with my teammates processing the feedback we received from staff on our design report, worked on my MIDI-to-firmware conversion code, and devised some small-scale testing plans to ensure the components we ordered will work correctly.

Initial components testing

I am waiting for some more components (for the solenoid control circuitry) to arrive this week. Once they do, I will probably spend an evening running tests such as the following:

 

Test the Solenoid

  • Activation Test: Connect the solenoid directly to the power supply and confirm it activates within the specified current range.
  • Power Consumption: Measure the actual current draw and compare it with specifications.
  • Thermal Testing: Run the solenoid for an extended period and monitor for any significant heating to ensure it can sustain operation without overheating.
  • Reaction Time: Measure any delays in the solenoid’s reaction time.

 

Test flyback diode reliability

  • With the diode installed, use an oscilloscope to measure voltage at the MOSFET drain (or across the solenoid) when the circuit is deactivated. Take note of the voltage spike when switching off power to the solenoid and note how effectively the diode suppresses back EMF.
  • One piece of feedback I received on the design report, for a section I wrote, was that the energy-dissipation equation I included wasn’t really necessary; instead, I should have included a Vspike equation to quantify the back-voltage. I agree with this feedback, and when I run physical tests I will take note of Vspike and how it may affect other quantities; I will include this in the final report (see the note after this list).
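Note: the first-order relations I plan to use here are not project-specific, just the usual textbook treatment. Without a clamp, interrupting the solenoid current forces a spike of roughly

    V_spike ≈ L * (di/dt)

across the solenoid’s inductance, while with the flyback diode in place the solenoid terminal (the MOSFET drain) is clamped to roughly V_supply + V_f, one diode forward drop above the supply rail.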

 

Test the MOSFET

  • Switching Test: Use a low-frequency signal from the function generator to control the MOSFET, checking that it switches the solenoid on and off without issues.
  • Thermal Stability: Measure temperature over a few switching cycles. Confirm that the MOSFET consistently stays within a safe temperature range.

 

My progress on MIDI to firmware conversion code

I have gotten a grasp on how to parse MIDI with Python’s Mido library to extract note events. I am also able to generate C-compatible data, outputting an array of structs in a C header file. I will keep fine-tuning this as necessary, and this week I will also work on writing the final “integration” code, meaning using the C array to send commands from the STM32 to actuate the solenoids. I have attached a screenshot of some of my code.
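For illustration, here is a small sketch of the shape this takes; the struct fields, header name, and millisecond timing units are assumptions, and the real firmware interface may differ.

    import mido

    def midi_to_c_header(midi_path, header_path="note_events.h"):
        # Parse a MIDI file and emit a C array of {note, on, time_ms} structs.
        mid = mido.MidiFile(midi_path)
        events = []
        t = 0.0
        for msg in mid:  # iterating a MidiFile yields messages with delta times in seconds
            t += msg.time
            if msg.type in ("note_on", "note_off"):
                on = 1 if (msg.type == "note_on" and msg.velocity > 0) else 0
                events.append((msg.note, on, int(t * 1000)))

        with open(header_path, "w") as f:
            f.write("typedef struct { unsigned char note; unsigned char on; unsigned int time_ms; } NoteEvent;\n")
            f.write(f"const NoteEvent note_events[{len(events)}] = {{\n")
            for note, on, ms in events:
                f.write(f"    {{ {note}, {on}, {ms} }},\n")
            f.write("};\n")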

I am on schedule. I hope to have the bulk of my components testing finished this week, as well as a bare-bones, functioning MIDI-to-firmware conversion program (I want it to work, save for edge cases, minor bugs, etc.). I will also be working more on the ethics assignment this week after the pod discussion in lecture.

Peter’s Status Report for 10/20/2024

This Week

This week was spent working on the design report. Beyond the design report, I have begun working on the 3D printed solenoid case.

Fall Break

I started doing further research into how to implement eye-tracking. I was planning on completing more over Fall Break, but I was occupied with other work.

Next Week

Currently my progress is behind schedule. To catch up, I plan on focusing on eye-tracking implementation next week and, if the solenoids have arrived, testing the solenoids to ensure they work with our design. If they do, we will order enough for the full octave. I hope to have a rough eye-tracking implementation completed by the end of next week. I will pause work on the 3D printed solenoid case until after the solenoids have been confirmed to work for our design.

Team Status Report for 10/20/2024

As a team, we spent most of Week 7 working on our design report. This was a time-intensive process, as we had to do a lot of research, discuss the facets of our design, testing, and purpose within our group, and work out how to convey all of that information efficiently and understandably in our report.

Because of this, our main risk right now is that we have fallen behind schedule. We had planned to have made further progress with the eye-tracking software and application at this point in the semester. However, we did allot weeks 12-14 and finals week for slack time, so we are hopeful that we have ample time to catch up and complete the tasks we have set for ourselves.

While working on the design report, we updated our Gantt chart, see below.

Global Considerations, Written by Shravya

Our solution goes beyond local or technological communities. Globally, there are millions of people living with disabilities (particularly, disabilities relating to hand/arm mobility) who may not have access to specialized instruments or music creation tools tailored to their specific needs. These people exist across various socio-economic and geographical (urban vs remote) contexts. Our solution offers a low-cost, technologically accessible means of composing and playing music, making it a viable option not just in academic or well-funded environments, but also in regions with limited access to specialized tools. By providing compatibility with widely used MIDI files, minimal physical set-up, and an eye-tracking interface we aim to make as user-intuitive as possible, users around the world will be able to express themselves musically without extensive training or high-end technology. 

Cultural Considerations, Written by Fiona

Music is an important part of many cultural traditions. It is a tool for communication, and allows people to share stories and art across generations and between cultures. For example, many countries have national anthems that can be used to communicate what is important to that country. Broader access to music is thus important to many people, because it would allow them to participate in their culture or others, if they wished. Recognizing the importance of music for many individuals and groups, we hope that our project can be a stepping stone for more accessibility to musical cultural traditions.

Environmental Considerations, Written by Peter

By utilizing laptop hardware that users are likely to already own, we reduce the amount of electronic waste, which can be toxic and non-biodegradable [1], that our product could otherwise create. Along with this, by working to minimize our product’s power consumption, we are minimizing its contribution to the pollution that results from non-renewable energy sources.

References

[1] Geneva Environment Network. (2024, October 9). The Growing Environmental Risks of E-Waste. Geneva Environment Network. https://www.genevaenvironmentnetwork.org/resources/updates/the-growing-environmental-risks-of-e-waste/ 

Shravya’s Status Report for 10/20/2024

Week before fall break

All of my time this week was focused on completing the design report. We had meetings on four of the days, and there was plenty of individual work on top of that.

Fall break

Over fall break, I was extremely occupied with a project and homework for my 18421 class.

Next week

I am waiting on a preliminary order of solenoid circuit components, and as soon as they arrive, I expect to spend a few hours testing them with a power supply and oscilloscope. Testing them physically will prove more fruitful than Cadence simulations, I’ve come to realise: Cadence is too ideal, and it takes time to introduce parasitics that make components behave more realistically. It’s simply not a good use of my time when I could be working with the real components, and Joshna and Professor Bain agreed with me on this in our most recent meeting.

This week I will also start writing code for MIDI-to-firmware conversion using the Python Mido library.

I have indeed fallen slightly behind schedule but I am confident I can get back up to speed this week. Besides, we had allocated about 2 weeks of slack time at the end of the semester in our Gantt chart, some of which will help me right now.

Fiona’s Status Report for 10/20/2024

This Week

All of my time this week was spent working on the design report. I researched eye-tracking in general and its use for musical composition. I was able to find quite a few other projects with goals and/or functionalities similar to ours that we can use to inform our project and its goals [1], such as EyeHarp [2], E-Scape [3][4], and an academic project called EyeMusic [5]. I described all of these, and their relation to our project, in more depth in the design report.

Other than the research and work I did for the introduction and related works, I also worked on several other sections of the report, including the use case and design requirements, (software) system implementation, testing, risk mitigation, and summary sections.

Fall Break

I planned to catch up on some of my previous tasks during Fall break. Specifically, I wanted to work on the frontend and determine how to map eye-coordinates to commands. However, I was unable to due to other commitments.

Next Week

Since I have fallen behind on our Gantt chart schedule, my main goal this week will be catching up on my work. To catch up, I will continue working on the frontend, which will include deciding whether to continue coding with Tkinter or to switch to Django, as was suggested to us by one of our advisors. I will also begin work on mapping eye coordinates to commands.

There are also a couple of other tasks I need to start working on this week: creating responses to commands that would update the MIDI file, and integrating an existing MIDI-to-sheet-music program with our system. Further, the ethics assignment is due this week, so I will have to work on that.

This is a fair amount of work to do in one week, so I don’t expect to finish all of these tasks completely, but I will start on all of them so I have an idea of how long they will take overall, and perhaps adjust the Gantt chart from there.

References

[1] O’Keeffe, K. (2020, Feb 20). Eyegaze for Musical Expression. Assistive Technology and Me. https://www.atandme.com/eyegaze-for-musical-expression/

[2] FAQs. EyeHarp. https://eyeharp.org/faq/

[3] Anderson, T. Eyegaze music. Inclusive Music. https://www.inclusivemusic.org.uk/using-e-scape-with-eyegaze/

[4] Drake Music. (2014, Nov 24). EyeGaze composing with E-Scape. Vimeo. https://vimeo.com/112689731

[5] Hornof, A. J., & Sato, L. (2004). EyeMusic: Making Music with the Eyes. Proceedings of the International Conference on New Interfaces for Musical Expression, 185–188. https://doi.org/10.5281/zenodo.1176613

Peter’s Status Report for 10/05/2024

This Week

The majority of this week was spent preparing for the design review presentation. As part of this preparation, Shravya and I went over our designs for the solenoid control circuit and concluded that her design using an NMOS for low-side switching would be best. If we used a PMOS, the design could be shifted to work as a high-side switch, but we decided to go with the circuitry more closely resembling the low-side switch presented in the Adafruit 412 design document [1].

Shravya and I also reviewed our designs for the power regulation. We decided to use a 12V DC power supply adaptor that plugs into a wall outlet to be the main source of power to our solenoids and the Nucleo32 board. We will also use a 3.3V power regulator to decrease the voltage from 12V to 3.3V for an input to the Nucleo32 board.

Currently, my progress is behind schedule. The eye-tracking implementation that identifies which major section of the screen a user is looking at (imprecise eye-tracking) has not been developed yet. Additionally, the design for the 3D printed solenoid case, which should have been completed to make up for the solenoids not being testable this week, has not been made yet either. To make up for this, the design for the 3D printed solenoid case will be completed on Sunday (October 6th, 2024), and the imprecise eye-tracking implementation will be worked on this upcoming week. If it is not completed this upcoming week, then I will continue to work on it over Fall Break (October 13th, 2024 to October 19th, 2024).

 

Next Week

This upcoming week, I will complete the first design for the 3D printed solenoid case and will begin work on implementing the eye-tracking software. My main goal with the eye-tracking software next week will be to identify major sections of the screen, such as breaking the screen up into four equally sized quadrants and determining which one the user is looking at. Then, in later weeks, I will make the eye-tracking precise enough to identify the user looking at the different commands displayed on the screen. Additionally, if the Adafruit 412 solenoids arrive, I will begin testing them with the team to ensure our design works and that the solenoids will be functional for pressing the keys of a piano.
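As a starting point, the quadrant check itself is simple; a hypothetical helper for that first, coarse milestone might look like the following (written in Python here to match the rest of our software, though that choice is still open).

    def quadrant(x, y, screen_w, screen_h):
        # Return which quarter of the screen a gaze point falls in:
        # 0 = top-left, 1 = top-right, 2 = bottom-left, 3 = bottom-right.
        col = 0 if x < screen_w / 2 else 1
        row = 0 if y < screen_h / 2 else 1
        return row * 2 + col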

Next week, the Design Report will also be completed.

 

[1] “412_Web.” Adafruit Industries, 12 Oct. 2018. https://mm.digikey.com/Volume0/opasdata/d220001/medias/docus/21/412_Web.pdf