Team Status Report for 10/20/2024

As a team, we spent most of Week 7 working on our design report. This was a time-intensive process: we had to do a lot of research, discuss the facets of our design, testing, and purpose within our group, and determine how to convey all of that information efficiently and understandably in the report.

Because of this, our main risk right now is that we have fallen behind schedule. We had planned to have made further progress with the eye-tracking software and application at this point in the semester. However, we did allot weeks 12-14 and finals week for slack time, so we are hopeful that we have ample time to catch up and complete the tasks we have set for ourselves.

While working on the design report, we updated our Gantt chart; see below.

Global Considerations, Written by Shravya

Our solution goes beyond local or technological communities. Globally, there are millions of people living with disabilities (particularly disabilities relating to hand/arm mobility) who may not have access to specialized instruments or music creation tools tailored to their specific needs. These people exist across various socio-economic and geographical (urban vs. remote) contexts. Our solution offers a low-cost, technologically accessible means of composing and playing music, making it a viable option not just in academic or well-funded environments, but also in regions with limited access to specialized tools. By providing compatibility with widely used MIDI files, minimal physical set-up, and an eye-tracking interface that we aim to make as intuitive as possible, we hope to enable users around the world to express themselves musically without extensive training or high-end technology.

Cultural Considerations, Written by Fiona

Music is an important part of many cultural traditions. It is a tool for communication, and allows people to share stories and art across generations and between cultures. For example, many countries have national anthems that can be used to communicate what is important to that country. Broader access to music is thus important to many people, because it would allow them to participate in their culture or others, if they wished. Recognizing the importance of music for many individuals and groups, we hope that our project can be a stepping stone for more accessibility to musical cultural traditions.

Environmental Considerations, Written by Peter

By utilizing laptop hardware that users are likely to already own, we are able to reduce the amount of electronic waste, which can be toxic and non-biodegradable [1], that our product could create. Along with this, by working to minimize our product’s power consumption, we are minimizing our product’s contribution to pollution from non-renewable energy sources.

References

[1] Geneva Environment Network. (2024, October 9). The Growing Environmental Risks of E-Waste. Geneva Environment Network. https://www.genevaenvironmentnetwork.org/resources/updates/the-growing-environmental-risks-of-e-waste/ 

Fiona’s Status Report for 10/20/2024

This Week

All of my time this week was spent working on the design report. I researched eye-tracking in general and its use for musical composition. I was able to find quite a few other projects with goals and/or functionalities similar to ours that we can use to inform our project and its goals [1], such as EyeHarp [2], E-Scape [3][4], and an academic project called EyeMusic [5]. I described all of these, and their relation to our project, in more depth in the design report.

Other than the research and work I did for the introduction and related works, I also worked on several other sections of the report, including the use case and design requirements, (software) system implementation, testing, risk mitigation, and summary sections.

Fall Break

I planned to catch up on some of my previous tasks during Fall break. Specifically, I wanted to work on the frontend and determine how to map eye-coordinates to commands. However, I was unable to due to other commitments.

Next Week

Since I have fallen behind our Gantt chart schedule, my main goal this week will be catching up on my work. To catch up, I will continue working on the frontend, which will include deciding whether to continue coding with Tkinter or to switch to Django, as one of our advisors suggested. I will also begin work on mapping eye coordinates to commands.
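As a starting point for the coordinate-to-command mapping, the core of the task amounts to hit-testing the gaze point against screen regions. Below is a minimal pure-Python sketch of that idea; the region names and bounds are placeholders, not our final layout:

```python
# Hypothetical sketch: map a gaze coordinate to a named command by
# hit-testing against rectangular screen regions. Names and bounds
# are placeholders for illustration only.

# Each region: (name, x_min, y_min, x_max, y_max)
COMMAND_REGIONS = [
    ("add_note",    0,   0, 200, 100),
    ("delete_note", 200, 0, 400, 100),
    ("play",        400, 0, 600, 100),
]

def command_for_gaze(x, y):
    """Return the command whose region contains (x, y), or None."""
    for name, x0, y0, x1, y1 in COMMAND_REGIONS:
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None
```

With this structure, tuning the layout later (e.g., enlarging buttons so the eye tracker can resolve them) only means editing the region table, not the mapping logic.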

There are also a couple of other tasks I need to start working on this week: creating responses to commands that would update the MIDI file, and integrating an existing MIDI-to-sheet-music program with our system. Further, the ethics assignment is due this week, so I will have to work on that.

This is a fair amount of work to do in one week, so I don’t expect to finish all of these tasks completely, but I will start on all of them so I have an idea of how long they will take overall, and perhaps adjust the Gantt chart from there.

References

[1] O’Keeffe, K. (2020, Feb 20). Eyegaze for Musical Expression. Assistive Technology and Me. https://www.atandme.com/eyegaze-for-musical-expression/

[2] FAQs. EyeHarp. https://eyeharp.org/faq/

[3] Anderson, T. Eyegaze music. Inclusive Music. https://www.inclusivemusic.org.uk/using-e-scape-with-eyegaze/

[4] Drake Music. (2014, Nov 24). EyeGaze composing with E-Scape. Vimeo. https://vimeo.com/112689731

[5] Hornof, A. J., & Sato, L. (2004). EyeMusic: Making Music with the Eyes. Proceedings of the International Conference on New Interfaces for Musical Expression, 185–188. https://doi.org/10.5281/zenodo.1176613

Fiona’s Status Report for 10/05/2024

This Week

Design Presentation

This week, I helped my group members with our presentation slides and script for the design review. I also helped Peter prepare to present our design review to the class.

User Interface [1][2]

I also started working on programming our user interface in Python. To do so, I had to do some research on Python’s built-in GUI library, Tkinter, which I hadn’t used before. Fortunately, I was able to find two very helpful resources/documentation that really helped familiarize me with the way the library works [1][2].

I was able to create a preliminary UI; however, I had some trouble figuring out how to adjust the physical sizes of facets of the UI. This is important to our design, given that the buttons cannot be so small that the eye-tracking software cannot determine when a user is looking at them.

I decided to put that issue on the back burner for the time being and programmed each of the eye-commands as a button, which can take a function input in Tkinter. This will make it easier to write and test the command-response backend code in the future, without having the full eye-tracking program written: with the button functionality, we will be able to test that the program is writing to the MIDI file and performing other commands by clicking on the screen, rather than first needing to integrate with the eye-tracking software.
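To illustrate the idea, here is a simplified sketch (placeholder command names, not our actual UI code): each eye-command becomes a Tkinter Button whose callback records the command, so the backend can be driven by mouse clicks for now.

```python
# Simplified sketch of the click-to-test idea. Command names are
# placeholders; the real UI will have our actual eye-commands.
import tkinter as tk

pending_commands = []  # the backend would consume these

def on_command(name):
    """Record a command; later this will trigger e.g. a MIDI-file update."""
    pending_commands.append(name)

def build_ui(root):
    """Create one button per eye-command, wired to the shared callback."""
    for name in ("add_note", "delete_note", "play"):
        # Default-argument lambda captures each name correctly in the loop.
        tk.Button(root, text=name,
                  command=lambda n=name: on_command(n)).pack()
```

Hooking this into a window is just `build_ui(tk.Tk())` followed by `mainloop()`; once eye tracking is integrated, the same `on_command` can be called from the gaze code instead of the buttons.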

Next Week

I am behind where I wanted to be at the end of this week, because of the trouble I had with the UI, but I hope to get caught up on that work next week.

I expect that most of my time next week will be spent on the design report. Specifically, I plan to do more research on what academics have experimented with for eye-tracking programs and hopefully use those to make informed decisions on how to write our software, all of which I will document in the design report.

Also, I would like to complete my Gantt chart task for next week, which is to map coordinates to commands in the code.

References

[1] Shipman, J.W. (2013, Dec 31). Tkinter 8.5 reference: a GUI for Python. tkdocs. https://tkdocs.com/shipman/tkinter.pdf

[2] Graphical User Interfaces with Tk. Python documentation. https://docs.python.org/3.9/library/tk.html

Fiona’s Status Report for 09/28/2024

This Week

Eye Tracking [1][2]

I was able to set up live-video eye tracking locally on my computer with the help of the research I did last week [1]. I did so by copying over the code [2] and ensuring each step worked with my personal computer/set-up.

I devoted some extra time to this process to make sure I understood what the code was doing, because we expect to have to modify some of it to integrate it with our user interface. There was also some debugging necessary to get the program to perform as expected, and I needed to modify the code in some places for testing.

I did run into some trouble setting up the threshold (a value to manage differences in lighting) “track bar” that was in the original project, possibly because the cv2 library has changed this functionality since the article and code were published in 2019. During testing, I changed the threshold value manually in the code, but I believe I should be able to set up a track bar (or some other type of user input) while implementing the user interface.
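To clarify what the threshold value controls: pixels darker than the threshold are marked as foreground, which is how the dark pupil is isolated from the rest of the eye region. The original project does this with OpenCV; the sketch below uses plain Python lists purely to illustrate the idea:

```python
# Illustration of what the threshold value controls. The real code uses
# OpenCV; plain lists keep this sketch dependency-free.

def binarize(gray, threshold):
    """Return a binary image: 1 where the pixel is darker than threshold."""
    return [[1 if px < threshold else 0 for px in row] for row in gray]

# A toy 3x3 "eye region": the dark center pixel stands in for the pupil.
frame = [[200, 190, 205],
         [198,  30, 201],
         [210, 195, 199]]

binarize(frame, 100)
```

This is why the track bar matters: a threshold tuned for one room's lighting can miss the pupil entirely in another, so the value needs to be adjustable at runtime.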

There were a few differences I noticed while setting up the code. For example, the author of the article suggested that we cut eyebrows out from the eye block detected by the computer, but when I was testing with my face, the eye-detector did not include my eyebrows in the detection area. I wonder if different face structures produce different results in this area, so I would like to test the design on more people in the coming weeks.

Design Review

In preparation for the design review, I made a block diagram for the software side of the project, because I wanted to make sure no facets had been missed in our planning that we would later need to implement in a rush. There are a lot of moving parts on the software side of the system.

After making this diagram, I realized I needed to reorder and redesign my tasks in the Gantt chart to better fit the dependencies and difficulty of each task. Here are my new tasks for the next five weeks:

I was fairly busy this week, so I didn’t get as much coding done as I wanted to. However, I think that refining the software layout was a helpful first step towards the coding process.

Next Week

Based on my new schedule, I want to start working on the frontend next week. I have already done some preliminary research on GUIs in Python this week, but haven’t decided which library (of the many available) will best fit our needs, so that will be part of the process.

One of the most important parts of this step is to identify the coordinate ranges for each command to set myself up for success in the week afterwards, where I want to make the code convert coordinates to commands.

Another thing happening next week is the design presentations. After we get our feedback, we will review it and determine what changes we want to make while writing the Design Report, which is due at the end of Week 7.

References

[1] Filonov, S. (2019, Mar 22). Tracking your eyes with Python. Medium. https://medium.com/@stepanfilonov/tracking-your-eyes-with-python-3952e66194a6

[2] Filonov, S. (2019, Mar 22). Eye-Tracker. GitHub. https://github.com/stepacool/Eye-Tracker/blob/No_GUI/track.py

Fiona’s Status Report for 09/21/2024

This Week

Most of my time this week was spent on the proposal presentation slides and preparing to present them to the class. 

In preparation, I conversed with my group members to ensure that I was able to fully represent our project to the class, and make sure we were on the same page for every section of the proposal. Primarily, I wanted to make sure that we had a full plan for carrying out testing on the system because I wanted to be able to explain it well to the class and we hadn’t explored that area as much in our previous discussions. 

I also helped refine the script we started working on last week to ensure it contained all the information we wanted and memorized it. I practiced the full script several times, first with Shravya and Peter, and then with my roommate, as well as multiple times on my own. 

After presenting, I was able to allot some time to researching eye-tracking with OpenCV. I went back to a resource [1] we identified while preparing our abstract and distilled it into actionable steps for us to pursue while implementing eye-tracking. According to the resource, we should cut the process into discrete steps: first identify faces, then eyes, then pupils, and implement the identification on still images before video [1]. I think this is a promising approach that we should start testing immediately.
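The staged structure the resource recommends could be organized roughly like this (a hypothetical outline; the detect_* arguments are placeholders for the OpenCV detectors we still need to build):

```python
# Hypothetical outline of the staged pipeline: detect faces, then eyes
# within each face, then the pupil within each eye. The detect_*
# callables are placeholders, not implemented detectors.

def track_pupils(image, detect_faces, detect_eyes, detect_pupil):
    """Run the three detection stages and collect pupil positions."""
    pupils = []
    for face in detect_faces(image):          # stage 1: faces
        for eye in detect_eyes(face):         # stage 2: eyes in each face
            pupil = detect_pupil(eye)         # stage 3: pupil in each eye
            if pupil is not None:
                pupils.append(pupil)
    return pupils
```

Keeping the stages as separate callables should also make it easy to test each one on still images first, as the resource suggests, before moving to live video.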

I am behind the schedule we set in the proposal, because I was not able to find the time to work on the UI backend this week. 

Next Week

Next week, our group will be working on the design presentation; we will expand on our proposal with the research we’ve completed and incorporate the feedback we received on the proposal, as we see necessary. I will assist my group members in this.

Since I have fallen behind, I will start working on the UI backend next week, along with the code to convert eye-commands to MIDI file format. After I start working on this software, I believe I will have a better idea of whether I need to re-evaluate my tasks on the Gantt chart. If necessary, I will re-define my tasks for implementing the UI in the chart before we present the design review. 

References

[1] Filonov, S. (2019, Mar 22). Tracking your eyes with Python. Medium. https://medium.com/@stepanfilonov/tracking-your-eyes-with-python-3952e66194a6

Team Status Report for 09/21/2024

A significant risk we are taking on is the inclusion of eye-tracking in our product’s design. No one on the team has background experience using or creating eye-tracking software, but we believe there will be ample time to learn since we are starting early. Additionally, we expect to be able to find open-source resources to customize for our system.

Figure 1: Block Diagram

In preparation for our proposal presentation, we finalized our use case requirements, block diagram design (Fig 1), and testing plans.

One question we received from professors/TAs after our presentation was whether we plan to implement varying pressures on solenoid presses. We will keep this suggestion in mind, and may consider it for future implementation after ensuring all other features are fully operational. It could help the piano output sound more realistic and humanized. 

We made a minor edit to the schedule: Peter took over the task “Research solenoid and solenoid control” because he is doing the schematic and layout for solenoid control. 

Figure 2: Gantt Chart