Peter’s Status Report from 12/07/2024

This Week

This week was spent creating the mount that holds the solenoids in the case. The first iteration (not pictured) had walls at the base that were too short, so they were extended upward by 3mm. The mounting holes in the back were also too low, so they were moved up 2mm; when printed, the revised bracket matched the solenoid perfectly.

 Figure 1: Solenoid Bracket

 

The next step, which will be done Sunday, is to space 13 of these brackets so that they line up with the piano keys and attach them all in a single case.

 

Next Week

Next week will be spent finalizing the case, doing testing, and completing the deliverables for the Capstone Course.

Peter’s Status Report from 11/30/2024

Week Nov 17-23.

Using OpenCV to capture frames and passing them into a Mediapipe function that tracks the user’s irises, I was able to create a basic gaze-tracker that estimates where the user is looking on a computer screen based on the location of their irises. In Figure 1, the blue dot to the left of the head shows the estimated location of the user’s gaze on the screen. Currently, the gaze-tracking has two main configurations. Configuration 1 gives precise gaze tracking but requires head movement to make up for the small movements of the eyes. Configuration 2 requires only eye movement, but the head must be kept impractically still and the estimates are jittery. To improve Configuration 2, the movement of the head needs to be taken into consideration when calculating the user’s estimated gaze.

 

Figure 1: Mediapipe gaze-tracker
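For reference, below is a minimal sketch of the kind of capture-and-estimate loop described above. The webcam index, the 1920x1080 screen size, the iris landmark indices (468 and 473, which Mediapipe’s FaceMesh exposes when refine_landmarks is enabled), and the naive linear mapping used in place of real calibration are all assumptions for illustration, not the exact implementation.

    # Minimal gaze-estimation sketch: OpenCV frames -> Mediapipe FaceMesh -> iris
    # landmarks -> rough on-screen point. Assumes webcam index 0 and a 1920x1080
    # screen; the linear mapping stands in for a proper calibration step.
    import cv2
    import mediapipe as mp

    SCREEN_W, SCREEN_H = 1920, 1080  # assumed screen resolution

    face_mesh = mp.solutions.face_mesh.FaceMesh(
        max_num_faces=1,
        refine_landmarks=True,  # adds the iris landmarks (indices 468-477)
    )

    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            lm = results.multi_face_landmarks[0].landmark
            # 468 and 473 are the right/left iris centers in the refined mesh
            iris_x = (lm[468].x + lm[473].x) / 2
            iris_y = (lm[468].y + lm[473].y) / 2
            # Naive mapping from normalized iris position to screen coordinates
            gaze_x, gaze_y = int(iris_x * SCREEN_W), int(iris_y * SCREEN_H)
            # Draw the live-feedback dot on the video feed at the same relative spot
            h, w = frame.shape[:2]
            cv2.circle(frame, (int(iris_x * w), int(iris_y * h)), 5, (255, 0, 0), -1)
            print(f"estimated gaze: ({gaze_x}, {gaze_y})")
        cv2.imshow("gaze", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()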

 

Week Nov 24-30.

Tested the accuracy of Fiona’s implementation of the UI and eye-tracking integration. To test the accuracy of the current gaze-tracking implementation in Configuration 1 and Configuration 2, we looked at each command twice; if the intended command was not correctly identified, I kept trying to select it until the correct command was selected. Using this method, Configuration 1 had precise control and 100% accuracy. Configuration 2, while reaching 89.6% accuracy in testing, had a very jittery gaze estimate, making it difficult to feel confident about the cursor’s movements, and the user’s head has to be kept too still to be practical for widespread use. Ideally, the user should only need to move their eyes for their gaze to be tracked. As a result, the eye-tracking will be updated next week to take head movement into consideration, hopefully making the gaze estimate smoother.

 

Tools for learning new knowledge

Mediapipe and OpenCV are both new to me. To get comfortable with these libraries, I read the online Mediapipe documentation and followed several online video tutorials. Following these tutorials, I was able to discover applications of the Mediapipe library functions that were useful for my implementation.

 

This Week

This week, I hope to complete the eye-tracking, taking head movement into consideration to make what is currently being referred to as Configuration 2 a more viable solution. One possible approach is sketched below.
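One way to fold head movement into the estimate, sketched under my own assumptions, is to express the iris center relative to the eye corners and eyelids rather than relative to the camera frame, so a pure head translation moves both terms together and cancels out. The landmark indices (33/133 for the right-eye corners, 159/145 for the eyelids, 468 for the iris center) are the commonly cited Mediapipe face-mesh points and should be verified against the official mesh map.

    # Hypothetical helper (not the current implementation): iris position expressed
    # relative to the eye opening, so translating the whole head leaves x, y unchanged.
    def eye_relative_gaze(lm):
        outer, inner = lm[33], lm[133]    # right-eye corners (assumed indices)
        upper, lower = lm[159], lm[145]   # right-eye eyelid midpoints (assumed indices)
        iris = lm[468]                    # right iris center (needs refine_landmarks=True)
        x = (iris.x - outer.x) / (inner.x - outer.x)  # ~0 at outer corner, ~1 at inner
        y = (iris.y - upper.y) / (lower.y - upper.y)  # ~0 at upper lid, ~1 at lower lid
        return x, y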

Peter’s Status Report from 11/16/24

This week Fiona and I met with Professor Savvides’s staff member, Magesh, to discuss how we would develop the eye-tracking using computer vision. Magesh gave us the following implementation plan.

  • Start with an OpenCV video feed.
  • Send frames to the Mediapipe Python library.
  • Mediapipe returns landmarks.
  • From the landmarks, select the points that correspond to the eye region.
  • Determine whether the user is looking up, down, left, or right.
  • Draw a point on the video feed to show where the software thinks the user is looking, so there is live feedback.

Drawing a point on the video feed will serve to verify that the software is properly tracking the user’s irises and correctly mapping their gaze to the screen.

So far, I have succeeded in getting the OpenCV video feed to appear. I am currently debugging to get a face mesh to appear on the video feed, using Mediapipe, to verify that the software is tracking the irises properly. I am using Google’s Face Landmark Detection Guide for Python to help me implement this software [1]. Once I am able to verify this, I will move on to using the face landmarker to interpret the gaze of the user’s irises on the screen and return coordinates to draw a point at the expected gaze location.
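For context, a minimal face-mesh overlay along these lines could look roughly like the sketch below. It uses Mediapipe’s older solutions-style FaceMesh API rather than the task-based API from the guide, assumes a webcam at index 0, and only draws the mesh; there is no gaze estimation yet.

    # Minimal sketch: draw the Mediapipe face mesh on a live OpenCV feed to confirm
    # that the landmarks (including the irises, via refine_landmarks) are being tracked.
    import cv2
    import mediapipe as mp

    mp_face_mesh = mp.solutions.face_mesh
    mp_drawing = mp.solutions.drawing_utils

    cap = cv2.VideoCapture(0)
    with mp_face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True) as face_mesh:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_face_landmarks:
                for face_landmarks in results.multi_face_landmarks:
                    # Overlay the tessellated mesh so tracking quality is visible live
                    mp_drawing.draw_landmarks(
                        image=frame,
                        landmark_list=face_landmarks,
                        connections=mp_face_mesh.FACEMESH_TESSELATION,
                    )
            cv2.imshow("face mesh", frame)
            if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
                break
    cap.release()
    cv2.destroyAllWindows()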

 

Resources

[1] Google AI for Developers. (2024, November 4). Face Landmark Detection Guide for Python. Google. https://ai.google.dev/edge/mediapipe/solutions/vision/face_landmarker/python

Peter’s Status Report from 11/9/2024

This Week

This week was spent testing our solenoid circuit with the newly arrived parts. The circuit worked mostly as intended: the solenoid extends when the NMOS’s gate is driven to 3.3V and retracts when the gate is not powered. However, there is an unexpected delay, usually a few seconds, between the NMOS’s gate being powered off and the solenoid retracting. When power is instead cut at the power brick, there is no delay. Further testing will be done to determine what is causing this unexpected behavior.

Additionally, Professor Bain helped connect me with Professor Savvides and one of his staff members to help develop eye tracking using computer vision for our project.

 

Next Week

Next week Fiona and I will meet with Professor Savvides’s staff member, Magesh, to help us understand how to develop an eye tracking system with computer vision. Additionally, I will do further testing with Shravya to find out why the solenoid remains extended, as if powered, after the NMOS’s gate is unpowered.

Peter’s Status Report from 11/2/2024

This Week

This week was supposed to be the week that basic eye-tracking was implemented, but it has not yet been completed. Much of this week was spent adjusting the eye-tracking software to try to get it to properly detect eyes. However, the current method is not very reliable at detecting eyes in a normally lit room with overhead lighting; forward lighting might help, but it would be an odd addition given our desire for an easy setup with few parts. It may instead be beneficial to consider pivoting to an eye-tracking camera to support the eye-tracking system, which would allow our eye-tracking software to be more accurate.

 

Next Week

I will do more research on eye-tracking cameras and discuss this pivot with our faculty and TA advisors. If we are in agreement, I will pivot to using an eye-tracking camera to improve the accuracy of our system; currently, our system cannot consistently track both eyes.

Team Status Report for 11/02/2024

On Sunday, we met as a group and discussed our individual ethics assignments. We identified some ways that our project could cause harm, such as malfunctioning after the user has become dependent on it for income or another need, or demonstrating racial bias, which we know is a grave issue in other facial-recognition technologies. Then we discussed our project with our peers during class on Monday, and they pointed out further ethical considerations, such as the potential for the solenoids to damage the piano and eye strain from looking at the UI for long periods of time. This week, the biggest risks we have been considering are these ethical concerns.

We also worked on some tasks individually. Fiona continued to work on the MIDI backend, including integrating existing MIDI to sheet music code with our project. She also worked on the application frontend. Shravya worked to finish MIDI file parsing and set up a framework for testing this parsing feature on sample MIDI files, and began working on firmware (UART communication and actuating solenoids). Peter continues work on the eye tracking software and is considering using eye tracking cameras instead of computer vision to help increase the accuracy of the eye tracking program.

Team Status Report for 10/26/2024

This week was mostly spent on individual work. Fiona worked on the UI responses that update the MIDI file in the system, locating MIDI-to-sheet-music conversion software and mapping coordinates to responses in the code. Shravya worked on the MIDI-to-firmware conversion code and devised small-scale testing plans to verify the functionality of the components arriving this week. Peter worked on the 3D model for the box that will hold the components of our product.

More specifics are in the individual progress reports.

Our current major risk is that we are behind schedule, but we allocated slack time at the beginning of the semester for this scenario.

Next week the group will work on the ethics assignment together by meeting to discuss our responses. 

Peter’s Status Report for 10/26/24

This Week

This week was mainly spent coming up with measurements for and modeling the 3D case for our product, see Figure 1. Part of this involved modeling the Adafruit 412 solenoids in SolidWorks, see Figure 2, since models could not be found online. The length of travel of the solenoid was initially a concern, since it is only 4mm while a keyboard key has a travel length of 10mm. However, by positioning the solenoids on the keyboard such that the keys are always depressed by 5mm, the solenoid’s 4mm stroke pushes each key to 9mm of its 10mm travel, which is enough to activate the key. In short, the ability of the Adafruit 412 solenoids to depress the piano keys far enough to activate them is not a concern.

 

Figure 1: Solenoid case

 

Figure 2: Adafruit 412 3D model

 

Next Week

Next week, if the rest of the materials arrive, I will test our solenoid circuitry with Shravya. Otherwise, I will be solely focused on working on implementing the eye tracking software.

Peter’s Status Report for 10/20/2024

This Week

This week was spent working on the design report. Beyond the design report, I have begun working on the 3D printed solenoid case.

Fall Break

I started doing further research into how to implement eye-tracking. I was planning on completing more over Fall Break, but I was occupied with other work.

Next Week

Currently my progress is behind schedule. To catch up, I plan on focusing on eye-tracking implementation next week and, if the solenoids have arrived, testing the solenoids to ensure they work with our design. If they do, we will order enough for the full octave. I hope to have a rough eye-tracking implementation completed by the end of next week. I will pause work on the 3D printed solenoid case until after the solenoids have been confirmed to work for our design.

Peter’s Status Report for 10/05/2024

This Week

The majority of this week was spent preparing for the design review presentation. As part of this preparation, Shravya and I went over our designs for the solenoid control circuit and concluded that her design, which uses an NMOS for low-side switching, would be best. If we used a PMOS, the circuit could be rearranged to work as a high-side switch, but we decided to go with circuitry more closely resembling the low-side switch presented in the Adafruit 412 design document [1].

Shravya and I also reviewed our designs for power regulation. We decided to use a 12V DC power supply adaptor that plugs into a wall outlet as the main source of power for our solenoids and the Nucleo32 board. We will also use a 3.3V regulator to step the voltage down from 12V to 3.3V as an input to the Nucleo32 board.

Currently, my progress is behind schedule. The implementation of eye-tracking that identifies which major section of the screen a user is looking at (imprecise eye-tracking) has not been developed yet. Additionally, the design for the 3D printed solenoid case, which should have been completed to make up for the solenoids not being testable this week, has not been made yet either. To make up for this, the design for the 3D printed solenoid case will be completed on Sunday (October 6th, 2024), and the implementation of imprecise eye-tracking will be worked on this upcoming week. If the implementation of imprecise eye-tracking is not completed this upcoming week, then I will continue to work on it over Fall Break (October 13th to October 19th, 2024).

 

Next Week

This upcoming week, I will complete the first design for the 3D printed solenoid case and will begin work on implementing the eye-tracking software. My main goal with the eye-tracking software next week is to identify which major section of the screen the user is looking at, such as breaking the screen into four equally sized quadrants and determining which one the user’s gaze falls in; a sketch of that quadrant check is shown below. Then, in later weeks, I will make the eye-tracking precise enough to identify the user looking at the different commands displayed on the screen. Additionally, if the Adafruit 412 solenoids arrive, I will begin testing them with the team to ensure our design works and that the solenoids will be able to press the keys of a piano.
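The quadrant check itself can be very small. The sketch below assumes the eye-tracking step already produces a gaze estimate normalized to [0, 1] in both axes; the function name and the 0.5 thresholds are mine, for illustration only.

    # Hypothetical helper: map a normalized gaze estimate to one of four screen quadrants.
    def gaze_to_quadrant(x_norm, y_norm):
        horizontal = "left" if x_norm < 0.5 else "right"
        vertical = "top" if y_norm < 0.5 else "bottom"
        return f"{vertical}-{horizontal}"

    # Example: a gaze estimate slightly right of center and near the top of the screen.
    print(gaze_to_quadrant(0.62, 0.20))  # -> "top-right"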

Next week, the Design Report will also be completed.

 

[1] “412_Web.” Adafruit Industries, 12 Oct. 2018. https://mm.digikey.com/Volume0/opasdata/d220001/medias/docus/21/412_Web.pdf