Peter’s Status Report for 12/07/2024

This Week

This week was spent creating the mount that holds the solenoids in the case. In the first iteration (not pictured), the casing at the base was too low, so the walls were extended upward by 3mm. The mounting holes in the back were also too low, so they were moved up 2mm; when printed, this matched the solenoid perfectly.

 Figure 1: Solenoid Bracket

 

The next step, which will be done Sunday, is to space 13 of these brackets so they line up with the piano’s keys and to attach them all in one case.

 

Next Week

Next week will be spent finalizing the case, doing testing, and completing the deliverables for the Capstone Course.

Peter’s Status Report for 11/30/2024

Week Nov 17-23.

Using OpenCV to capture frames and pass them into a Mediapipe function that tracks the user’s irises, I was able to create a basic gaze-tracker that estimates where the user is looking on a computer screen based on the location of their irises. In Figure 1, the blue dot to the left of the head shows the estimated on-screen gaze location. Currently, the gaze-tracking has two main configurations. Configuration 1 provides precise gaze tracking but requires head movement to make up for the small movements of the eyes; Configuration 2 requires only eye movement, but the head must be kept impractically still and the estimates are jittery. To improve Configuration 2, head movement needs to be taken into account when calculating the user’s estimated gaze.
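For reference, a minimal sketch of this kind of capture-and-estimate loop is below. It assumes the legacy Mediapipe “solutions” FaceMesh API with refine_landmarks=True (which adds the iris landmarks) and simply draws the averaged iris position on the frame; the window names and the mapping to screen coordinates are illustrative, not our calibrated implementation.

```python
import cv2
import mediapipe as mp

# refine_landmarks=True extends the face mesh with iris landmarks;
# indices 468 and 473 are the two iris centers.
face_mesh = mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1, refine_landmarks=True)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Average the two iris centers into one normalized (0..1) point.
        ix = (lm[468].x + lm[473].x) / 2
        iy = (lm[468].y + lm[473].y) / 2
        # Draw the estimate on the frame for live feedback; mapping it to
        # actual screen coordinates would require per-user calibration.
        h, w = frame.shape[:2]
        cv2.circle(frame, (int(ix * w), int(iy * h)), 8, (255, 0, 0), -1)
    cv2.imshow("gaze estimate", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```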

 

Figure 1: Mediapipe gaze-tracker

 

Week Nov 24-30.

I tested the accuracy of Fiona’s integration of the UI with the eye-tracking. To test the current gaze-tracking implementation in Configuration 1 and Configuration 2, we looked at each command twice; if the system did not correctly identify a command, I kept trying until the intended command was selected. Under this method, Configuration 1 had precise control and 100% accuracy. Configuration 2 reached 89.6% accuracy in testing, but its gaze estimate was very jittery, making it difficult to feel confident in the cursor’s movements, and the user’s head had to be kept too still to be practical for widespread use. Ideally, the user should only need to move their eyes for their gaze to be tracked. As a result, the eye-tracking will be updated next week to take head movement into account, which should make the gaze estimate smoother.
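As an illustration of how a percentage like this can be computed (the helper and data below are made up, not our recorded trials): every selection attempt counts toward the total, and only the intended selections count as correct.

```python
# Hypothetical tally for the selection test described above. Each entry is
# the number of attempts a trial took before the intended command was
# selected; accuracy is the share of attempts that hit the intended command.
def selection_accuracy(attempts_per_trial):
    intended = len(attempts_per_trial)  # one correct selection per trial
    total = sum(attempts_per_trial)     # includes misses before the success
    return 100.0 * intended / total

print(selection_accuracy([1, 1, 2, 1]))  # -> 80.0 (one retry in four trials)
```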

 

Tools for Learning New Knowledge

Mediapipe and OpenCV are new to me. To get comfortable with these libraries, I read the online Mediapipe documentation and followed several online video tutorials. Through these tutorials, I discovered applications of the Mediapipe library functions that were useful for my implementation.

 

This Week

This week, I hope to complete the eye-tracking, taking head movement into account to make what we are currently calling Configuration 2 a more viable solution.
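As a rough sketch of the idea (a possible approach, not the finished implementation): measuring the irises relative to a stable head landmark cancels head translation, and an exponential moving average damps the jitter. The landmark indices assume Mediapipe FaceMesh with refine_landmarks=True (index 1 is roughly the nose tip, 468/473 the iris centers); the gain, smoothing factor, and one-point calibration are placeholder choices.

```python
ALPHA = 0.3  # EMA smoothing factor in (0, 1]; smaller = smoother but laggier
GAIN = 8.0   # amplifies small eye-in-head movements; would need calibration

def gaze_estimate(landmarks, neutral, prev=None):
    """Estimate a normalized (0..1) screen point from face-mesh landmarks.

    neutral: iris-to-nose offset (dx, dy) captured once while the user looks
    at the screen center, so only deviations from it move the cursor.
    prev: the previous smoothed estimate, if any.
    """
    nose = landmarks[1]
    ix = (landmarks[468].x + landmarks[473].x) / 2
    iy = (landmarks[468].y + landmarks[473].y) / 2
    # Relative to the nose, the iris position is (roughly) invariant to
    # head translation, so mostly eye movement remains.
    dx, dy = ix - nose.x, iy - nose.y
    raw = (0.5 + GAIN * (dx - neutral[0]),
           0.5 + GAIN * (dy - neutral[1]))
    if prev is None:
        return raw
    # Exponential moving average damps frame-to-frame jitter.
    return (ALPHA * raw[0] + (1 - ALPHA) * prev[0],
            ALPHA * raw[1] + (1 - ALPHA) * prev[1])
```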

Peter’s Status Report for 11/16/2024

This week Fiona and I met with Professor Savvides’s staff member, Magesh, to discuss how we would develop the eye-tracking using computer vision. Magesh gave us the following implementation plan.

  • Start with an OpenCV video feed
  • Send frames to the Mediapipe Python library
  • Mediapipe returns landmarks
  • From the landmarks, select the points that correspond to the eye region
  • Determine whether the user is looking up, down, left, or right
  • Draw a point on the video feed to show where the software thinks the user is looking, so there is live feedback

Drawing a point on the video feed will serve to verify that the software is properly tracking the user’s irises and correctly mapping their gaze to the screen.

So far, I have succeeded in getting the OpenCV video feed to appear. I am currently debugging to get a face mesh to appear on the video feed, using Mediapipe, to verify that the software is tracking the irises properly. I am using Google’s Face Landmark Detection Guide for Python to help me implement this software [1]. Once I am able to verify this, I will move on to using the face landmarks to estimate where on the screen the user’s gaze falls, and return coordinates to draw a point at the expected gaze location.
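A minimal version of that verification step might look like the sketch below. It uses the legacy Mediapipe “solutions” API rather than the newer tasks API from the guide, but either is enough to confirm visually that the mesh is tracking the face and eye region.

```python
import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh
drawing = mp.solutions.drawing_utils

# Overlay the face-mesh tessellation on the webcam feed so landmark
# tracking (including the eye region) can be checked by eye.
with mp_face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True) as mesh:
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        for face in results.multi_face_landmarks or []:
            drawing.draw_landmarks(
                frame, face, mp_face_mesh.FACEMESH_TESSELATION)
        cv2.imshow("face mesh check", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()
```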

 

Resources

[1] Google AI for Developers. (2024, November 4). Face Landmark Detection Guide for Python. Google. https://ai.google.dev/edge/mediapipe/solutions/vision/face_landmarker/python

Peter’s Status Report for 11/02/2024

This Week

This week was supposed to be the week that basic eye-tracking was implemented; it has not yet been. Much of the week was spent adjusting the eye-tracking software to get it to detect eyes properly. However, the current method is not very reliable at detecting eyes in a regularly lit environment with overhead lights. Forward lighting might help, but it would be an odd addition given our desire for an easy setup with few parts. It may instead be worth pivoting to an eye-tracking camera to support the eye-tracking system, which would allow our software to be more accurate.

 

Next Week

I will do more research on eye-tracking cameras and discuss this pivot with our faculty and TA advisors. If we are in agreement, I will pivot to using an eye-tracking camera to improve the accuracy of our system; currently, our system cannot consistently track both eyes.

Peter’s Status Report for 10/26/2024

This Week

This week was mainly spent taking measurements for and modeling the 3D case for our product (see Figure 1). Part of this involved modeling the Adafruit 412 solenoids in SolidWorks (see Figure 2), since models could not be found online. The solenoid’s travel length was initially a concern, since it is only 4mm while a keyboard’s keys have a travel length of 10mm. However, by positioning the solenoids on the keyboard such that the keys are always depressed by 5mm, the 4mm stroke brings a key to 9mm of its 10mm travel, which is deep enough to activate it. In short, the ability of the Adafruit 412 solenoids to depress the piano keys enough to activate them is not a concern.

 

Figure 1: Solenoid case

 

Figure 2: Adafruit 412 3D model

 

Next Week

Next week, if the rest of the materials arrive, I will test our solenoid circuitry with Shravya. Otherwise, I will focus solely on implementing the eye-tracking software.

Peter’s Status Report for 10/20/2024

This Week

This week was spent working on the design report. Beyond the design report, I have begun working on the 3D printed solenoid case.

Fall Break

I started doing further research into how to implement eye-tracking. I was planning on completing more over Fall Break, but I was occupied with other work.

Next Week

Currently my progress is behind schedule. To catch up, I plan on focusing on eye-tracking implementation next week and, if the solenoids have arrived, testing the solenoids to ensure they work with our design. If they do, we will order enough for the full octave. I hope to have a rough eye-tracking implementation completed by the end of next week. I will pause work on the 3D printed solenoid case until after the solenoids have been confirmed to work for our design.

Peter’s Status Report for 10/05/2024

This Week

The majority of this week was spent preparing for the design review presentation. As part of this preparation, Shravya and I went over our designs for the solenoid control circuit and concluded that her design, which uses an NMOS for low-side switching, would be best. With a PMOS, the circuit could be shifted to work as a high-side switch, but we decided to go with circuitry more closely resembling the low-side switch presented in the Adafruit 412 design document [1].

Shravya and I also reviewed our designs for power regulation. We decided to use a 12V DC power supply adaptor that plugs into a wall outlet as the main source of power for our solenoids and the Nucleo32 board. We will also use a 3.3V regulator to step the 12V down to 3.3V as an input to the Nucleo32 board.

Currently, my progress is behind schedule. The eye-tracking implementation that identifies which major section of the screen the user is looking at (the imprecise eye-tracking implementation) has not been developed yet. Additionally, the design for the 3D printed solenoid case, which should have been completed to make up for the solenoids not being testable this week, has not been made either. To make up for this, the case design will be completed on Sunday (October 6th, 2024), and the imprecise eye-tracking implementation will be worked on this upcoming week. If it is not completed this upcoming week, I will continue working on it over Fall Break (October 13th to October 19th, 2024).

 

Next Week

This upcoming week, I will complete the first design for the 3D printed solenoid case and begin implementing the eye-tracking software. My main goal for the eye-tracking software next week is to identify which major section of the screen the user is looking at, for example by breaking the screen into four equally sized quadrants and determining which one the user’s gaze falls in; a sketch of this idea follows below. In later weeks, the eye-tracking will be made precise enough to identify the user looking at the different commands displayed on the screen. Additionally, if the Adafruit 412 solenoids arrive, I will begin testing them with the team to ensure our design works and that the solenoids are capable of pressing the keys of a piano.
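As an illustrative sketch of that first milestone (placeholder logic, assuming the gaze estimate is already normalized to 0..1 in each axis):

```python
# Map a normalized gaze estimate (origin at the top-left of the screen)
# to one of four equally sized quadrants.
def quadrant(gaze_x, gaze_y):
    horizontal = "left" if gaze_x < 0.5 else "right"
    vertical = "top" if gaze_y < 0.5 else "bottom"
    return f"{vertical}-{horizontal}"

print(quadrant(0.2, 0.8))  # -> "bottom-left"
```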

Next week, the Design Report will also be completed.

 

[1] “412_Web.” Adafruit Industries, 12 Oct. 2018. https://mm.digikey.com/Volume0/opasdata/d220001/medias/docus/21/412_Web.pdf

Peter’s Status Report for 09/28/2024

This Week

The majority of this week was spent researching solenoids and finding an affordable one that fits our design requirements. I determined that we need a push solenoid with a stroke length over 4mm (the travel depth of a membrane keyboard’s keys) to ensure notes can be fully depressed, with a duty cycle close to 100%, and without being too expensive (preferably under $15 each), since we will need 13 solenoids for the final product. The solenoid with part number 412 from Adafruit Industries is documented to fulfill all of these requirements, with a 100% duty cycle and a cost of $7.50 each (before tax and shipping). A purchase order for 2 Adafruit 412 solenoids was placed on Wednesday so that we can test whether they meet our requirements.

Additionally, I created a more detailed block diagram for the hardware aspects of the product (see Figure 1), and a schematic implementing the 13 solenoids (see Figure 2), based on the Adafruit 412 documentation [1] and on last week’s report, where it was discussed that a PMOS’s gate may be used as an enable line.


Figure 1: Hardware Block Diagram

Figure 2: Solenoids Schematic

While I am currently on track, the delivery date of the solenoids may delay testing and further development of our design. If this occurs, I can begin work on the 3D model for the solenoid case early. Additionally, Shravya and I could go over our designs together and model how they will interact, so we are better prepared for when the solenoids arrive.

 

Next Week

In the coming week, I plan to test the solenoids with Shravya and to test our circuit designs. Additionally, I want to spend more time looking over Shravya’s solenoid circuit design, which uses an NMOS as a low-side switch in a common-source configuration, and compare it to the PMOS design in Figure 2.

 

[1] “412_Web.” Adafruit Industries, 12 Oct. 2018. https://mm.digikey.com/Volume0/opasdata/d220001/medias/docus/21/412_Web.pdf

Peter’s Status Report for 09/21/2024

This Week

The first half of this week was spent preparing for the proposal presentation and working together as a team to ensure we knew our product well going into the presentation. Post-presentation, we altered our Gantt chart to put me in charge of researching solenoids and solenoid control. I have learned that some solenoids only pull, while others both push and pull, aptly referred to as push-pull solenoids. Push-pull solenoids are the type we will be using for our project, as we require solenoids that can push the piano keys down when powered.

The GPIO pins of an STM32 can output 3.3V, but a push-pull solenoid can require 12V. So, in order to control a solenoid with GPIO pins from an STM32 chip, as we plan to do, the STM32’s 3.3V GPIO output will need to act as an enable line. One way to do this may be to use a PMOS’s gate as the enable line.

 

Next Week

Schematics will be made for the solenoid control system, along with calculations to justify the design. These should be completed by the end of this upcoming week (the week of 09/22/2024). I will also speak with Shravya about her plans for power management to ensure that our designs will work well together.

As a team, we will be working together on the design presentations.

I am currently on schedule and am prepared to allot more time in the coming week to designing the solenoid control system.