Team Status Report for 11/16/2024

This week, our team prepared for the interim demo that takes place next week. We met with our advisors, Professor Bain and Joshna, to receive advice on what to work on before our demo. Peter and Fiona also met with a grad student to discuss the eye-tracking implementation. 

Currently, our plan for testing is mostly unchanged; however, it depends on the integration of our subsystems, which has not yet happened. This is why one of our biggest current risks is running into major issues during integration.

To recap, our current plan for testing is as follows:

  • For the software (user interface and eye-tracking software): We will run standardized tests of different series of commands on multiple users. Testing with different users lets us vary parameters such as face shape/size, response time, musical knowledge, and familiarity with the UI. We plan for these tests to cover a range of scenarios, such as different expected command responses (on the backend) and different distances and/or times between consecutive commands. We also plan to test edge cases in the software, such as the user moving out of camera range, or a user attempting to open a file that doesn't exist.
    • Each test will be video-recorded, and the eye-commands recognized by the backend will be written to a file so they can be compared against the recording, both for accuracy (goal: 75%) and latency (goal: 500 ms).
  • For the hardware (STM32 and solenoids): We will give different MIDI files to the firmware and microcontroller. As with the software testing, we plan to vary several parameters, including different tempos, note patterns, and the use of rests and chords. We also plan to stress-test the hardware with longer MIDI files to see whether there are compounding tempo or accuracy errors that cannot be observed when testing with shorter files.
    • To test the latency (goal: within 10% of the BPM) and accuracy (goal: 100%) of the hardware, we will record the piano output produced by the hardware's commands with a metronome in the background.
    • Power consumption (goal: ≤ 9W) of the hardware system will also be measured during this test.
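As a rough illustration of how the software comparison step could be scored, here is a minimal sketch. The log format (a command name paired with a measured latency) and the function name are assumptions for illustration, not our actual file format.

```python
# Sketch: score one logged test run against the expected command sequence.
# The (command, latency_ms) log format is hypothetical.

def score_run(expected, logged, latency_goal_ms=500):
    """Compare backend-recognized commands to the expected sequence.

    expected: list of command names, in order.
    logged: list of (command, latency_ms) tuples from the backend log.
    Returns (accuracy, fraction of commands meeting the latency goal).
    """
    correct = sum(1 for exp, (got, _) in zip(expected, logged) if exp == got)
    accuracy = correct / len(expected) if expected else 0.0
    on_time = sum(1 for _, ms in logged if ms <= latency_goal_ms)
    timeliness = on_time / len(logged) if logged else 0.0
    return accuracy, timeliness

expected = ["play", "pause", "save", "open"]
logged = [("play", 320), ("pause", 480), ("stop", 610), ("open", 450)]
acc, timely = score_run(expected, logged)  # acc = 0.75, timely = 0.75
```

A run passes when both numbers meet the stated goals; the per-command pairs also tell us which specific commands are being misrecognized.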
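The hardware tempo check above could be computed along these lines: given note-onset times extracted from the recording (against the metronome), verify the measured tempo stays within 10% of the BPM implied by the MIDI file. This is a sketch under that assumption; the function name and onset values are illustrative.

```python
# Sketch: check measured tempo against the MIDI file's nominal BPM.
# Onset times are in seconds and assumed to be quarter-note onsets.

def tempo_within_tolerance(onsets, bpm, tolerance=0.10):
    """Return (measured_bpm, ok) for a list of quarter-note onset times."""
    if len(onsets) < 2:
        raise ValueError("need at least two onsets to measure tempo")
    intervals = [b - a for a, b in zip(onsets, onsets[1:])]
    avg_period = sum(intervals) / len(intervals)   # seconds per beat
    measured_bpm = 60.0 / avg_period
    ok = abs(measured_bpm - bpm) <= tolerance * bpm
    return measured_bpm, ok

# A 120 BPM file should produce onsets roughly 0.5 s apart.
onsets = [0.00, 0.51, 1.02, 1.50, 2.01]
measured, ok = tempo_within_tolerance(onsets, 120)  # ok is True
```

Running the same check on a long file's early and late sections would also expose the compounding drift we want the stress tests to catch.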

We have also defined a new test for evaluating accessibility: after setting up the software and hardware, we will verify that every command we make available to the user can be performed without the mouse or keyboard. One edge case we will test at this stage is improper use, such as attempting to send a composition to the solenoid system while the hardware is not connected to the user's computer. This should not crash the program; instead, the error should be handled within the application, allowing the user to correct their use and continue using only eye-commands.
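The "hardware not connected" case above might be handled along these lines: catch the connection failure and return a message the interface can display, rather than letting the exception propagate and crash the app. `open_solenoid_link` is a hypothetical stand-in for our real serial-connection code, not an actual function in our codebase.

```python
# Sketch: keep a missing device from crashing the app. The connection
# function below is an illustrative stand-in that always fails.

def open_solenoid_link(port):
    # Stand-in: the real version would open a serial link to the STM32.
    raise ConnectionError(f"no device on {port}")

def send_composition(midi_bytes, port="COM3"):
    """Try to send a composition; surface errors as in-app status text."""
    try:
        link = open_solenoid_link(port)
        link.write(midi_bytes)
        return "sent"
    except ConnectionError:
        return "Hardware not connected: plug in the solenoid system and retry."

status = send_composition(b"MThd...")
```

Because the failure comes back as a status string rather than an unhandled exception, the UI can show it and keep accepting eye-commands, which is exactly what the accessibility test checks.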
