Sanjana’s Status Report for 12/2

This week saw significant progress on each subsystem. The frontend is now fully functional, interfacing with the backend to receive updates and render a real-time cursor using Asynchronous JavaScript and XML (AJAX). Getting AJAX working was a pivotal breakthrough, because alternatives like WebSockets proved incompatible with our codebase. Had AJAX not worked out, I would have had to refactor the entire codebase to use React and state hooks for the real-time component.
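The polling approach can be sketched roughly as follows. This is only an illustration, not our exact frontend code: the `/cursor` endpoint name, the `{ x, y }` response shape, and the 100 ms interval are all assumptions.

```javascript
// Pure helper: convert a server update into CSS-style coordinates.
function cursorStyle(update) {
  return { left: `${update.x}px`, top: `${update.y}px` };
}

// AJAX-style polling: fetch the latest cursor position from the
// backend on a fixed interval and reposition the cursor element.
function startCursorPolling(elementId, intervalMs = 100) {
  return setInterval(async () => {
    const res = await fetch("/cursor");   // hypothetical endpoint
    const update = await res.json();      // e.g. { x: 120, y: 45 }
    const el = document.getElementById(elementId);
    Object.assign(el.style, cursorStyle(update));
  }, intervalMs);
}
```

Polling trades a little latency for simplicity: unlike WebSockets, it needs no persistent connection and works with any plain HTTP backend.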

We encountered a major challenge this week: the PvRecorder library has a native sampling rate of 16000 Hz, while we wanted to sample the audio at 48100 Hz. We spent over 8 hours changing and debugging PvRecorder's source files without any luck, and while trying other fixes we accidentally wiped Caleb's laptop. That setback cost us more than a day recovering code and files, reconstructing environments, and reinstalling dependencies. Despite the lost time, we have put in many hours and regained the lost progress.

The audio subsystem now produces less noisy chroma vectors, without the fifth-harmonic interference we dealt with last week; the fix came from further adjusting the microphone settings rather than the sampling rate. To keep developing without Caleb's laptop, we installed everything needed to run the audio subsystem and the frontend entirely from Rohan's computer, since Sanjana's Mac does not support the Tobii Eye Tracker. Although running on two distinct laptops brings operational challenges, we concluded that maintaining two fully functional systems is the safest approach after this week's incident.
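For reference, one alternative to patching PvRecorder's source would have been to upsample its native-rate frames in software. This is only a sketch of that approach, not what we ultimately did (our fix was to microphone settings); linear interpolation is the simplest, if lowest-quality, resampling method:

```javascript
// Upsample a PCM frame from srcRate to dstRate by linear
// interpolation, e.g. 16000 Hz -> 48100 Hz. Each output sample
// is blended from the two nearest input samples.
function resample(samples, srcRate, dstRate) {
  const n = Math.floor((samples.length * dstRate) / srcRate);
  const out = new Array(n);
  for (let i = 0; i < n; i++) {
    const pos = (i * srcRate) / dstRate;       // position in source
    const lo = Math.floor(pos);
    const hi = Math.min(lo + 1, samples.length - 1);
    const frac = pos - lo;
    out[i] = samples[lo] * (1 - frac) + samples[hi] * frac;
  }
  return out;
}
```

A production pipeline would use a proper low-pass filtered resampler to avoid interpolation artifacts, but a sketch like this avoids touching the recorder's native code at all.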

I also assisted Rohan with the eye-tracking subsystem. We made several updates to the algorithm that add functionality and, with a little more work and testing, could let it stand alone as its own subsystem. In addition to last week's override buttons, we implemented line tracking to determine which line a user is looking at. From there, we linearly extrapolate the predicted cursor speed from the last recorded speed of eye movement across a line. We are now testing whether this information is enough to update the cursor from the eye-tracking subsystem alone.
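The two ideas above can be sketched as small helpers. The line geometry (`lineTop`, `lineHeight`) and the gaze-sample shape `{ x, t }` are illustrative assumptions, not our actual data format:

```javascript
// Line tracking: map a gaze y-coordinate to a line index, given
// the top of the first line and a uniform line height in pixels.
function lineIndex(gazeY, lineTop, lineHeight) {
  return Math.max(0, Math.floor((gazeY - lineTop) / lineHeight));
}

// Horizontal eye speed (px/ms) from the last two gaze samples.
function estimateSpeed(prev, curr) {
  return (curr.x - prev.x) / (curr.t - prev.t);
}

// Linear extrapolation: predict the cursor's x-position at time
// tNow by carrying the last observed speed forward.
function predictX(curr, speed, tNow) {
  return curr.x + speed * (tNow - curr.t);
}
```

Linear extrapolation assumes reading speed stays roughly constant across a line, which is why we are validating it against real gaze data before letting it drive the cursor on its own.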

In summary, this week took an exhaustive commitment of hours, but it drove the project toward completion with the successful integration of the subsystems.
