What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).
I was sick for the majority of this week and had to visit the hospital, which hindered the amount of work I was able to do. That being said, I was still able to complete the majority of my tasks.
I researched how to implement the through-board LED system. I thought of using a MAX7219 multiplexing LED chip, but after our faculty meeting and our pivot to magnetically controlled chess pieces, I realized this may actually be more complex than anticipated, as the electromagnet may damage components of this circuit when moving pieces. As this is a quality-of-life feature, not a critical element of our design, I set it aside and focused on other tasks this week.
I ordered the Arduino Mega 2560 for the team, as well as cables to connect the devices. I also designed the communication protocol between the Arduino and the Jetson. The Jetson will communicate with the Arduino over UART, sending messages as fast as the model can estimate gaze, in the following format:
```
{row}{col}\n
```
The Arduino will only parse a line after seeing a newline character, so that message boundaries can be detected reliably. We avoid acting on fragments by discarding any buffered input that does not end in a newline. The Arduino will select a move origin (or destination) as soon as it receives a complete coordinate message AND the user has engaged the lock-in mechanism (button/pedal).
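As a rough illustration, here is a minimal Python sketch of the Jetson-side sender, along with a reference implementation of the framing rule the Arduino firmware will mirror. The port path, baud rate, and function names are placeholders, not final design choices.

```python
import serial  # pyserial

# Port path and baud rate are placeholders, not final design choices.
ser = serial.Serial("/dev/ttyACM0", baudrate=115200, timeout=0.1)

def send_gaze(row: int, col: int) -> None:
    """Send one newline-terminated '{row}{col}' message as soon as the
    model produces a gaze estimate."""
    ser.write(f"{row}{col}\n".encode("ascii"))

def parse_message(line: bytes):
    """Reference for the Arduino-side framing rule: accept only a complete,
    newline-terminated two-digit coordinate; drop fragments and bad input."""
    if not line.endswith(b"\n"):
        return None  # fragment without a terminator: discard
    body = line[:-1]
    if len(body) == 2 and body.isdigit():
        return int(body[:1]), int(body[1:])
    return None  # malformed message: discard
```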
I also researched libraries and implementation methods in case gaze estimation at our required accuracy is not possible and we are forced to pivot to having the user look at a screen. In that case, we would implement the eye tracking and UI in Python. The eye tracking would most likely be powered by the EyeGestures library, which would take live webcam input from the user's desktop machine and output the on-screen coordinate the user is looking at. I would write a simple Python applet that displays a live bird's-eye view of the board using the OpenCV Python library (OpenCV here would only handle video capture and display; it would not be doing any actual computer vision in this version of our project), then overlay the EyeGestures coordinate on the screen using tkinter to show where the user is looking. EyeGestures can also capture blinks, which the user would use to lock in a move.
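Here is a minimal sketch of that fallback loop. Since I have not yet confirmed the exact EyeGestures API, get_gaze_point() below is a placeholder for it, and the gaze marker is drawn directly on the OpenCV frame rather than through the eventual tkinter overlay, just to keep the sketch short.

```python
import cv2

def get_gaze_point(frame):
    """Placeholder: would call EyeGestures on the webcam frame and return
    the (x, y) screen coordinate plus whether a blink was detected."""
    h, w = frame.shape[:2]
    return (w // 2, h // 2), False  # dummy center point, no blink

cap = cv2.VideoCapture(0)  # webcam on the user's desktop machine
while True:
    ok, frame = cap.read()
    if not ok:
        break
    (x, y), blinked = get_gaze_point(frame)
    # In the real applet this frame would be the bird's-eye board feed,
    # with the gaze coordinate overlaid on top of it.
    color = (0, 0, 255) if blinked else (0, 255, 0)
    cv2.circle(frame, (x, y), 12, color, 2)
    cv2.imshow("board", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```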
Finally, I worked on the design presentation, which I will be giving unless my Monday-morning doctor's appointment runs long, in which case Trey will give the presentation.
Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?
My progress is mostly on schedule. My illness and the pivot to moving the pieces with magnets meant that I was not able to put enough time into finalizing the design of the through-board LED UI subsystem. I will set it aside for the time being and focus on the mechanical aspects of our system until the design report is due. Once those are working, I can return to quality-of-life improvements.
What deliverables do you hope to complete in the next week?
Next week we will give our design presentation and receive the components we ordered. My main goal for next week is to set up the Arduino and figure out how to use it to control the stepper motors.