Tarek’s Status Report for 4/12

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

I got a great deal done this week. I expanded the Embedded Controller subsystem with several new libraries, all available on our organization’s GitHub repo.

I added the Magnet, Gantry, LimitSwitch, and ChessGantry libraries, and I am almost done with the GameState library. The Magnet library controls the electromagnet, and the Gantry library builds on the StepperMotor library to position the gantry along two axes. The LimitSwitch library detects when a limit switch is pressed, which we use for calibration. The ChessGantry library builds on and abstracts the Gantry class by providing an interface in terms of chess squares rather than positions in inches; it can also execute a move using the following code:

// Move to the origin square center and pick up piece
go_to_square(origin_row, origin_col);
go_to_square_center();
_magnet.on();

// Return to square bottom-right corner
go_to_square(origin_row, origin_col);

// Move to the destination square center and drop off piece
go_to_square(dest_row, dest_col);
go_to_square_center();
_magnet.off();
delay(500); // Allow time for magnet to turn off

// Return to gantry origin
go_to_origin();
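For the calibration piece, the LimitSwitch library is what the homing routine relies on; a minimal sketch of the idea (the function and method names below are illustrative, not the exact library API):

// Minimal homing sketch (names are illustrative, not the exact library API).
// The axis steps toward its limit switch until the switch is pressed, and that
// point is then treated as the zero position for calibration.
void home_axis(StepperMotor &motor, LimitSwitch &limit_switch) {
  while (!limit_switch.is_pressed()) {  // hypothetical LimitSwitch method
    motor.step(-1);                     // hypothetical: one step toward the switch
  }
  motor.set_position(0);                // hypothetical: zero out the axis position
}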

I also added some Arduino sketches that take in user input over serial and call the corresponding library functions, which I used to test out these libraries.
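These test sketches all follow the same pattern; a rough sketch of the idea (the command names and parsing here are made up for illustration, not the exact sketches in the repo):

// Rough sketch of the serial test harnesses (command names/format are illustrative only).
// Assumes a ChessGantry instance named gantry and a Magnet instance named magnet are
// constructed in setup(); reads one command per line and calls the matching function.
void loop() {
  if (Serial.available()) {
    String cmd = Serial.readStringUntil('\n');
    cmd.trim();
    if (cmd.startsWith("goto ")) {
      // e.g. "goto 3 4" -> move to row 3, column 4 (single-digit indices assumed)
      int row = cmd.charAt(5) - '0';
      int col = cmd.charAt(7) - '0';
      gantry.go_to_square(row, col);
    } else if (cmd == "mag on") {
      magnet.on();
    } else if (cmd == "mag off") {
      magnet.off();
    }
  }
}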

I also wrote a Python script that takes in the user’s voice and detects chess moves, e.g. “A1 to H8”. This is the worst-case backup in case we are unable to get gaze detection working for the final demo.

I also 3D printed some braces for the gantry to ensure it is perfectly squared. In addition, I designed and 3D printed some pawns, a queen, and a rook with a hole for the magnet at the bottom (see attached picture). These were used to test out the electromagnet.

Finally, today I helped Trey assemble the enclosure (everything except for the lid).

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am right on schedule.

What deliverables do you hope to complete in the next week?

The main tasks I have left are to write the short library that parses serial input from the Jetson (or from the speech-to-text Python script as a backup), and to wire up the LEDs and write the code for them. After that, it will all be testing.
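The parsing library itself should be short; a minimal sketch of the idea, assuming the message looks like “A1 to H8” (the actual format from the Jetson is not finalized yet):

// Minimal move-parsing sketch (the "A1 to H8" message format is an assumption,
// not the finalized protocol with the Jetson). Converts algebraic squares into
// the row/column indices used by the ChessGantry library.
bool parse_move(const String &msg, int &from_row, int &from_col, int &to_row, int &to_col) {
  if (msg.length() < 8) return false;     // expect "A1 to H8"-style messages
  from_col = toupper(msg.charAt(0)) - 'A';
  from_row = msg.charAt(1) - '1';
  to_col   = toupper(msg.charAt(6)) - 'A';
  to_row   = msg.charAt(7) - '1';
  return from_row >= 0 && from_row < 8 && from_col >= 0 && from_col < 8 &&
         to_row >= 0 && to_row < 8 && to_col >= 0 && to_col < 8;
}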

How will you analyze the anticipated measured results to verify your contribution to the project meets the engineering design requirements or the use case requirements?

So far, I have tested and calibrated stepper motor and gantry movement to verify that the distance the gantry is commanded to move matches how far it actually moves. When calibrated, any given gantry move lands within ±1% of the commanded position in inches. I have also written Arduino sketches to verify some of the smaller parts of my subsystem, such as the limit switches and magnet, and these work as intended. As soon as I am done writing the GameState library, I will write an Arduino sketch and play 10 chess games on it, validating after each move function call that the stored game state matches the actual board. Once we finalize the board this week, I will also be able to validate the ChessGantry library’s movement by using it to move a small magnet across the top of the board to a number of specified positions and measuring the error. Finally, I will integrate the Arduino with the Jetson/speech-to-text script and validate that UART connection by measuring the difference between the message-sent and message-received timestamps.
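For the 10-game test, the plan is roughly the pattern below; the Move type and GameState method names are placeholders, since the library is not finished yet:

// Rough sketch of the planned game-state test (Move and the GameState methods
// shown here are placeholders, not the final API). For each move of a recorded
// game, apply it and print the resulting board over serial so it can be checked
// against the known-correct position.
void run_game_test(GameState &state, const Move moves[], int num_moves) {
  for (int i = 0; i < num_moves; i++) {
    state.apply_move(moves[i]);   // placeholder API
    state.print_board(Serial);    // placeholder API: dump the board for comparison
  }
}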

Liam’s Status Report 4/12/25

Personal Accomplishments

I was able to accurately determine where a user is located in space by calculating the angle from the camera and knowing the depth at that point. I also quickly made a small web app to visually debug the gaze estimation. While testing this out, I realized that where the camera is located relative to the user plays a big role in the estimation. At some depths it becomes hard to calculate the gaze direction when the user looks to the right, or the error becomes very large compared to the true gaze direction. I spent the last couple of days reading research papers and found a better dataset, ETH-XGaze, with around 5 degrees of angular error compared to the roughly 10 degrees from Gaze360. The positional error could decrease from around 5.51 to 2.47 inches. I already have a model trained on the new dataset working with my webcam, so I would just have to integrate it with the stereo camera software.
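For context, the geometry works roughly like this (the code below only illustrates the math, not the actual pipeline): the stereo camera gives a pixel location plus a depth, which back-projects to the eye’s 3D position; the model gives a pitch/yaw gaze direction; and the predicted look-at point is where that ray hits the board plane. Because the look-at point is an intersection with a plane some distance away, the positional error scales roughly with distance times the tangent of the angular error, which is why halving the angular error roughly halves the positional error.

// Illustrative sketch of the gaze geometry only; not the actual pipeline code.
#include <cmath>

struct Vec3 { float x, y, z; };

// Back-project a pixel (u, v) and depth into a 3D eye position using camera intrinsics.
Vec3 eye_position(float u, float v, float depth, float fx, float fy, float cx, float cy) {
  return { (u - cx) * depth / fx, (v - cy) * depth / fy, depth };
}

// Turn pitch/yaw (radians) into a unit gaze direction (sign conventions vary by model).
Vec3 gaze_direction(float pitch, float yaw) {
  return { std::cos(pitch) * std::sin(yaw), std::sin(pitch), std::cos(pitch) * std::cos(yaw) };
}

// Intersect the gaze ray with the board plane, taken here to be z = 0.
Vec3 board_intersection(const Vec3 &eye, const Vec3 &dir) {
  float t = -eye.z / dir.z;   // assumes dir actually points toward the plane
  return { eye.x + t * dir.x, eye.y + t * dir.y, 0.0f };
}

// Worst-case positional error on the board for a given angular error at a given distance.
float positional_error(float gaze_distance, float angular_error_rad) {
  return gaze_distance * std::tan(angular_error_rad);
}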

Progress

If I can’t get the gaze estimation to work more accurately in the next day or two, we will have to consider switching to something like a screen to simplify the problem.

Verification:

Since putting LEDs on the board remains uncertain, I anticipate that verification for my subsystem will require significant manual testing. I’ll need to recruit users or volunteers to simulate gameplay scenarios. The process would involve instructing participants to look at specific squares, then using the previously mentioned web app to confirm their gaze is correctly detected. With LEDs implemented, this verification process could be more automated.

Future Deliverables

Switch to ETH-XGaze dataset

Switch over to Jetson