Tarek’s Status Report for 4/12

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

I got a great deal done this week. I expanded the Embedded Controller subsystem with a few more libraries, all available on our organization’s GitHub repo.

I added the Magnet, Gantry, LimitSwitch, and ChessGantry libraries, and I am almost done with the GameState library. The Magnet library controls the electromagnet. The Gantry library builds on the StepperMotor library to allow positioning of the gantry along both axes. The LimitSwitch library detects when a limit switch is being pressed (used for calibration). The ChessGantry library builds on and abstracts the Gantry class by providing an interface in terms of chess squares rather than positions in inches; it can also execute a full move, as in the following code:

// Move to the origin square center and pick up the piece
go_to_square(origin_row, origin_col);
go_to_square_center();
_magnet.on();

// Return to the square's bottom-right corner so the piece
// travels along square boundaries, between the other pieces
go_to_square(origin_row, origin_col);

// Move to the destination square center and drop off the piece
go_to_square(dest_row, dest_col);
go_to_square_center();
_magnet.off();
delay(500); // Allow time for the magnet to turn off

// Return to the gantry origin
go_to_origin();

I added Arduino sketches that take user commands over serial and call functions from these libraries, to test them out.
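As a rough idea of what these sketches look like, here is a stripped-down example. The header names, pin number, constructor arguments, and command characters are placeholders rather than our actual wiring or interface; the library method names just mirror the snippet above.

// Stripped-down example of one of these serial test sketches.
// Header names, pins, constructors, and the command set are placeholders.
#include "Magnet.h"
#include "ChessGantry.h"

Magnet magnet(7);    // hypothetical electromagnet control pin
ChessGantry gantry;  // hypothetical default construction

void setup() {
  Serial.begin(9600);
  Serial.println("1 = magnet on, 0 = magnet off, g <row> <col> = go to square");
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    if (cmd == '1') {
      magnet.on();
    } else if (cmd == '0') {
      magnet.off();
    } else if (cmd == 'g') {
      int row = Serial.parseInt();
      int col = Serial.parseInt();
      gantry.go_to_square(row, col);
      gantry.go_to_square_center();
    }
  }
}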

I also wrote a Python script that takes in the user’s voice and detects chess moves, e.g. “A1 to H8”. This is the worst-case backup in case we are unable to get gaze detection working for the final demo.

I also 3D printed some braces for the gantry to ensure it is perfectly squared. In addition, I designed and 3D printed some pawns, a queen, and a rook with a hole for the magnet at the bottom (see attached picture). These were used to test out the electromagnet.

Finally, today I helped Trey assemble the enclosure (everything except for the lid).

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am right on schedule.

What deliverables do you hope to complete in the next week?

The main tasks I have left are to write the short library that parses serial input from the Jetson (or from the speech-to-text Python script as a backup), and to wire up the LEDs and write the code for them. After that, it will all be testing.
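For the parsing library, I expect something along these lines. The exact message format coming from the Jetson isn't finalized, so the "A1 to H8" style below (matching the speech-to-text backup) and the baud rate are assumptions:

// Sketch of the serial-parsing idea; the message format is an assumption.
// Parses a move like "A1 to H8" received over UART into row/col indices.
bool parse_move(const String &line, int &from_row, int &from_col,
                int &to_row, int &to_col) {
  int found = 0;
  int squares[2][2];
  for (unsigned int i = 0; i + 1 < line.length() && found < 2; i++) {
    char file = toupper(line[i]);
    char rank = line[i + 1];
    if (file >= 'A' && file <= 'H' && rank >= '1' && rank <= '8') {
      squares[found][0] = rank - '1'; // row index 0-7
      squares[found][1] = file - 'A'; // column index 0-7
      found++;
      i++; // skip the rank character we just consumed
    }
  }
  if (found != 2) return false;
  from_row = squares[0][0]; from_col = squares[0][1];
  to_row = squares[1][0]; to_col = squares[1][1];
  return true;
}

void setup() {
  Serial.begin(115200); // baud rate is a placeholder
}

void loop() {
  if (Serial.available() > 0) {
    String line = Serial.readStringUntil('\n');
    int fr, fc, tr, tc;
    if (parse_move(line, fr, fc, tr, tc)) {
      // hand the move off to the ChessGantry/GameState libraries here
    }
  }
}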

How will you analyze the anticipated measured results to verify your contribution to the project meets the engineering design requirements or the use case requirements?

So far, I have tested and calibrated stepper motor and gantry movement to verify that the distance the gantry is commanded to move matches how far it actually moves. When calibrated, any given gantry move lands within ±1% of the commanded position in inches. I have also written Arduino sketches to verify some of the smaller parts of my subsystem, such as the limit switches and magnet, and these work as intended. As soon as I am done writing the ChessState library, I will write an Arduino sketch and play 10 chess games with it, validating that the game state after each move function call is what it should be. Once we finalize the board this week, I will also be able to validate the ChessGantry library’s movement by using it to move a small magnet across the top of the board to a number of specified positions and measuring the error. Finally, I will integrate the Arduino with the Jetson/speech-to-text script and validate the UART connection by measuring the difference between message-sent and message-received timestamps.
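For the UART timing measurement, one simple approach (assuming we settle on newline-terminated messages) is to have the Arduino echo each message back immediately and let the Jetson-side script take half the round-trip time as an estimate of one-way latency; a minimal sketch of the echo side:

// Echo sketch for UART latency measurement; framing and baud rate are assumptions.
void setup() {
  Serial.begin(115200); // placeholder baud rate
}

void loop() {
  if (Serial.available() > 0) {
    String msg = Serial.readStringUntil('\n');
    Serial.println(msg); // echo back immediately so the sender can time the round trip
  }
}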

Liam’s Status Report 4/12/25

Personal Accomplishments

I was able to accurately determine where a user is located in space by calculating the angle from the camera and using the depth at that point. I also quickly made a small web app to visually debug the gaze estimation. While testing this out, I realized that the camera’s position relative to the user plays a big role in the estimation: at some depths it becomes hard to calculate the gaze direction when the user looks to the right, or the error becomes very large compared to the true gaze direction. I spent the last couple of days reading research papers and found a better model trained on the ETH-XGaze dataset, with around 5 degrees of angular error compared to around 10 degrees for Gaze360. The positional error could decrease from around 5.51 to 2.47 inches. I already have the new model working with my webcam, so I would just need to integrate it with the stereo camera software.
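For reference, the geometry here is essentially standard pinhole back-projection, plus the fact that angular error turns into positional error roughly in proportion to depth. A rough sketch of the math (written in C++ only for consistency with the other code in these reports; the real pipeline is Python, and the intrinsics are placeholders):

#include <cmath>

// Recover a 3D point from a pixel (u, v) and its depth using the pinhole model.
// fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
struct Point3D { double x, y, z; };

Point3D backproject(double u, double v, double depth,
                    double fx, double fy, double cx, double cy) {
  Point3D p;
  p.x = (u - cx) * depth / fx; // horizontal offset from the optical axis
  p.y = (v - cy) * depth / fy; // vertical offset
  p.z = depth;                 // distance straight out from the camera
  return p;
}

// Positional error grows with depth: roughly depth * tan(angular error),
// which is why halving the angular error roughly halves the positional error.
double positional_error(double depth, double angular_error_deg) {
  const double kPi = 3.14159265358979323846;
  return depth * std::tan(angular_error_deg * kPi / 180.0);
}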

Progress

If I can’t get the gaze estimation to work more accurately in the next day or two, we will have to consider switching to something like a screen to simplify the problem.

Verification

Since putting LEDs on the board remains uncertain, I anticipate that verification for my subsystem will require significant manual testing. I’ll need to recruit users or volunteers to simulate gameplay scenarios. The process would involve instructing participants to look at specific squares, then using the previously mentioned web app to confirm their gaze is correctly detected. With LEDs implemented, this verification process could be more automated.

Future Deliverables

Switch to the ETH-XGaze model

Switch over to Jetson

Team Status Report for 3/29/25

General update
  1. After completing the gantry assembly last week, we finished the pulley system once the belts arrived this week. As some of our individual posts show, Tarek and I worked on testing the completed system to look at movement and gain an understanding of the motors with the Arduino code. The video can be found here: Gantry Testing Video. This was a super encouraging piece of progress and will be a great starting point for our demo next week.
Potential risks and risk management
  1. No new risks this week. We are still exploring the gaze estimation, but the communication and data received from the camera are encouraging so far. There have also been more updates on the depth estimation side, which should give us a better idea of the risk involved.
Overall design changes
  1. No new design changes this week.
Schedule
  1. Our schedule hasn’t really changed since last week. Our demo will be ready by Monday and we are excited to present this to the students and instructors in the class!

Trey Wagner’s Status Report for 3/29/25

Personal Accomplishments
  1. Setting up and Testing Gantry Pulley System (12 hr): Once the second timing belt arrived, I placed both belts onto the gantry to complete the assembly. As the image shows, the entire gantry is now assembled. I went through and continued to touch up the angles and tightness of the rails so that the middle bar rolled easily. This took some time to ensure that everything was ready for testing. Then, I worked with Tarek to help him set up his configuration to test his Arduino code with my gantry system. We saw great results and were able to control the gantry in fine detail. The entire system moved smoothly, and we were able to move it vertically, horizontally, and diagonally. An example video is found here: Gantry Movement Video
  2. Created braces for our gantry (2 hr): I also spent some time designing braces to 3D print that will maintain right angles in our design, ensuring that our rails do not shift and pinch the movement.
  3. Mandatory Lab Sessions (4 hr): During our class sessions this week, we had an opportunity to continue to work together as a team. This helped us maintain our schedule and understand where our entire project stood. In particular, a meeting with Professor Kim emphasized the importance of going above and beyond for our demo. We gained motivation not just to do the bare minimum, but to aim for more functionality in the demo.
Progress

This week, I once again felt that I made big progress by proving that the gantry design works and can facilitate movement. I feel that I am back on track and will now shift my focus to proving the feasibility of chess piece movement on a full board.

Next Week Tasks & Goals
  1. Continue working on the design for our chessboard and pieces
  2. Test the movement of chess pieces with a few pieces on the board
  3. Test movement with a full board

Tarek’s Status Report for 3/29

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

This week I made a great deal of progress. On Monday I wrote a basic Arduino sketch to test the limit switches we received and ensure they work as expected. This was successful. We’ll be using these for calibration.
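The sketch boils down to reading the switch pin and reporting over serial, roughly like this (the pin number and pull-up wiring are placeholders for our actual setup):

// Limit switch test; pin number is a placeholder.
// With INPUT_PULLUP, the pin reads LOW when the switch is pressed.
const int LIMIT_PIN = 2;

void setup() {
  Serial.begin(9600);
  pinMode(LIMIT_PIN, INPUT_PULLUP);
}

void loop() {
  if (digitalRead(LIMIT_PIN) == LOW) {
    Serial.println("Limit switch pressed");
  }
  delay(50); // crude debounce
}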

I also spent some time designing the laser cut file to engrave our table. We’ll be purchasing a large piece of 1/8 in plywood, sawing it down to the right size, and engraving it using the IDeATe laser cutters.

I have also begun structuring all of my code into libraries, and I now have a StepperMotor library. I tested it by writing an Arduino sketch that steps both motors synchronously forward or backward by a number of steps entered by the user over serial. This test was successful. During this test, we discovered that we will need one power supply for each motor driver.
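In spirit, that test sketch looks like the following, shown here with raw step/direction pins, placeholder pin numbers, and placeholder step timing rather than the StepperMotor class itself:

// Simplified version of the synchronous stepping test (placeholder pins/timing).
const int STEP_PIN_A = 2, DIR_PIN_A = 3;
const int STEP_PIN_B = 4, DIR_PIN_B = 5;

void setup() {
  Serial.begin(9600);
  pinMode(STEP_PIN_A, OUTPUT); pinMode(DIR_PIN_A, OUTPUT);
  pinMode(STEP_PIN_B, OUTPUT); pinMode(DIR_PIN_B, OUTPUT);
  Serial.println("Enter a signed number of steps:");
}

void loop() {
  if (Serial.available() > 0) {
    long steps = Serial.parseInt(); // negative means step backward
    if (steps == 0) return;
    bool forward = steps > 0;
    long count = forward ? steps : -steps;
    digitalWrite(DIR_PIN_A, forward ? HIGH : LOW);
    digitalWrite(DIR_PIN_B, forward ? HIGH : LOW);
    for (long i = 0; i < count; i++) { // pulse both motors in lockstep
      digitalWrite(STEP_PIN_A, HIGH);
      digitalWrite(STEP_PIN_B, HIGH);
      delayMicroseconds(500);          // placeholder step timing
      digitalWrite(STEP_PIN_A, LOW);
      digitalWrite(STEP_PIN_B, LOW);
      delayMicroseconds(500);
    }
  }
}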

Finally, once the second belt arrived, Trey and I tested movement of the gantry using the StepperMotor library. I wrote an Arduino sketch that let the user move the gantry by a positive or negative number of steps in the x and y directions sequentially. This test was also successful.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am right on schedule.

What deliverables do you hope to complete in the next week?

I need to finalize the gantry library with better calibration for the number of steps per inch, so that we can have more accurate and precise gantry movement. I’ll also be helping Trey with some physical assembly.

Liam’s Status Report 3/29/25

Personal Accomplishments

This week, I had a midterm and a presentation, so I couldn’t spend as much time as I wanted on the capstone. There seemed to be a weird issue with the images: since this is a point-cloud image, I think I have to use the depth numpy array to properly create a normal-looking image.

Progress

Booth and other classes have put me slightly behind. I will put in a lot of work later tonight and tomorrow to get a proper working MVP for the interim demo.

Future Deliverables

MVP gaze demo

UART for Arduino

Team Status Report for 3/22/25

General update
  1. This week, we finally got the gantry system assembled! This was an exciting step, and a picture will be attached below. The remaining steps are attaching the timing belts to fully set up our pulley system and wiring up the electrical systems that will control the gantry. Unfortunately, we have to wait for another timing belt to be delivered due to some size underestimates on our part. It should arrive on Monday, allowing us to finish the entire system and begin testing early in the week. Trey and Tarek will work together to polish the final pieces of the design and then carry out testing of the basic movement mechanics of the system throughout the next week.
  2. Because the gantry was still being assembled, Tarek’s ability to test was limited. He therefore pivoted to finalizing some of the details of our LED system, chess logic, and other peripherals. In particular, he selected and purchased shift registers (74HC595s) for the LED circuitry. The Arduino code for the gantry movement is already written, and it will be tested in more detail this week. There is also a new GitHub repository holding all of this code.
Potential risks and risk management
  1. No new risks this week. Still exploring the gaze estimation, but the communication and data received from the camera are encouraging so far. In the absolute worst-case scenario where we are left with little time to pivot, we would shift to an automated chess movement system where the user inputs their move and the piece is automatically moved.
Overall design changes
  1. The LED circuitry design for our project changed to use the 74HC595 shift register instead of the MAX7219 LED driver. Other than that, no major design changes occurred this week.
Schedule
  1. We still expect to have a demo of the gantry system movement by our meeting with Professor Kim on Wednesday. The assembly of the gantry was an encouraging achievement that should enable various parts of our testing. As such, we did not have to make any major changes to our schedule. We plan to put in extra hours (when necessary) this week to stay on track for the upcoming demo deadline.

Trey Wagner’s Status Report for 3/22/25

Personal Accomplishments
  1. Assembling Gantry System (15+ hr): The entirety of my time this week was set aside to assemble the gantry system. As the image shows, the bones of the gantry system are all assembled. This involved a lot of measurements, layout, precise angles, and assembly with screws, nuts, and sockets. The middle bar can move freely left and right due to the wheels on the 3D-printed assembly holding it up. The trolley system in the middle was also 3D printed and carries the electromagnet for our design. It can move freely up and down, allowing motion in both the X- and Y-directions. Lots of small adjustments had to be made throughout the assembly to ensure the bars were not pinched or angled in a way that limited movement. Due to some issues with the timing belt, we have to wait for another to be delivered before finishing the entire pulley system. Once that arrives (hopefully Monday), we can finish the entire system and begin testing with Tarek’s Arduino code. This was an exciting piece of progress! It was very fulfilling to see the design begin to come to life, and I am excited to watch it perform during our initial testing.

  2. Mandatory Lab Sessions (4 hr): During our class sessions this week, we had an opportunity to explore the ethical implications of our capstone project. This was a very helpful exercise to map out our stakeholders, determine risks, and manage the tensions that could form over time. I believe this was extremely valuable insight, especially during the red teaming exercise which identified some of the key user values that we may be ignoring.

Progress

This week, I felt that I made significant progress and met a major goal for our design. However, we just missed the goal of having a demo of the basic movements for the gantry system. As such, I still feel slightly behind. I plan to work more on Sunday and Monday to polish the design and test the movement with Tarek. This will put us back on track and allow us to present a basic demo to Professor Kim in our Wednesday meeting.

Next Week Tasks & Goals
  1. Test basic movements (from point A to point B) for consistent accuracy
  2. Determine circuit/logic for roller switches that calibrate the gantry system
  3. Continue working on the design for our chessboard and pieces

Tarek’s Status Report for 3/22

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

I started off the week by working with the IDeATe staff to get our parts printed, and to be able to print parts quickly in the future. I 3D printed the pieces for the gantry and the camera stand and delivered them to the group. Given that I still couldn’t test my Arduino motor code until the gantry was fully set up, I started working on later tasks, such as the chess logic, overall main routine, LEDs, and additional peripherals.

I chose and ordered some shift registers, 74HC595s in particular, to control the chessboard LEDs. These are a better choice than the previously chosen MAX7219 LED driver because they can more accurately and responsively light a single LED at a time: the MAX7219 uses row and column scanning, while the shift registers can simply set a row and column high. For a system where users are constantly looking around and high responsiveness is necessary, this is worth the extra wiring. I also designed the circuit for this.
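As a rough sketch of how the Arduino side could drive them (the pins are placeholders, this assumes one register on the eight row lines and one on the eight column lines, and the polarity of the column byte would depend on the LED orientation in the circuit I drew):

// Light a single LED through two daisy-chained 74HC595s; pins are placeholders.
const int DATA_PIN = 8, CLOCK_PIN = 9, LATCH_PIN = 10;

void lightSquare(int row, int col) {
  byte rowBits = 1 << row; // one-hot row select
  byte colBits = 1 << col; // one-hot column select
  digitalWrite(LATCH_PIN, LOW);
  // With the registers daisy-chained, shift the far register's byte out first.
  shiftOut(DATA_PIN, CLOCK_PIN, MSBFIRST, colBits);
  shiftOut(DATA_PIN, CLOCK_PIN, MSBFIRST, rowBits);
  digitalWrite(LATCH_PIN, HIGH); // latch both bytes at once
}

void setup() {
  pinMode(DATA_PIN, OUTPUT);
  pinMode(CLOCK_PIN, OUTPUT);
  pinMode(LATCH_PIN, OUTPUT);
  lightSquare(3, 4); // e.g. light the LED at row 3, column 4
}

void loop() {}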

I also created a GitHub repository (I wasn’t able to do this while in Europe because 2FA wouldn’t let me log in) and uploaded commits of my previous Arduino code versions. I added a main routine that I will be using to guide the design of the other embedded controller libraries (motor control, magnet control, LED control, keypad, and chess logic).

Finally, I assisted Trey with assembling the gantry, and continued working on the chess logic file.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am still bottlenecked by not being able to test the Arduino motor code, so I am behind on the gantry motion control. However, I spent time getting ahead on parts I was not meant to start until later, so I am not too far behind overall.

What deliverables do you hope to complete in the next week?

Now that the gantry is finally assembled, I will be testing the Arduino motor code over the next two days, and then refining the motion of the gantry throughout the board (e.g. moving pieces between other pieces and recalibrating the gantry after every move).

Liam’s Status Report 3/22/25

Personal Accomplishments

I used the ZeroMQ Python library to have the gaze estimation script and the camera script communicate using a queue. Later tonight I’m going to find a screw for the 3D-printed stand that we now have. While researching available gaze models, I came across work on GitHub showing how nine calibration samples could improve accuracy by 5%. This might be something we explore in the future.

Progress

I think I’ve brought myself back on schedule by having the two parts of our software communicate with each other.

Future Deliverables

Cleaning up code and gaze model

Maybe add some training data?