Tarek’s Status Report for 4/19

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

This week, I finished writing the code for the final components of my subsystem: the LED array and the keypad. With all of the Embedded Controller components now in place, I have written an Arduino sketch that plays out a whole game with everything except gaze-tracking input, using serial input instead. Code here.
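
As a rough illustration of how that serial-driven test works, here is a minimal sketch of the game loop. The move format (“A1 H8”), the helper names, and the stubs standing in for the GameState and ChessGantry libraries are assumptions for illustration, not the exact API in the repo.

#include <Arduino.h>

// Stand-ins for our libraries. The real sketch uses the actual GameState,
// ChessGantry, LED array, and keypad objects; these names are assumptions.
bool parse_move(const String &s, int &fr, int &fc, int &tr, int &tc) {
  if (s.length() < 5) return false;              // expect e.g. "A1 H8"
  fc = toupper(s[0]) - 'A';  fr = s[1] - '1';    // file -> column, rank -> row
  tc = toupper(s[3]) - 'A';  tr = s[4] - '1';
  return true;
}
bool move_is_legal(int, int, int, int) { return true; }   // GameState stub
void execute_move(int, int, int, int)  {}                 // ChessGantry stub

bool white_to_move = true;

void setup() {
  Serial.begin(9600);
  Serial.println("Enter moves like 'A1 H8'");
}

void loop() {
  if (!Serial.available()) return;
  String line = Serial.readStringUntil('\n');
  line.trim();
  int fr, fc, tr, tc;
  if (parse_move(line, fr, fc, tr, tc) && move_is_legal(fr, fc, tr, tc)) {
    execute_move(fr, fc, tr, tc);                // physically moves the piece
    white_to_move = !white_to_move;              // hand the turn to the other player
  } else if (line.length() > 0) {
    Serial.println("Rejected move");
  }
}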

I built a 4×4 LED array to test my code and it worked. See video. Expanding this circuit to 8×8 will require a lot of wiring, but the logic is no different, so it should work in theory.
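
For reference, the core of the LED-array code is a simple row/column scan. The pin assignments and wiring polarity below are placeholders; the point is that moving to 8×8 only changes the array sizes and pin lists (or swaps direct pins for shift registers or driver ICs), while the scanning logic stays the same.

#include <Arduino.h>

const int ROWS = 4, COLS = 4;
const int row_pins[ROWS] = {2, 3, 4, 5};   // placeholder pins: row HIGH selects the row
const int col_pins[COLS] = {6, 7, 8, 9};   // placeholder pins: column LOW lights the LED
bool frame[ROWS][COLS];                    // which LEDs should appear lit

void setup() {
  for (int r = 0; r < ROWS; r++) { pinMode(row_pins[r], OUTPUT); digitalWrite(row_pins[r], LOW); }
  for (int c = 0; c < COLS; c++) { pinMode(col_pins[c], OUTPUT); digitalWrite(col_pins[c], HIGH); }
  frame[1][2] = true;                      // example: light a single square
}

void loop() {
  // Scan one row at a time; persistence of vision makes the whole frame appear lit.
  for (int r = 0; r < ROWS; r++) {
    for (int c = 0; c < COLS; c++) digitalWrite(col_pins[c], frame[r][c] ? LOW : HIGH);
    digitalWrite(row_pins[r], HIGH);
    delay(2);
    digitalWrite(row_pins[r], LOW);
  }
}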

I also designed the final three chess pieces in Fusion to 3D print them at IDeATe. With this, we can now print an entire set of chess pieces. We did run into an issue where some of the pieces were too heavy for the electromagnet to move, so we are attempting to print them in “vase mode”, where the pieces are virtually hollow, and therefore a lot lighter.

Finally, I started validating certain components of my subsystem, such as the magnet, motor calibration, limit switches, keypad, and LED array (4×4 for now; I will validate the 8×8 array once it is built).

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am right on schedule.

What deliverables do you hope to complete in the next week?

Aside from expanding the LED circuit from 4×4 to 8×8, all I have left to do is thoroughly test and validate my subsystem and the project, and write a short UART parser to integrate with the gaze tracking.
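
The UART parser itself should be short. Below is a sketch of the idea, assuming the Jetson sends one newline-terminated move per line (e.g. “A1 H8”); the exact message framing is still to be agreed on, so treat the details as placeholders.

#include <Arduino.h>

char buf[16];
int  len = 0;

// Returns true and fills the indices when a full "A1 H8"-style line has arrived.
bool poll_move(int &from_row, int &from_col, int &to_row, int &to_col) {
  while (Serial.available()) {
    char ch = Serial.read();
    if (ch == '\n' || ch == '\r') {
      bool ok = false;
      if (len >= 5) {
        from_col = toupper(buf[0]) - 'A';
        from_row = buf[1] - '1';
        to_col   = toupper(buf[3]) - 'A';
        to_row   = buf[4] - '1';
        ok = (from_col >= 0 && from_col < 8 && from_row >= 0 && from_row < 8 &&
              to_col   >= 0 && to_col   < 8 && to_row   >= 0 && to_row   < 8);
      }
      len = 0;                                   // reset for the next line
      if (ok) return true;
    } else if (len < (int)sizeof(buf) - 1) {
      buf[len++] = ch;                           // accumulate without blocking
    }
  }
  return false;
}

void setup() { Serial.begin(9600); }

void loop() {
  int fr, fc, tr, tc;
  if (poll_move(fr, fc, tr, tc)) {
    // Hand the parsed move to the GameState/ChessGantry libraries here.
  }
}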

As you’ve designed, implemented and debugged your project, what new tools or new knowledge did you find it necessary to learn to be able to accomplish these tasks? What learning strategies did you use to acquire this new knowledge?

I hadn’t worked with Arduino in a long time, so I had to refamiliarize myself with the toolchain (working with files in addition to the main sketch is fairly different from plain C/C++). I had also never written code for certain components like electromagnets or stepper motors (except for a small part of a lab in 18-349). I had to look at some Arduino tutorial blog posts to pick up how to do this, although it wasn’t overly complex.

For the rest of the project, I drew on what I learned over the last four years in ECE at CMU: writing clean, modular, self-documenting code for a wide variety of embedded devices. Implementing certain parts of the project was challenging, and figuring out how to do everything from scratch would have been inefficient, so one learning strategy I relied on heavily was adapting something new to something I had done before. I may not have implemented certain parts of the project using the absolute best practices or methods, but I did implement them in a way that is clean, efficient enough, works, and is familiar to me. This enabled me to understand and write code faster, as well as have an easier time debugging it.

Liam’s Status Report 4/19/25

Personal Accomplishments

This week I improved our gaze estimation accuracy by identifying a 3cm off-center error in our calculations. This discrepancy stems from the stereo camera setup: since we’re using the rectified image from the left camera, we can’t assume the image originated from a point between both cameras. I also began developing a screen implementation to facilitate integration testing. Additionally, I installed Parsec on my home computer in Missouri to conduct ML training, as I couldn’t find a suitable pre-trained model online. This particular model delivers superior accuracy for screen-based gaze tracking compared to alternatives, making it a low-risk decision that allowed me to simultaneously progress on other aspects of the project.
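
To make the correction concrete, it is just a translation of the gaze origin from the left camera’s frame to the midpoint between the two cameras, with the 3cm being what I take to be half the stereo baseline. The snippet below is only an illustration; the sign convention depends on how the camera axes are defined.

#include <cstdio>

// Translate a point from the left-camera frame to the stereo-center frame.
struct Point3 { float x, y, z; };          // centimeters, camera coordinates

const float HALF_BASELINE_CM = 3.0f;       // assumed half-distance between the two lenses

Point3 left_to_center(Point3 p) {
  // The rectified image is referenced to the left camera, so the gaze origin
  // (and any triangulated point) must be shifted along x by the half-baseline.
  p.x -= HALF_BASELINE_CM;                 // sign depends on the chosen axis convention
  return p;
}

int main() {
  Point3 gaze_origin{10.0f, 0.0f, 50.0f};  // example point in the left-camera frame
  Point3 centered = left_to_center(gaze_origin);
  std::printf("x shifts from %.1f to %.1f cm\n", gaze_origin.x, centered.x);
}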

If you are interested in how calibration works, OpenCV has a good tutorial:

Here is a picture of me calibrating my webcam:

 

Progress:

I am still working on gaze estimation onto the chessboard while also developing the screen-based version, so that I do not impede my team’s progress.

 

Future Deliverables

Switch to ETHx-Gaze dataset (still)

Switch over to Jetson (still)

Get Screen Estimation working for now

Trey Wagner’s Status Report for 4/19/25

Personal Accomplishments

1. Gantry Piece Movement and Lid (12 hr): Since I was finally able to build the box last week, I spent a lot of time this week measuring the top and testing piece movement. There were some slight issues with the top, as the thin piece of plywood (5mm) tends to bow due to the lack of support in the middle. I believe I came up with a solution: we will use a small “table” design inside the box that supports the top and can be lifted out for maintenance. A picture of the top is below. The assembled top also allowed us to test the magnet and piece movement. I found that the magnet was able to move the pieces in both the x- and y-directions. A video is linked below. It was really cool to see this working with the full box assembled. The next step will be fine-tuning the movement and testing with a full gameboard.

Chess Movement Video

2. Small Box Adjustments (2hr): I also made some slight adjustments to the box to make the design easier to use. First, all the electronics are inside the box (which hides them from view and cuts down on distractions). I also cleaned up the wiring, replacing the old tangle of wires we had before. Another improvement was drilling holes in the side of the box to access the wires and power devices. Each hole was also labeled. This will make it even easier to test in the future without having to remove the whole top each time!

3. Mandatory Lab Sessions (4hr): During our class sessions this week, we once again had uninterrupted time to discuss the final details of our project and how we wanted to approach the upcoming deadlines. The weekly meeting with Professor Kim and Alex showed us that we needed to stay motivated and continue working to make sure we have a good demo for the final week.

4. Work on Final Presentation (3hr): The last thing I worked on this week was the final presentation. I put effort into building the slides and making sure they met the requirements. I will also be presenting these slides, so I took some time to practice certain details as I wrote them.

Progress

Once again, I feel that the gantry is basically done. I have to iron out a few details with some spacing and measurements, but the core functionality is all there. I plan to continue to solve these small issues throughout the week to finish out my part of the project!

Next Week Tasks & Goals
  1. Test movement on a full board with all chess pieces included.
  2. Show knight piece moving between two other pieces.
  3. Get chessboard design engraved in top of box.
  4. Give a good presentation!
New Tools & Knowledge

Looking back on my portion of the capstone, I realize that much of the work was mechanical in nature. The entire gantry is a very physical design, requiring precise measuring, woodworking, assembly with various tools, and an eye for structural issues. I worked with electronics every now and then, but my part involved lots of tinkering to design a physical system. As such, I felt that I needed to gain the following skills:

  1. Woodworking: Although building a box doesn’t sound overly difficult, there were certain issues I had to deal with as I planned and assembled it. First, the sheer size of our box (roughly 43″ x 48″ x 3″) complicated the entire process. I relearned how to use a table saw and belt sanders. I would continuously visit wood shops around campus to get help and hear ideas from the staff there. The thinness of the top board also posed some issues, as the wood would bow easily, so I had to learn how to fix and support a bowing piece of wood to keep things straight. This was actually a very fun part of the project, and I hope to do more woodworking in the future.
  2. Basic tinkering/making: Because our project has so many intricate details, I found myself having to be very creative when looking for solutions to our issues. I would often look at YouTube videos and maker forums by people who made projects with similar features. These would serve as inspiration for how to improve certain details that I did not know how to approach. I would also go to TechSpark and ask the staff if they had any ideas, as I learned they were very knowledgeable about these mechanical systems.
  3. Various tools: Most of the assembly involved tools like power drills, sanders, saws, and wrenches. While I didn’t have to learn how to use many of these things, I did have to change my understanding of how to use them effectively. I found myself becoming very hands-on for most of this project and using tools in ways I never had before.
  4. CAD tools: I had to design some 3D-printed parts for our project, which allowed me to become more familiar with some of the programs available for CAD. I had to read many online guides and forums to learn how to use these tools well.
  5. Embedded Design and Component Selection: Although Tarek did the Arduino code, I had to select the components that we used for this design (including the step motors, drivers, roller switches, etc.). I also had to gain an understanding of how my physical system would interface with some sort of logic controller (i.e. our Arduino). This led to some research into step motor control and pulley system design to optimize our control and motion. Thankfully, there are many guides and home projects online which walk through these processes.

Team Status Report for 4/12/25

General update
  1. Gantry progress has been good, although it was slightly delayed by issues with wood shops on campus. We have the box assembled and will be doing more testing over the coming days to verify proper movement of the chess pieces. We believe this part of the project is almost done! Just putting the finishing touches on the design now.
  2. Liam has been making good progress with the gaze detection, and he now has a better way to test the results as a person sits in front of the camera. He will continue to make this better over the coming days as he carries out more testing. This will also allow him to calibrate the model for better results overall.
Potential risks and risk management
  1. No new risks this week. We created a speech-to-text model that would act as a backup to the gaze estimation, although we are still confident in our ability to finish the gaze detection as expected. We believe the speech element would still meet our use case requirements as it does not require any physical motion by the user. This model has proven to be quite accurate as we speak into a computer microphone.
Overall design changes
  1. The one new design change is a small one. We will be using an NMOS in a small circuit to handle powering on the electromagnet. This is because the Arduino does not have a 5V output pin that can switch from high to low. We will use one of the 3.3V output pins to power the gate of an NMOS, which will act as a type of switch (although imperfect) to power on the electromagnet with the necessary 5V.
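
On the firmware side this change is minimal; the sketch below is only an illustration of driving the NMOS gate from a GPIO pin. The pin number is a placeholder, and the electromagnet, 5V supply, and a flyback diode sit on the drain side of the transistor.

#include <Arduino.h>

const int MAGNET_GATE_PIN = 7;          // placeholder: 3.3V GPIO wired to the NMOS gate

void setup() {
  pinMode(MAGNET_GATE_PIN, OUTPUT);
  digitalWrite(MAGNET_GATE_PIN, LOW);   // electromagnet off at boot
}

void loop() {
  digitalWrite(MAGNET_GATE_PIN, HIGH);  // gate high -> NMOS conducts -> electromagnet energized
  delay(2000);
  digitalWrite(MAGNET_GATE_PIN, LOW);   // gate low -> electromagnet released
  delay(2000);
}
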
Schedule
  1. Our schedule does not have any major changes. We are ready to buckle down and do whatever work is necessary to finish this project and have a great presentation and demo at the end of the semester!

Validation

There are a few ways that we, as a team, plan to validate our design. This largely comes down to whether our project is still meeting the user needs.

  1. Revisit Initial Use Case Requirements: This will be talked about throughout the following points, but we want to go back and ensure we are hitting the use case we originally thought about. The idea was most clear at the beginning, and we want to be sure we did not stray from that concept.
  2. Accessibility of Gaze Model: We plan to bring in some of our friends and classmates to get a wide audience of people to test our camera model on. This will allow us to test all sorts of eye sizes and shapes to build a design that is accessible to as many people as possible. After all, our design sets out to make chess accessible to as many people as possible.
  3. Non-Physical Gameplay: As we carry out our testing, we want to make sure that every aspect of the gameplay is non-physical. This is to make sure that our system can be played by people who are not able to physically move certain aspects themselves, which is our entire use case. This means that the camera should not have to be adjusted, the pieces should be placed properly, and the electronics should work without physical interaction.
  4. Hidden Devices: One of the important needs that we set forward was an unobtrusive design. As we continue to assemble the system and test, we will be sure that all computational circuitry (and gantry components) are hidden away so that they do not detract from the game. This will continue to guide our decisions during assembly and placement.
  6. Speed of System: One major user need that we recognize is the speed of our system. We do not want moves to take so long that users become uninterested and discouraged. Therefore, we plan to iterate during testing to remove any unneeded latency and make the game flow as continuously as possible.
  7. Accuracy: Although this is mostly tested during verification, accuracy is the most crucial part of our user needs. If our system is not accurate, it will not be used. Therefore, as we go through testing, we will be sure that accuracy is at the forefront of our validation and make changes when necessary to prioritize this metric.
  7. Remapping our Stakeholders: As we look toward a completed design, we think it may be interesting to look back at the ethics assignment for this class. Are we considering the stakeholders properly? Are we keeping bias and our own pride out of our design? We want to be sure that we are still aligned with the people most affected by our system.

Trey Wagner’s Status Report for 4/12/25

1. Buying, measuring, cutting, and assembling wood (10hr): This week, I spent a lot of time making the box that will go around the gantry system. This involved going to Home Depot to buy the wood, measuring out the exact dimensions necessary for our box, looking for options to cut the wood, and then assembling it after the cut. Unfortunately, TechSpark’s wood shop was closed this week, which sent me to 4 different places until I finally found someone to help us cut. The sides were put together and the bottom was attached. There is empty space on the right and left for our circuitry to be placed. The top will be placed on later, as we still plan to adjust elements of the gantry when needed.

2. Basic Magnet Testing (4hr): Due to the external delays in cutting the wood, we had to wait to fully test some of the electromagnet functionality. However, in the meantime, we did some basic tests by propping up a piece of wood above the electromagnet at the same height as the box lid. We placed a magnet on top of the wood, turned the electromagnet on, and then moved the gantry using the step motors. We found that the magnet moved smoothly in each direction. I also used some chess pieces that Tarek 3D printed, placed them on top of the magnet, and tested again to determine that we could move these chess pieces around the wood with the electromagnet. That was exciting to see!

3. Gantry Odds and Ends (2hr): I spent some time adjusting parts of the gantry that were slightly off, including making sure the corners were square. I adjusted the 3D-printed corner brace to ensure the corners would stay at the correct angle. I also spent some time cleaning up the wiring and creating a circuit for the electromagnet. The new plan is to use an NMOS to help power the electromagnet, since the Arduino does not have a 5V configurable output pin. All of these changes should help to complete the overall design of the gantry.

4. Mandatory Lab Sessions (4hr): During our class sessions this week, I had an opportunity to work closely with Tarek to talk about some of the details about the Arduino-gantry interface. We also had some great conversations about the gaze detection, along with some risk mitigation plans in these final weeks. Most importantly, we got to meet with Professor Kim and Alex, who pushed us to continue working hard toward the final demo and prioritize the gaze detection.

Progress

I feel that the gantry is almost entirely done, minus a few small details. My progress was definitely stunted by the difficulty finding a wood shop to help us cut the wood. However, I plan to go in tomorrow to place the top on the box and test out movement on a larger space using the 3D-printed chess pieces. This will give a great indication of the functionality of our piece movement. I feel that I am still on schedule, although this time of the semester feels more rushed.

Next Week Tasks & Goals
  1. Test movement on a full board with all chess pieces included.
  2. Show knight piece moving between two other pieces.
  3. Get chessboard design engraved in top of box.
  4. Finish circuitry for small components and make a clean wiring design for all electronics involved.
Gantry Verification

Here is an overview of the plan and completed tasks for the gantry verification:

  1. Dimension iteration: One of the first elements of “testing” was an iteration of the height and thickness of the board that we would use as our chessboard. We ran various tests to see what material we could use and how far above the electromagnet it could go. We settled on a 5mm thick plywood piece that sits approximately 1/8″ above the electromagnet.
  2. Basic testing: This week, we carried out some basic testing to ensure that the electromagnet could move a chess piece through a wooden slab as the stepper motors controlled the movement. This is crucial given that all chess movements will be controlled by the Arduino, step motors, pulley system, and electromagnet. We saw consistent, smooth movements, regardless of the direction.
  3. Chess Movement Testing: With each type of piece, we will test all possible types of moves. Vertical, horizontal, and L-shaped moves will be tested to ensure that we can accurately move a piece for short or long distances. We are looking for consistent accuracy, with 70% of the chess piece base placed in the intended square.
  4. Knight Movement Testing: One of the most difficult movements to mimic is early-game knight behavior, as it often jumps over a row of pawns. Our solution will instead slide the knight between the pieces in front of it, which could lead to magnetic interference with the adjacent pieces. To ensure that this interference is minimized, we will test various movements between two other pieces. This worst-case scenario will confirm that these moves are possible. We want to ensure that no pieces more than 0.75″ away are picked up.
  5. Full-Game Scenario Testing: After all of the “unit testing” is finished, we will test continuous moves that would be seen in a real chess game. We will set up all pieces in a normal start state, then test basic moves based on real chess games that we find online. Each move should grab the correct piece and move it to the intended position without disrupting nearby pieces.

Tarek’s Status Report for 4/12

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

I got a great deal done this week. I expanded the Embedded Controller subsystem with a few more libraries, all available on our organization’s GitHub repo.

I added the Magnet, Gantry, LimitSwitch, and ChessGantry libraries, and I am almost done with the GameState library. The Magnet library controls the electromagnet, and the Gantry library builds on the StepperMotor library by allowing positioning of the gantry in two directions. The LimitSwitch library detects when a limit switch is being pressed (this is for calibration). The ChessGantry library builds on and abstracts the Gantry class by providing an interface in terms of chess squares rather than positions in inches; it can also execute a move using the following code:

// Move to the origin square center and pick up the piece
go_to_square(origin_row, origin_col);
go_to_square_center();
_magnet.on();

// Return to the square's bottom-right corner
go_to_square(origin_row, origin_col);

// Move to the destination square center and drop off the piece
go_to_square(dest_row, dest_col);
go_to_square_center();
_magnet.off();
delay(500); // Allow time for the magnet to turn off

// Return to the gantry origin
go_to_origin();

I added some Arduino sketches that take user input over serial to exercise the functions in these libraries.

I also wrote a Python script that takes in the user’s voice and detects chess moves, e.g. “A1 to H8”. This is the worst-case backup in case we are unable to get gaze detection working for the final demo.

I also 3D printed some braces for the gantry to ensure it is perfectly squared. In addition, I designed and 3D printed some pawns, a queen, and a rook with a hole for the magnet at the bottom (see attached picture). These were used to test out the electromagnet.

Finally, today I helped Trey assemble the enclosure (everything except for the lid).

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am right on schedule.

What deliverables do you hope to complete in the next week?

The main tasks I have left are to write the short library that parses serial input from the Jetson (or from the speech-to-text Python script as a backup), and to wire up the LEDs and write the code for them. After that, it will all be testing.

How will you analyze the anticipated measured results to verify your contribution to the project meets the engineering design requirements or the use case requirements?

So far, I have tested and calibrated stepper motor and gantry movement to verify that the gantry actually moves the number of inches it is commanded to. When calibrated, any given gantry move is within ±1% of the input position in inches. I have also written Arduino sketches to verify some of the smaller parts of my subsystem, such as the limit switches and magnet, and these work as intended. As soon as I am done writing the GameState library, I will write an Arduino sketch and play 10 chess games on it, which I will use to validate that the game state after each move function call is what it should be. Once we finalize the board this week, I will also be able to validate the ChessGantry library’s movement by using it to move a small magnet across the top of the board to a number of specified positions and measuring the error. Finally, I will integrate the Arduino with the Jetson/speech-to-text script and validate that UART connection by measuring the difference between message-sent and message-received timestamps.
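
One way to measure that latency without synchronizing clocks (just a sketch of the idea, not something implemented yet) is to have the Arduino echo every byte back and let the Jetson timestamp send and receive; half the round trip approximates the one-way latency. The Arduino side would be as simple as:

#include <Arduino.h>

void setup() { Serial.begin(115200); }   // baud rate is a placeholder

void loop() {
  // Echo every received byte immediately; the sender timestamps send/receive
  // and halves the round-trip time to estimate one-way latency.
  while (Serial.available()) Serial.write(Serial.read());
}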

Liam’s Status Report 4/12/25

Personal Accomplishments

I was able to accurately determine where a user was located in space by calculating the angle from the camera and knowing the depth at a certain point. I also quickly made a small web app to visually debug the gaze estimation. While testing this, I realized that the camera’s position relative to the user plays a big role in the estimation. At some depths it becomes hard to calculate gaze direction when the user looks to the right, or the error becomes very large compared to the true gaze direction. I spent the last couple of days reading research papers and found a better dataset called ETHx-Gaze, which has around 5 degrees of angular error compared to the roughly 10 degrees of Gaze360. The positional error could decrease from around 5.51 to 2.47 inches. I already have the new dataset working with my webcam, so I would just have to integrate it with the stereo camera software.
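
As a rough sanity check on those numbers, positional error scales approximately as the viewing distance times the tangent of the angular error. The distance below is an assumption (the real figures come from our measured geometry), but it shows why halving the angular error roughly halves the positional error:

#include <cmath>
#include <cstdio>

int main() {
  const double PI = 3.14159265358979;
  const double distance_in = 30.0;                  // assumed eye-to-board distance, inches
  const double angular_error_deg[] = {10.0, 5.0};   // roughly Gaze360 vs ETHx-Gaze
  for (double deg : angular_error_deg) {
    double err_in = distance_in * std::tan(deg * PI / 180.0);
    std::printf("%4.1f deg of angular error -> ~%.1f in of positional error\n", deg, err_in);
  }
  return 0;
}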

Progress

If I can’t get the gaze estimation to work more accurately in the next day or two, we will have to consider switching to something like a screen to simplify the problem.

Verification:

Since putting LEDs on the board remains uncertain, I anticipate that verification for my subsystem will require significant manual testing. I’ll need to recruit users or volunteers to simulate gameplay scenarios. The process would involve instructing participants to look at specific squares, then using the previously mentioned web app to confirm their gaze is correctly detected. With LEDs implemented, this verification process could be more automated.

Future Deliverables

Switch to ETHx-Gaze dataset

Switch over to Jetson

Team Status Report for 3/29/25

General update
  1. After assembling the gantry frame last week, we completed the pulley system once the belts arrived this week. As some of our individual posts show, Tarek and I worked on testing this completed system to look at movement and gain an understanding of the motors with the Arduino code. The video can be found here: Gantry Testing Video. This was a super encouraging piece of progress and will be a great starting point for our demo next week.
Potential risks and risk management
  1. No new risks this week. Still exploring the gaze estimation, but the communication and data received from the camera are encouraging so far. There have also been more updates on the depth estimation side which should give us better ideas of the risk involved.
Overall design changes
  1. No new design changes this week.
Schedule
  1. Our schedule hasn’t really changed since last week. Our demo will be ready by Monday and we are excited to present this to the students and instructors in the class!

Trey Wagner’s Status Report for 3/29/25

PERSONAL Accomplishments
  1. Setting up and Testing Gantry Pulley System (12 hr): Once the second timing belt arrived, I placed both belts onto the gantry to complete the assembly. As the image shows, the entire gantry is now assembled. I went through and continued to touch up the angles and tightness of the rails so that the middle bar rolled easily. This took some time to ensure that everything was ready for testing. Then, I worked with Tarek to help him set up his configuration to test his Arduino code with my gantry system. We saw great results and were able to control the gantry with fine precision. The entire system moved smoothly, and we were able to move it vertically, horizontally, and diagonally. An example video is found here: Gantry Movement Video
  2. Created braces for our gantry (2hr): I also spent some time designing some braces to 3D print that would maintain right angles in our design. This would ensure that our rails do not shift and pinch the movement. 
  3. Mandatory Lab Sessions (4hr): During our class sessions this week, we had an opportunity to continue to work together as a team. This helped to maintain our schedule and understand where our entire project stood. In particular, a meeting with Professor Kim helped to emphasize the importance of going above and beyond for our demo. We gained some motivation to not just do the bare minimum, but to aim for more functionality in our demo.
Progress

This week, I once again felt that I made some big progress by proving that the gantry design works and can facilitate movement. I feel that I am back on track and will now change my focus to prove the feasibility of chess piece movement on a full board.

Next Week Tasks & Goals
  1. Continue working on the design for our chessboard and pieces
  2. Test chess piece movement with just a few pieces on the board
  3. Test movement with a full board

Tarek’s Status Report for 3/29

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

This week I made a great deal of progress. On Monday I wrote a basic Arduino sketch to test the limit switches we received and ensure they work as expected. This was successful. We’ll be using these for calibration.
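
The test sketch amounts to reading each switch as a digital input. A minimal version is below; the pin number is a placeholder, and it assumes the switch shorts the pin to ground when pressed, with the internal pull-up enabled.

#include <Arduino.h>

const int LIMIT_PIN = 10;   // placeholder pin for one limit switch

void setup() {
  Serial.begin(9600);
  pinMode(LIMIT_PIN, INPUT_PULLUP);   // switch wired between the pin and GND
}

void loop() {
  bool pressed = (digitalRead(LIMIT_PIN) == LOW);   // active-low with the pull-up
  Serial.println(pressed ? "pressed" : "open");
  delay(100);
}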

I also spent some time designing the laser cut file to engrave our table. We’ll be purchasing a large piece of 1/8 in plywood, sawing it down to the right size, and engraving it using the IDeATe laser cutters.

I have begun structuring all my code into libraries. I now have a stepper motor library. I tested it by writing an Arduino sketch that steps both motors synchronously forward or backward by a number of steps entered by the user over serial input. This test was successful. During this test, we discovered that we will need one power supply for each motor driver.
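
The synchronous stepping boils down to interleaving the step pulses for the two drivers. The sketch below is a simplified stand-in for the StepperMotor library; the pin numbers and pulse timing are placeholders.

#include <Arduino.h>

const int STEP_X = 2, DIR_X = 3;   // placeholder pins for the X driver
const int STEP_Y = 4, DIR_Y = 5;   // placeholder pins for the Y driver

void step_both(long steps) {
  digitalWrite(DIR_X, steps >= 0 ? HIGH : LOW);
  digitalWrite(DIR_Y, steps >= 0 ? HIGH : LOW);
  for (long i = 0; i < labs(steps); i++) {
    digitalWrite(STEP_X, HIGH); digitalWrite(STEP_Y, HIGH);   // pulse both drivers together
    delayMicroseconds(500);                                   // pulse width / speed (placeholder)
    digitalWrite(STEP_X, LOW);  digitalWrite(STEP_Y, LOW);
    delayMicroseconds(500);
  }
}

void setup() {
  pinMode(STEP_X, OUTPUT); pinMode(DIR_X, OUTPUT);
  pinMode(STEP_Y, OUTPUT); pinMode(DIR_Y, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  if (Serial.available()) {
    long steps = Serial.parseInt();   // signed step count typed over serial
    if (steps != 0) step_both(steps);
  }
}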

Finally, once the second belt arrived, Trey and I tested movement of the gantry using the stepper motor library. I wrote an Arduino sketch that let the user move the gantry by a positive or negative number of steps in the x and y directions sequentially. This test was also successful.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am right on schedule.

What deliverables do you hope to complete in the next week?

I need to finalize the gantry library with better calibration for the number of steps per inch, so that we can have more accurate and precise gantry movement. I’ll also be helping Trey with some physical assembly.
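
One simple way to get that calibration constant (my current plan; the numbers below are placeholders) is empirical: command a known number of steps, measure the actual travel, and derive steps per inch from the ratio. A one-off helper sketch could look like:

#include <Arduino.h>

const long  COMMANDED_STEPS    = 2000;    // steps sent to the driver during the test
const float MEASURED_TRAVEL_IN = 2.05f;   // travel measured with a ruler or calipers (placeholder)

void setup() {
  Serial.begin(9600);
  float steps_per_inch = COMMANDED_STEPS / MEASURED_TRAVEL_IN;
  Serial.print("steps per inch: ");
  Serial.println(steps_per_inch);
  // The Gantry library would then convert: steps = round(target_inches * steps_per_inch)
}

void loop() {}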