Yoorae’s Status Report for 12/4

I did not get much work done this week because I was not in good condition for several days. I visited the lab several times to take photos of edge cases (including poor-quality photos, castling, and the checkmate state) and test against them. I added a helper to fully handle a castling move in our main file, and I updated the test cases on castling for the chess game logic. There was some confusion about coordinates since we changed our camera orientation, so I checked the coordinate conversion between functions for consistency. I am planning to invest a lot more time in capstone tomorrow and next week. I plan to go to the lab tomorrow to take more samples to test, and to work with everyone on finalizing integration.
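
A minimal sketch of the kind of castling helper described above is shown below, assuming the board is stored as an 8x8 list of lists indexed as (row, col) with white on the bottom rank; the function name and piece encoding are placeholders rather than our exact code.

```python
# Hypothetical sketch: apply a castling move on an 8x8 board stored as a
# list of lists, where board[row][col] holds a piece string or None.
# The (row, col) convention assumes row 0 is black's back rank.

def apply_castling(board, color, side):
    """Move both the king and the rook for a castling move.

    color: 'white' or 'black'; side: 'king' or 'queen'.
    """
    row = 7 if color == 'white' else 0
    king_from = 4
    if side == 'king':
        king_to, rook_from, rook_to = 6, 7, 5
    else:
        king_to, rook_from, rook_to = 2, 0, 3

    # Move the king two squares toward the rook.
    board[row][king_to] = board[row][king_from]
    board[row][king_from] = None
    # Move the rook to the square the king crossed over.
    board[row][rook_to] = board[row][rook_from]
    board[row][rook_from] = None
    return board
```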

Demi’s Status Report for 12/4

This week I mainly worked on making the full game work. I added a function that lets the user press the push button after physically making the AI's move, so that our CV can compare the two frames surrounding the human player's move. I added code for the human player to press the button at the start of the game to capture the initial board state. I also made some changes to how the LEDs light up and tested them. After the human and computer players' moves, we check for the game-end state. When it is checkmate, all the LEDs light up in the color of the winner of the game, either red or blue. I also unified the paths for the captured and cropped images. There was also an issue with our new webcam setup and image orientation, which I fixed by rotating the images to the correct orientation before handing them over to the CV pipeline.
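
The orientation fix is essentially a single rotation applied when the frame is loaded. Below is a small sketch of that step; the 180-degree rotation is an assumption, since the actual angle depends on how the webcam is mounted.

```python
import cv2

# Hypothetical sketch of the orientation fix: rotate each captured frame
# before it enters the CV pipeline. ROTATE_180 is an assumed angle.
def load_oriented(path):
    img = cv2.imread(path)
    if img is None:
        raise FileNotFoundError(path)
    return cv2.rotate(img, cv2.ROTATE_180)
```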

Next week I plan to do end-to-end testing and work on the final video and report.

Team Status Report for 12/04

This week we worked on finalizing our integration and doing end-to-end testing. Yoorae worked on adding castling support and a correctness check for the coordinate conversion. Demi and Anoushka worked on making the full game and retries work. Demi also added a push button for the user to press after making the AI's move, and Anoushka added logic to update the internal state after this.

We are on track and do not see any significant risks. We will be working on generating final metrics and writing the final report this week.

Anoushka’s Status Report for 12/04

This week was not very productive for me because I was very sick from Sunday through Thursday. I managed to get some more testing done for the CV, including measuring timing and accuracy. I also added code to update the game state once the AI move is returned from Stockfish. There were some issues with the coordinates in Stockfish versus our board, and with the orientation of our pictures with the new webcam stand, but those are resolved now. I also added skeletal logic for retries, which I will insert into the main codebase once Demi adds the push button for retries. Overall, however, I am still on track and working on testing. I plan to run more end-to-end tests this week before the demo. I will also spend time on the final report and video.
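
The coordinate mismatch comes down to converting Stockfish's UCI move strings into indices on our own board array. A sketch of that conversion is below; the orientation (row 0 = rank 8, white at the bottom) is an assumption, and it was exactly this kind of detail we had to double-check.

```python
# Hypothetical sketch: turn a UCI move string from Stockfish (e.g. "e2e4")
# into (row, col) indices on an 8x8 array and apply it to the game state.

def uci_to_indices(uci_move):
    def square_to_rc(sq):
        col = ord(sq[0]) - ord('a')   # file a-h -> column 0-7
        row = 8 - int(sq[1])          # rank 1-8 -> row 7-0
        return row, col
    return square_to_rc(uci_move[:2]), square_to_rc(uci_move[2:4])

def apply_ai_move(board, uci_move):
    (fr, fc), (tr, tc) = uci_to_indices(uci_move)
    board[tr][tc] = board[fr][fc]     # promotion and castling handled elsewhere
    board[fr][fc] = None
```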

Yoorae’s Status Report for 11/20

This week I spent time in class fixing errors in the integration, mostly related to coordinates. I also added logic using the python-chess library that checks whether checkmate has been reached, to signal to the user that the game has ended. I also fixed an error in the castling validity check in the chess game logic.
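
The checkmate check itself is a one-liner on a python-chess board. A small self-contained example of the idea is below; how the python-chess board is kept in sync with our own game state is omitted here.

```python
import chess

# Checkmate detection with python-chess, along the lines described above.
board = chess.Board()
board.push_san("f3")
board.push_san("e5")
board.push_san("g4")
board.push_san("Qh4")   # fool's mate

if board.is_checkmate():
    # The side to move is the side that has been mated.
    winner = "Black" if board.turn == chess.WHITE else "White"
    print(f"Checkmate: {winner} wins")
```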

I also spent most of my remaining time writing exhaustive tests for the chess logic to add to the metrics for our final presentation. I have written 90 test cases with varying board states: for each of the six chess piece types, I wrote five valid-move tests, five invalid-move tests, and five capturing-move tests. Along the way, I record each error I find and edit the code to account for that edge case. So far, the bishop's left downward diagonal move and the en passant capturing move had errors, and I have updated the code. I am planning on adding more test cases to detect errors I haven't found yet.

I am planning to add more test cases, including invalid capturing moves, to detect more edge cases I haven't considered, and to help Anoushka with testing the CV tomorrow. I will also spend next week preparing for the final presentation. A sketch of the test style is shown below.
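
The sketch below illustrates the shape of these tests; the module name chess_logic and the helpers is_valid_move and empty_board_with are placeholders for our actual validator API, not its real names.

```python
import unittest

# Placeholder imports: the module and function names are hypothetical stand-ins
# for our chess game logic validator.
from chess_logic import is_valid_move, empty_board_with

class BishopMoveTests(unittest.TestCase):
    def test_valid_diagonal(self):
        board = empty_board_with({(4, 4): 'wB'})
        # Down-left diagonal move, the case that originally had a bug.
        self.assertTrue(is_valid_move(board, (4, 4), (7, 1)))

    def test_invalid_straight(self):
        board = empty_board_with({(4, 4): 'wB'})
        # Bishops cannot move along a rank.
        self.assertFalse(is_valid_move(board, (4, 4), (4, 7)))

if __name__ == "__main__":
    unittest.main()
```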

Team Status Report for 11/20

This week the team continued to work on integration and started testing to get metrics for the final presentation. Yoorae updated the valid-move logic module to handle castling and fixed coordinate-related errors in the integration. Demi integrated the LEDs, push button, webcam, and Stockfish AI into the game. Demi and Anoushka spent a significant amount of time fixing an issue with installing the neural network for cropping images on the RPi. Yoorae tested the valid-move logic, and Anoushka tested CV correctness.

There are no serious risks, and we are on track with the schedule. Next week, we will work on the final presentation and do more testing.

Demi’s Status Report for 11/20

This week I finished integrating the LEDs, push button, webcam, and Stockfish AI with the game controller. We now have the game working end to end: from the push button, to the webcam taking pictures of the board, to CV detection, lighting up the user's move, validating the user's move, and finally generating and lighting up the AI's next move.

I spent most of my time this week on integration issues with running the neural net submodule on the RPi. After spending many hours in the lab trying different solutions with Anoushka, we were able to make it work by upgrading TensorFlow. I also spent time setting up the webcam with the new webcam stand and taking images for CV testing.

Next week, I will be working on the final presentation and generating metrics for LEDs.

Anoushka’s Status Report for 11/20

This week I tested CV move detection. I tested with around 27 images of moves from a chess game, and all of the moves were detected correctly. These were images taken before we got the new webcam stand, so I will need to adjust the thresholds and verify again this week.

I spent a significant amount of time solving package compatibility issues this week. First, there was an issue with matplotlib after my OS update that caused a segfault whenever I tried to display anything, and it took me some time to fix. I also spent a lot of time in the lab with Demi trying to get the neural network to work. The installation was significantly easier on my Mac, and I ran into a lot of issues with the RPi. We tried to install the dependencies but could not, because the Python version was not compatible with TensorFlow. We ran into issues changing the Python version because it would not update even when we used a conda environment and set both the global and local versions correctly. We then tried pyenv, but we still could not get all the versions of TensorFlow, NumPy, SciPy, etc. to match up. After a lot of work, we finally found a set of compatible versions, and we then tested the project end to end.

Next week, I plan to generate more metrics for the CV (including timing, which we have not measured yet). I also plan on doing some end-to-end testing. We will also devote time to the final presentation.

Demi’s Status Report for 11/13

My main goal for this week was to have the board completely done. I successfully installed the LED matrix and connected the push button to the board. The WS2812B LED strips that I initially used were too weak, so I switched to a different strip, the WS2811. I chose the WS2811 because it is stronger and does not require any soldering. I also had to completely redesign the board, laser cut the pieces, and assemble them. I wrote a script that takes coordinates as input and lights up the corresponding LEDs. This was shown during Wednesday's demo.
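
A sketch of the coordinate-to-LED script is below, assuming a serpentine-wired 8x8 strip driven from GPIO pin 18 via the rpi_ws281x library; the pin number and wiring order are assumptions about the build, not a spec of our script.

```python
from rpi_ws281x import PixelStrip, Color

# Hypothetical sketch: light the LED under a given board square.
NUM_LEDS, LED_PIN = 64, 18        # 8x8 matrix, GPIO pin is an assumption
strip = PixelStrip(NUM_LEDS, LED_PIN)
strip.begin()

def square_to_led(row, col):
    # Even rows run left-to-right, odd rows right-to-left (serpentine wiring).
    return row * 8 + (col if row % 2 == 0 else 7 - col)

def light_square(row, col, rgb=(255, 0, 0)):
    strip.setPixelColor(square_to_led(row, col), Color(*rgb))
    strip.show()
```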

I installed the Stockfish AI on our RPi. I also wrote an example script to enter moves and get the AI's next best move. Lastly, I set up the webcam with the RPi: I installed the necessary libraries and wrote a shell script that captures an image at a specified resolution.
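
The example script boils down to feeding the moves played so far into the engine and asking for its next move. A minimal sketch using python-chess's engine wrapper is below; the binary path is an assumption, and our actual script may use a different binding.

```python
import chess
import chess.engine

# Sketch: ask Stockfish for its next move after the human's move.
board = chess.Board()
board.push_uci("e2e4")   # human move already applied

engine = chess.engine.SimpleEngine.popen_uci("/usr/games/stockfish")  # assumed path
result = engine.play(board, chess.engine.Limit(time=0.5))
print("AI move:", result.move.uci())
engine.quit()
```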

I am on schedule. I have tested all the modules I have been working on individually: LEDs, push button, webcam, Stockfish AI. Next week, I will focus on integrating them with the CV and valid move logic module.

Yoorae’s status report for 11/13

This week I focused on making the CV work in our project environment. Our difference detection depends on absolute difference values of the RGB color channels, and with the current black and white pieces, the difference value on a square where a change had been made was not significantly larger than on squares where no change had occurred. To make the absdiff values more distinct for the squares where a change has been made, I placed an order for a red chess piece set (which stands out from the green and white squares) to decrease the error rate of the difference detection. I finished the change detection algorithm with the red pieces and updated it before our interim demo. The red pieces showed a significantly lower error rate when detecting changes.
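
The core of the change detection is scoring each of the 64 squares by its mean absolute difference between the before and after frames. The sketch below shows the idea; the threshold value is a placeholder, since ours is tuned for the red pieces.

```python
import cv2

# Hypothetical sketch: find squares that changed between two rectified
# board images by thresholding the per-square mean absolute difference.
def changed_squares(before, after, threshold=25.0):
    diff = cv2.absdiff(before, after)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    sq_h, sq_w = h // 8, w // 8
    changed = []
    for row in range(8):
        for col in range(8):
            cell = gray[row * sq_h:(row + 1) * sq_h,
                        col * sq_w:(col + 1) * sq_w]
            if cell.mean() > threshold:
                changed.append((row, col))
    return changed
```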

I also visited the lab on Thursday and Friday and took more sample photos from the webcam. Until now, we did not have a webcam stand and relied on photos taken with our phones to test our CV. Since the webcam for our project produces photos with significantly lower resolution, we needed to update our CV. On Saturday, I met up with Anoushka to edit the edge detection because our CV was not detecting every vertical and horizontal line in the photos. We updated the Hough line thresholds, the change detection threshold, and the rectifying algorithm to make the CV work with the lower-quality photos produced by the webcam.
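
The grid-line detection step is Canny edge detection followed by a probabilistic Hough transform, and the threshold changes amount to tuning the parameters below. The specific numbers are illustrative rather than our final values.

```python
import cv2
import numpy as np

# Hypothetical sketch: detect the board's grid lines in a webcam frame.
def detect_grid_lines(image):
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=60,          # lowered for low-res frames
                            minLineLength=100, maxLineGap=10)
    return [] if lines is None else [l[0] for l in lines]
```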

I have also started on integration, unifying the coordinate outputs of the CV and the valid move logic so that they work well with the Stockfish AI. I will focus on translating our project's coordinate system for integration on Sunday, and will begin working on the piece identification CV algorithm when our newly ordered black chess pieces arrive. When the user makes a capturing move, the color of the chess piece has to be identified before updating the board state. I am planning on comparing RGB or HSV values to identify black and red pieces.
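
A rough sketch of the planned color check is below: classify a cropped square as holding a red or black piece from its mean HSV values. The hue, saturation, and value ranges are rough guesses to be tuned against real webcam photos, not measured values.

```python
import cv2

# Hypothetical sketch: identify the color of the piece on a cropped square.
def classify_piece_color(square_bgr):
    hsv = cv2.cvtColor(square_bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.mean(hsv)[:3]
    if s > 80 and (h < 15 or h > 165):   # saturated with hue near red
        return "red"
    if v < 70:                           # dark square contents -> black piece
        return "black"
    return "empty"
```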