This week, our team mainly focused on integrating the overall system. We met to combine the laptop, UI, and FPGA, and we tested the integration on both macOS and Windows to make sure our game supports both operating systems. The integration was successful: the laptop, UI, and FPGA can all communicate with one another, and it took less time than we expected. For the following week, we will continue the integration work and improve the individual components.
Team Status Report for 04/24/2021
This week, our team got together to integrate everything. We were able to integrate the UI with the computer vision, so we only need to integrate the FPGA with our system so it can recommend moves. Michael is mostly done with the Stockfish code that connects it to the system. Joseph and Jee Woong worked together to fix the bugs the system had and to show everything successfully on the UI. Next week, we are planning to finish integrating everything and finalize our product. All the team members are on track with the Gantt chart.
Joseph’s Status Report for 04/24/2021
This week, we were able to fix most of our problems. The system is now able to detect all chess moves and is ready to be demonstrated. One awkward problem we have is that the chessboard is displayed flipped, although the pieces still move correctly. Another problem that we mentioned in Slack is that the Raspberry Pi cannot handle the workload of Pygame, as shown in our pictures, so we think we cannot use it for our final demo.
We are on schedule with the Gantt chart. We were supposed to finish the integration of the UI with the CV this week, and we did.
Next week, we are going to finish integrating with the FPGA and hopefully only worry about presenting our product.
Below are some successful background subtraction and blob detection that our system can do now.
Jee Woong’s Status Report for 04/24/2021
Joseph and I mainly focused on the integration of the Raspberry Pi and the UI. After we integrated, we found that the Raspberry Pi is not powerful enough to run Pygame and the computer vision algorithms on live video. So, we thought the performance might improve if we used photos instead of video.
Thus, I created a button that the user presses after making a move, which takes a photo of the board and compares it with the previous board state to find the difference between the two frames. Although performance was slightly better on the Raspberry Pi, it still couldn't handle Pygame.
So, we decided to use our laptop instead of the Raspberry Pi so that we could have a smoother game. Now, we have a button that the user presses after making a move, and Joseph and I tested with our board that the entire game works well with the buttons.
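The photo-compare step can be sketched in plain Python. This is a minimal illustration, not our actual code: it assumes each capture has already been reduced to an 8x8 grid of per-square brightness values, and the function and threshold names are hypothetical.

```python
# Hypothetical sketch: find which squares changed between two captures,
# assuming each capture is an 8x8 grid of per-square brightness values.

def diff_squares(prev, curr, threshold=30):
    """Return (row, col) squares whose brightness changed more than threshold."""
    changed = []
    for r in range(8):
        for c in range(8):
            if abs(prev[r][c] - curr[r][c]) > threshold:
                changed.append((r, c))
    return changed

# Example: a piece leaves e2 (row 6, col 4) and lands on e4 (row 4, col 4).
prev = [[100] * 8 for _ in range(8)]
curr = [row[:] for row in prev]
curr[6][4] -= 60   # source square now shows the empty board
curr[4][4] += 60   # destination square now shows the piece

print(diff_squares(prev, curr))  # [(4, 4), (6, 4)]
```

Comparing two still photos this way avoids the per-frame cost of processing live video, which is why the button-press design helped on the Raspberry Pi.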
Michael's Status Report for 4/24/2021
This week we worked together to integrate all of our components. On my end, I integrated the FPGA with the UI so that the UI can send moves over the UART communication channel and the FPGA HPS processes the input correctly.
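The UI-to-HPS link can be illustrated with a simple line-based framing scheme. This is a hedged sketch, not our actual protocol: the wire format (a UCI move string terminated by a newline) and the function names are assumptions.

```python
# Hypothetical framing for moves sent over the UART link: each move is a
# UCI string terminated by a newline, so the HPS side can use a simple
# line-based read loop.

def encode_move(move: str) -> bytes:
    """Frame a UCI move (e.g. 'e2e4') for transmission over UART."""
    if len(move) not in (4, 5):          # 5 chars allows promotions like 'e7e8q'
        raise ValueError(f"not a UCI move: {move!r}")
    return move.encode("ascii") + b"\n"

def decode_move(frame: bytes) -> str:
    """Recover the move string on the receiving (HPS) side."""
    return frame.rstrip(b"\n").decode("ascii")

frame = encode_move("e2e4")
print(frame)               # b'e2e4\n'
print(decode_move(frame))  # e2e4
```

A delimiter-terminated ASCII format like this keeps the two sides loosely coupled: either end can be tested with canned byte strings before the serial hardware is involved.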
A roadblock we had to overcome was the outdated version of Python that the FPGA HPS came with, so I had to build and install Python 3.7 from source. I had a few other armv7 compilation issues with Stockfish and other programs that needed to run on the FPGA HPS, but I have solved those now.
We are mostly done with our integration and are now focusing on some of the features we would like to implement, such as having Stockfish suggest multiple moves. I am on schedule with my part of the integration, and everyone else seems to be doing well too.
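Getting multiple suggestions out of Stockfish comes down to enabling MultiPV (`setoption name MultiPV value 3` over UCI) and reading one `info ... multipv N ... pv ...` line per ranked variation. Below is a sketch of the parsing side; the sample lines are illustrative, not captured from our runs, and the function name is an assumption.

```python
# Sketch: pull ranked move suggestions out of Stockfish's UCI "info" lines
# when MultiPV is enabled. Sample output below is illustrative.

def parse_multipv(info_lines):
    """Map MultiPV rank -> first move of that principal variation."""
    suggestions = {}
    for line in info_lines:
        tokens = line.split()
        if "multipv" in tokens and "pv" in tokens:
            rank = int(tokens[tokens.index("multipv") + 1])
            best = tokens[tokens.index("pv") + 1]
            suggestions[rank] = best
    return suggestions

sample = [
    "info depth 20 multipv 1 score cp 35 pv e2e4 e7e5",
    "info depth 20 multipv 2 score cp 30 pv d2d4 d7d5",
    "info depth 20 multipv 3 score cp 25 pv g1f3 g8f6",
]
print(parse_multipv(sample))  # {1: 'e2e4', 2: 'd2d4', 3: 'g1f3'}
```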
Updated Gantt Chart
Team Status Report for 04/10/2021
Joseph has finished his background subtraction algorithm to detect the movement of pieces, and Jee Woong finished writing code for the UI and computer vision integration. One issue on Joseph's side was that the pieces weren't detected reliably, so we decided to color the pieces for now and order a colored set of chess pieces. As we are finished with the simple UI, board detection, and piece movement detection, we are ready to integrate. Jee Woong has prepared the integration, so Joseph and Jee Woong will mainly be testing the integration code next week. Michael has been working on edge cases, and he will also dive into the integration part of the project once the UI and CV integration finishes.
Jee Woong’s Status Report for 04/10/2021
This week, I mainly worked on the integration of the Computer Vision part of the project and the User Interface part of the project. As I have completed the representation of the initial board status, I tried to integrate the background subtraction and the board UI. So, I wrote the integration code to combine the board detection with UI and piece movement detection with UI.
Next week, Joseph and I will test whether the integration works on the Raspberry Pi that just arrived. As our team members have started working on integration, I believe we can finish the project in time.
Michael’s Status Report 4/10/2021
This week I continued to work on the edge cases for legal move generation. Specifically, the en passant case is tricky because it requires knowledge of the exact previous move, and it is a capture in which the captured pawn is not on the square the capturing pawn moves to. Hence, there are quite a few more components to that logic than originally anticipated.
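The en passant condition above can be sketched compactly. Our real implementation is in SystemVerilog; this Python version is only an illustration, and the move representation (a `(piece, from, to)` tuple with `(file, rank)` squares, rank 0 being White's back rank) is an assumption.

```python
# Sketch of the en passant condition: it is only enabled by a double pawn
# push on the immediately preceding move, and the capturing pawn moves to
# the skipped-over square, NOT to the square the captured pawn sits on.

def en_passant_target(last_move):
    """Return the square a capturing pawn would move TO, or None."""
    piece, (ff, fr), (tf, tr) = last_move
    if piece != "pawn" or ff != tf or abs(tr - fr) != 2:
        return None                      # only a double pawn push enables it
    return (tf, (fr + tr) // 2)          # the square the pusher skipped over

# White plays e2-e4: a black pawn on d4 or f4 may capture on e3.
print(en_passant_target(("pawn", (4, 1), (4, 3))))    # (4, 2)
print(en_passant_target(("knight", (6, 0), (5, 2))))  # None
```

This also shows why the move generator needs the previous move as an input: the board position alone cannot distinguish a fresh double push from an older one.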
Otherwise, progress on the SV and FPGA side seems to be going well. I will demo quick testbenches of a few board states on Monday using the waveform viewer. After that, the plan for this week is to finish legal move generation, including the edge cases, and begin integration with the FPGA and HPS.
Joseph’s Status Report for 04/10/2021
This week, I was able to fix most of the problems. The board detection now works fine. Piece detection also works, with some constraints. Now that the board is distinctively black and white, the pieces blend in too well with the board, so blob detection cannot pick the pieces out reliably. To mitigate this, we have bought red and blue pieces so that we can distinguish the pieces from the board; I verified that this works by coloring checker pieces red. Here is an example of blob detection when a black piece moves from a black square.
In the top case, where the knight moves from a black square, the detector cannot find the knight properly.
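The intuition for why the colored pieces help can be shown with plain RGB values. This is a hedged sketch with illustrative numbers, not thresholds tuned on our system: a black piece on a black square barely changes the pixel intensity, while a red piece is separable by its dominant channel regardless of the square's shade.

```python
# Illustration: intensity alone cannot separate a black piece from a black
# square, but channel dominance separates a red piece easily.

def looks_like_red_piece(rgb, margin=60):
    """True if the red channel strongly dominates both other channels."""
    r, g, b = rgb
    return r - max(g, b) > margin

black_square = (30, 30, 30)
black_piece_on_black = (45, 45, 45)     # nearly the same as the square
red_piece_on_black = (180, 40, 35)      # clearly separable by color

print(looks_like_red_piece(black_piece_on_black))  # False
print(looks_like_red_piece(red_piece_on_black))    # True
```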
As for the schedule, I am on schedule with the Gantt chart. Now it is time for integration after the interim demo.
Next week, I am going to get started on the Raspberry Pi to see how Jee Woong's UI integration plays out, and hopefully make the whole process as smooth as possible.