This week I worked on integrating the web server with the motor control and OpenCV. I ran into some trouble with WebSocket communication, and the virtual chess board is not rendering when the web server is run from the Pi. I will work on fixing this issue tomorrow so it is ready for us to shoot our demo video on Monday. I will also be working with my teammates on the final design report this week.
This week we made great progress on fine-tuning the different subsystems. We also worked on integration; as of now, the Pi and the Arduino are able to correctly communicate the moves that need to be completed.
The web server is being finished up this weekend and then integrated with the Pi. All that’s left is finalizing the website and then testing the fully integrated system. After that we plan to work on the video, the poster, and the final paper.
This week I’ve mostly finished up the OpenCV subsystem. It is now fully capable of transmitting moves to the Arduino, waiting for a signal that the move has been completed, and then continuing on.
I’ve helped refine the gantry controls; it is now capable of fully completing the x-y movement for a move. However, there are still a couple of parameters that need to be manually determined for the claw motion, as the smaller pieces require a lower drop.
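One way the per-piece claw parameters could be organized is a simple lookup table keyed by piece type. The heights below are placeholders, not our measured values; the real numbers are the ones still being determined manually.

```python
# Hypothetical lookup for the per-piece claw drop heights mentioned above.
# Every value here is a placeholder to be replaced by measured numbers;
# the only constraint from testing so far is that smaller pieces (like
# pawns) need the claw to drop lower than taller pieces (like the king).

DROP_MM = {
    "king": 20, "queen": 22, "rook": 30,
    "bishop": 26, "knight": 28, "pawn": 34,
}

def claw_drop(piece: str) -> int:
    """Millimetres the claw must descend to grip the given piece."""
    return DROP_MM[piece.lower()]
```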
I’ve partially helped with setting up the web server. I don’t have much web development knowledge, so my contribution there has been smaller: I’ve helped debug a few issues that arose and have been helping to test the web server on the Raspberry Pi.
Over the next couple of days I plan to help finish up those components and integrate everything together. Then we will complete the video, poster, and paper.
This week I continued to improve the parameters for the gantry movement. We are now able to return to the home position after each move. I also worked with Luis to establish the communication protocol between the OpenCV program and the Arduino. The OpenCV program now sends the moves it detects, and the Arduino executes them. Upon execution, the Arduino sends an ACK signal to the Pi indicating the move has been completed.
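The move/ACK exchange above could be framed on the Pi side roughly as follows. The `MOVE` prefix and the literal `ACK` reply are assumptions for illustration; on the real system the bytes would travel over a serial (e.g. pyserial) connection to the Arduino.

```python
# Sketch of the Pi-side message framing for the move/ACK handshake.
# Command format and ACK byte string are assumptions, not the exact
# protocol; the real loop would write to and read from a serial port.

def encode_move(move: str) -> bytes:
    """Frame a move like 'e2e4' as a newline-terminated command."""
    if len(move) != 4:
        raise ValueError(f"expected a 4-character move, got {move!r}")
    return f"MOVE {move}\n".encode("ascii")

def is_ack(line: bytes) -> bool:
    """Return True if the Arduino's reply signals move completion."""
    return line.strip() == b"ACK"
```

In the real loop the Pi would write `encode_move(...)` to the serial port, then block on `readline()` until `is_ack(...)` returns True before processing the next detected move.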
During the last two days of the week I will be working on precision and on finalizing the communication between the web server and the Arduino.
This week I have been working on finalizing the web application that will send and receive moves to and from the Raspberry Pi on the gantry. I initially ran into some trouble getting the board state to sync across multiple open webpages, but this ultimately turned out to be an issue with the chessboard instances: on loading the webpage, a new chessboard was initialized rather than read from a database, so other pages could not access its data. Now we store the chessboard data in the database and update the board via the stored instance, allowing multiple webpages to maintain identical board states.
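The shape of that fix can be sketched as follows: every page load reads the one shared board record from the database, and every move writes it back, so no request ever holds a private board instance. Here `sqlite3` and a FEN string stand in for whatever database and board serialization the web app actually uses.

```python
# Minimal sketch of persisting board state in a database instead of
# creating a fresh chessboard per page load. sqlite3 and FEN are
# stand-ins for the app's actual database and board representation.

import sqlite3

START_FEN = "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1"

def init_db(conn: sqlite3.Connection) -> None:
    conn.execute("CREATE TABLE IF NOT EXISTS board (id INTEGER PRIMARY KEY, fen TEXT)")
    conn.execute("INSERT INTO board (id, fen) VALUES (1, ?)", (START_FEN,))
    conn.commit()

def load_fen(conn: sqlite3.Connection) -> str:
    """What every page load does: read the one shared board record."""
    return conn.execute("SELECT fen FROM board WHERE id = 1").fetchone()[0]

def save_fen(conn: sqlite3.Connection, fen: str) -> None:
    """Called after a move is applied, so all open pages see the update."""
    conn.execute("UPDATE board SET fen = ? WHERE id = 1", (fen,))
    conn.commit()
```

Because both "pages" call `load_fen` against the same record, a `save_fen` from one is immediately visible to the other, which is exactly the sync behavior the original per-instance boards lacked.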
This week I plan to finish the web application and test it with the Raspberry Pi on the gantry.
This week we have been working to finalize the subsystems to improve overall accuracy and functionality. We are also working together to ensure that the subsystems transmit data properly between one another so that everything is ready for the demo.
This week I have been working on modifying the input logic for the whole gantry movement. The input has changed from a number of squares and a direction to move to taking in square positions, in order to make integrating the subsystems easier.
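A sketch of that input change: given source and destination square names, the gantry code can derive the signed number of squares to travel on each axis itself, instead of being handed a count and a direction. The square-to-index mapping is standard chess notation; the function names are illustrative.

```python
# Sketch of deriving gantry motion from square positions instead of
# "N squares in direction D". Function names are illustrative.

def square_to_xy(square: str) -> tuple[int, int]:
    """Map a square like 'e2' to zero-based (file, rank) indices."""
    file = ord(square[0].lower()) - ord("a")
    rank = int(square[1]) - 1
    if not (0 <= file <= 7 and 0 <= rank <= 7):
        raise ValueError(f"bad square: {square!r}")
    return file, rank

def relative_move(src: str, dst: str) -> tuple[int, int]:
    """Signed squares to travel along x and y, as the old interface expected."""
    sx, sy = square_to_xy(src)
    dx, dy = square_to_xy(dst)
    return dx - sx, dy - sy
```

This keeps the old low-level motion code usable while letting the OpenCV side hand over exactly what it detects: a pair of board squares.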
I have also been improving precision by fine-tuning parameters for our board and pieces. This week I will be working to finalize the communication between the Raspberry Pi and the Arduino.
This week began with more work refining the OpenCV algorithm, and the progress has been very promising. We made some small additions, such as skipping calculations while there is movement within the image and calculating the board coordinates involved in a move. We also did some preliminary testing of the communication between the Raspberry Pi and the Arduino to ensure that portion will work smoothly; the connection seems to work.
Currently there are a few cases where the algorithm identifies more than two coordinates as being involved in a move. My current plan to fix this is mostly hyperparameter tuning, though if that does not suffice I will add an extra layer of processing to get crisper results. By the end of the week I hope to have moved our current processing and program into a more refined codebase that also communicates with the web server and Arduino.
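One possible shape for that extra processing layer: when the diff flags more than two squares, keep only the two with the strongest change score and require both to clear a minimum threshold. The threshold here is a hyperparameter to tune, not a measured value, and the function is a sketch rather than our actual pipeline code.

```python
# Sketch of isolating exactly two move squares from per-square change
# scores. min_score is a tunable hyperparameter, not a measured value.

def pick_move_squares(scores: dict[str, float], min_score: float = 0.2) -> list[str]:
    """Return the two most-changed squares, or raise if they can't be isolated."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    top = [sq for sq, s in ranked[:2] if s >= min_score]
    if len(top) != 2:
        raise ValueError("could not isolate exactly two changed squares")
    return top
```

Rejecting ambiguous frames (by raising) rather than guessing lets the caller simply wait for the next stable frame, which is cheaper than mis-reporting a move to the Arduino.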
This week I worked on the Arduino control code. Through the Arduino’s serial monitor we can input different commands to control our x-y movement, the elevator, and the claw. All of these can be controlled in sequence by inputting their respective commands and a number of squares to move in the desired direction.
This coming week I will be working on tuning parameters to improve precision and on communication between the subsystems.
This week I continued refining the OpenCV pipeline. I was able to work out the math to get all of the points on the board. I’ve also worked on refining the method for movement detection: the image used in the comparison is now a grayscale version modified to remove shadows. Overall this has proven promising, as the clusters we see after movements are more closely grouped into the correct squares.
My next plans involve continuing to modify this algorithm so that the clusters become grouped tightly enough to associate precise points with each movement. I also plan to modify it so that movement is calculated not only after a set amount of time, but also after the image stabilizes. This will help counteract some issues that arose while testing, where a hand obscures the camera’s view of the board.
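The "wait until the image stabilizes" idea above could take the form of a small gate that only reports a stable scene once the frame-to-frame difference has stayed below a threshold for several consecutive frames, e.g. after a hand has left the view. This is a pure-Python stand-in; the real pipeline would feed it mean pixel differences from OpenCV frame diffs, and both parameters are assumptions to be tuned.

```python
# Sketch of a stabilization gate: declare the scene stable only after
# several consecutive low-difference frames. Threshold and frame count
# are placeholder hyperparameters, not tuned values.

class StabilityGate:
    def __init__(self, threshold: float = 5.0, needed: int = 3):
        self.threshold = threshold   # max mean pixel diff to count as "still"
        self.needed = needed         # consecutive still frames required
        self.still_frames = 0

    def update(self, mean_diff: float) -> bool:
        """Feed the latest frame difference; True once the scene is stable."""
        self.still_frames = self.still_frames + 1 if mean_diff < self.threshold else 0
        return self.still_frames >= self.needed
```

Running move detection only when `update(...)` returns True means a hand passing over the board resets the counter and suppresses spurious diffs until the view is clear again.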