JP Weekly Status Report 5/1/2021

This week, Logan and I worked in TechSpark most of the time, integrating the launcher with the cup detection modules and placing it in the housing. Logan tuned the motors and fabricated the motor base and servos, and I helped him by laser cutting the loader. We went through 5-6 different designs before we found one that was accurate. For a while, we thought a ramp would work, but we quickly found that the angle of the shot was very unpredictable with a ramp loader. We finally settled on a much smaller loader that slowly feeds the ball into the spinning DC motors with a servo. This design was the most accurate and consistent, and it also allowed Logan to easily implement an automatic reload system.

Old Ramp Loader:


Final Loader:
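As a rough illustration of how the final loader's feed-and-reload sequence could look in software, here is a short sketch. This is not our actual firmware: the set_servo_angle helper, the angle values, and the timing constants are hypothetical placeholders for whatever motor-control interface the real loader uses.

import time

def set_servo_angle(angle_degrees: float) -> None:
    # Hypothetical helper; stands in for the real servo/PWM interface.
    print(f"servo -> {angle_degrees} degrees")

LOAD_ANGLE = 80     # servo position that pushes a ball into the spinning motors
REST_ANGLE = 0      # servo position that holds the next ball back
FEED_TIME_S = 0.5   # how long the servo dwells in the load position

def feed_one_ball() -> None:
    # Slowly feed a single ball into the spinning DC motors, then retract.
    set_servo_angle(LOAD_ANGLE)
    time.sleep(FEED_TIME_S)      # ball contacts the motors and launches
    set_servo_angle(REST_ANGLE)  # retract so the next ball drops into place

def auto_reload(num_balls: int, delay_between_shots_s: float = 2.0) -> None:
    # Automatic reload: fire every ball in the hopper, one at a time.
    for _ in range(num_balls):
        feed_one_ball()
        time.sleep(delay_between_shots_s)  # let the motors recover speed

if __name__ == "__main__":
    auto_reload(num_balls=3)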

I was also able to calibrate the cup detection module so we can use it in different environments. I created a simple command-line tool to run the cup detection, targeting, and launching steps end to end. We feel like we are in a very good place, since we have been able to make cups and our data from testing is very consistent! Check out Logan's status report for some videos of our progress.
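The command-line tool itself is small; below is a rough sketch of what it could look like. The stage functions (detect_cups, aim_at, launch_ball) are stand-in stubs rather than our real modules, and the flag names are hypothetical, but the flow matches the end-to-end run described above: detect cups, pick a target, aim, and launch.

import argparse

# The three stages below are placeholder stubs standing in for the real
# detection, targeting, and launcher code (names are hypothetical).

def detect_cups(calibration_path: str):
    # Stub: would load calibration data and return [(cup_index, x, y), ...].
    return [(0, 2404, 1058), (1, 1654, 1038)]  # dummy example coordinates

def aim_at(x: int, y: int) -> None:
    # Stub: would convert pixel coordinates to an aim angle and move the launcher.
    print(f"Aiming at pixel ({x}, {y})")

def launch_ball() -> None:
    # Stub: would spin up the DC motors and trigger the servo loader.
    print("Launching ball")

def main() -> None:
    parser = argparse.ArgumentParser(
        description="Run cup detection, targeting, and launching end to end."
    )
    parser.add_argument("--calibration", default="calibration.json",
                        help="path to the saved calibration data")
    parser.add_argument("--cup", type=int, default=None,
                        help="cup index to target (default: first detected cup)")
    args = parser.parse_args()

    cups = detect_cups(args.calibration)
    if not cups:
        raise SystemExit("No cups detected -- check the camera and calibration.")

    # Pick the requested cup index, or fall back to the first detection.
    target = next((c for c in cups if c[0] == args.cup), cups[0])
    index, x, y = target
    print(f"Selected cup {index} at ({x}, {y})")
    aim_at(x, y)
    launch_ball()

if __name__ == "__main__":
    main()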

JP Weekly Status Report 4/25/2021

I spent this week finalizing the calibration process for detecting cup locations. I reorganized the repo to include Juan's UI and its dependencies, along with all of our external libraries. The calibration tool can now take a picture of a mat with the circle locations of a set of cups drawn on it, then calibrate the ellipse detector by mapping each circle location to the index of the cup it represents. This mapping of indexes to 2D coordinates is then saved in a JSON file so that the Automatic Gentleman can load the data when it boots up. When the bot first takes a picture of the cup rack, it will compare the 2D coordinates of each detected ellipse with the calibrated data and map each detected ellipse to its correct index. Then the game will be able to communicate which cups to display to our UI.


Example calibration map (hand drawn for testing purposes):

Example Calibration Data:

{"locations":{"0":{"x":2404,"y":1058},"1":{"x":1654,"y":1038}}}
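To show how the bot could use this file at boot, here is a small sketch that loads the calibration data and assigns each detected ellipse center to the nearest calibrated cup index. The file name and the nearest-neighbor matching are assumptions for illustration; the actual matching logic in the repo may differ.

import json
import math

def load_calibration(path: str) -> dict:
    # Load the calibrated cup locations as {index: (x, y)} pixel coordinates.
    with open(path) as f:
        data = json.load(f)
    return {int(idx): (loc["x"], loc["y"]) for idx, loc in data["locations"].items()}

def match_ellipses_to_cups(detected_centers, calibration):
    # Map each detected ellipse center to the index of the nearest calibrated cup.
    matches = {}
    for cx, cy in detected_centers:
        index, _ = min(
            calibration.items(),
            key=lambda item: math.hypot(item[1][0] - cx, item[1][1] - cy),
        )
        matches[index] = (cx, cy)
    return matches

if __name__ == "__main__":
    calibration = load_calibration("calibration.json")
    # Example detections: pixel centers of ellipses found in the rack photo.
    detections = [(2398, 1061), (1660, 1035)]
    print(match_ellipses_to_cups(detections, calibration))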


JP Weekly Status Report 4/12/2021

This week I made a lot of progress on the calibration settings for our robot. I decided to separate the calibration flow from the actual gameplay flow in order to make testing and setup more flexible. The calibration flow can detect a rectangular playing mat and identify the main cup locations by detecting black points within the playing mat. It will then save the calibration data (pixel coordinates of each detected point) to a file, which the gameplay program loads at runtime. Here is an example of running the calibration flow on a playing mat image:

It will then save the 2D pixel coordinates for each point, indexed from 0 to 9 for a 10-cup formation. The gameplay flow will then compare the detected cup pixel coordinates to these calibrated coordinates so that the UI can display the cup formation state to the user.
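A condensed sketch of what that calibration step could look like with OpenCV is below. The threshold value, file names, and the assumption that the calibration points are simply the darkest blobs in the image are all illustrative; the real tool also locates the rectangular mat first.

import json
import cv2

def calibrate_from_image(image_path: str, output_path: str = "calibration.json",
                         expected_points: int = 10) -> dict:
    # Find dark points on the playing mat and save their pixel coordinates.
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)

    # Threshold so the black calibration points become white blobs on black.
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # Keep the largest blobs and take their centroids as cup locations.
    contours = sorted(contours, key=cv2.contourArea, reverse=True)[:expected_points]
    centers = []
    for contour in contours:
        m = cv2.moments(contour)
        if m["m00"] > 0:
            centers.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))

    # Index points 0..9 in a stable order (top to bottom, then left to right).
    centers.sort(key=lambda c: (c[1], c[0]))
    data = {"locations": {str(i): {"x": x, "y": y} for i, (x, y) in enumerate(centers)}}

    with open(output_path, "w") as f:
        json.dump(data, f)
    return data

if __name__ == "__main__":
    print(calibrate_from_image("playing_mat.jpg"))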

So far I have been using a test image I created for the initial testing of the calibration tool. This week I am getting a full-sized printout of the playing mat to test with.

By the end of this week, the calibration tool should be working end to end with a 10-cup formation. We should be able to calibrate, detect cups, select a cup, physically aim at a cup, and shoot a pong ball (though not necessarily hit the cup perfectly yet).