JP Weekly Status Report 4/12/2021

This week I made a lot of progress on the calibration settings for our robot. I have decided to separate the calibration flow from the actual gameplay flow in order to make testing and setup more flexible. The calibration flow can detect a rectangular playing mat and identify the main cup locations by detecting black points within the mat. It then saves the calibration data (the pixel coordinates of each detected point) to a file, which the gameplay program loads at runtime. Here is an example of running the calibration flow on a playing mat image:
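For reference, here is a minimal sketch of what this detect-and-save step could look like in OpenCV. This is not our actual implementation: the thresholds, the file name, and the file format are all assumptions made for illustration.

```cpp
#include <opencv2/opencv.hpp>
#include <fstream>
#include <vector>

int main() {
    // Load a calibration photo of the playing mat (file name is a placeholder).
    cv::Mat img = cv::imread("playing_mat.jpg");
    if (img.empty()) return 1;

    cv::Mat gray;
    cv::cvtColor(img, gray, cv::COLOR_BGR2GRAY);

    // Find the playing mat: threshold for bright regions and keep the
    // largest contour's bounding rectangle.
    cv::Mat bright;
    cv::threshold(gray, bright, 200, 255, cv::THRESH_BINARY);
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(bright, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    double bestArea = 0;
    cv::Rect matRect;
    for (const auto& c : contours) {
        double area = cv::contourArea(c);
        if (area > bestArea) { bestArea = area; matRect = cv::boundingRect(c); }
    }
    if (matRect.area() == 0) return 1;

    // Find the black points inside the mat by thresholding for dark regions.
    cv::Mat dark;
    cv::threshold(gray(matRect), dark, 50, 255, cv::THRESH_BINARY_INV);
    std::vector<std::vector<cv::Point>> dots;
    cv::findContours(dark, dots, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    // Save the pixel coordinates of each point (in the full-image frame) to a file.
    std::ofstream out("calibration.txt");
    int index = 0;
    for (const auto& d : dots) {
        cv::Moments m = cv::moments(d);
        if (m.m00 <= 0) continue;
        int cx = matRect.x + static_cast<int>(m.m10 / m.m00);
        int cy = matRect.y + static_cast<int>(m.m01 / m.m00);
        out << index++ << " " << cx << " " << cy << "\n";
    }
    return 0;
}
```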

It will then save the 2D pixel coordinates for each point, indexed from 0 to 9 for a 10-cup formation. The gameplay flow will then compare the detected cup pixel coordinates against these calibrated coordinates so that the UI can display the cup formation state to the user.
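Here is a rough sketch of how that comparison could work, assuming the whitespace-delimited calibration file from the sketch above and simple nearest-neighbor matching in pixel space. Again, this is an illustration rather than our actual code.

```cpp
#include <opencv2/core.hpp>
#include <fstream>
#include <string>
#include <vector>
#include <limits>

// One calibrated cup location: its index (0-9) and its pixel coordinates.
struct CalPoint {
    int index;
    cv::Point2f px;
};

// Load the calibration file written by the calibration flow.
std::vector<CalPoint> loadCalibration(const std::string& path) {
    std::vector<CalPoint> points;
    std::ifstream in(path);
    CalPoint p;
    while (in >> p.index >> p.px.x >> p.px.y) points.push_back(p);
    return points;
}

// Map a detected cup center to the index of the nearest calibrated cup.
int matchCup(const cv::Point2f& detected, const std::vector<CalPoint>& cal) {
    int best = -1;
    float bestDist2 = std::numeric_limits<float>::max();
    for (const auto& c : cal) {
        float dx = detected.x - c.px.x;
        float dy = detected.y - c.px.y;
        float dist2 = dx * dx + dy * dy;  // squared distance is enough for comparison
        if (dist2 < bestDist2) { bestDist2 = dist2; best = c.index; }
    }
    return best;  // -1 if the calibration file was empty
}
```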

I had been using a test image I created to do the initial testing of the calibration tool. This week I am getting a full-sized printout of the playing mat to test against.

By the end of this week, the calibration tool should be working end to end with a 10-cup formation. We should be able to calibrate, detect cups, select a cup, physically aim at a cup, and shoot a pong ball (though not necessarily hit the cup perfectly yet).

Juan Pablo Status Report 4/10/2021

This week I fully converted the UI to be solely in C++ and operational through Linux on a tablet. I managed to retain all the core features and added extra details, such as not allowing reracks until exactly the number of cups needed for the rerack remains on the board.
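As a minimal sketch of that rerack rule, assuming the standard rack sizes of 6, 3, and 1 (the actual sizes the UI checks may differ):

```cpp
#include <set>

// Only allow a rerack when the count of cups left on the board exactly
// matches a valid rack size. The sizes below are an assumption for
// illustration, not necessarily what the UI enforces.
bool rerackAllowed(int cupsRemaining) {
    static const std::set<int> validRackSizes = {6, 3, 1};
    return validRackSizes.count(cupsRemaining) > 0;
}
```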

This week I’m going to integrate some game state steps in the form of a progress bar that visually keeps track of the stage of the game, e.g. game start, running ellipse detection, waiting for player shot, shooting, game over, etc. I’m also looking to further improve the overall aesthetic quality of the UI.
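One way these stages could be represented is sketched below; the stage names mirror the list above, and the percentage mapping for the progress bar is an assumption for illustration only.

```cpp
// Game stages the planned progress bar would track.
enum class GameStage {
    GameStart,
    RunningEllipseDetection,
    WaitingForPlayerShot,
    Shooting,
    GameOver
};

// Map the current stage to a simple progress percentage for the UI.
int progressPercent(GameStage stage) {
    return static_cast<int>(stage) * 100 / static_cast<int>(GameStage::GameOver);
}
```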

Team Status Report 4/3/2021

This week, we made a lot of progress on our independent work streams. The housing prototype has been assembled and is ready to be used for testing. Currently, JP is testing the cup detection algorithm with the camera positioned as we would like it for our demo. The housing has support on the rear panel for the touch screen UI that Juan is implementing, and space to house the internal launching mechanism that Logan is building. We believe it is sufficient for our demo, but there will likely be small changes as we continue to test.

We are nearing a point where we can test the integration of our independent workstreams. For our demo, we plan to integrate mostly the cup detection and the launcher, since those are the most difficult parts. By the end of this week we should have the following ready to demo:

  • Detecting cup rings
  • Filtering out erroneous ellipses (see the sketch after this list)
  • Generating 3D point locations of each detected cup
  • Mapping each detected cup to the calibration map (linking cup position to cup number 1, 2, 3, etc.)
  • Rotating launcher to aim at specific cup
  • Triggering ball launching at specific cup
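For the ellipse-filtering item above, here is a minimal sketch of the kind of sanity checks we have in mind, assuming the candidates come from something like cv::fitEllipse; the size and aspect-ratio thresholds are placeholders, not our tuned values.

```cpp
#include <opencv2/core.hpp>
#include <algorithm>
#include <vector>

// Keep only ellipse candidates whose size and aspect ratio could plausibly
// correspond to a cup rim; everything else is treated as an erroneous detection.
std::vector<cv::RotatedRect> filterEllipses(const std::vector<cv::RotatedRect>& candidates) {
    std::vector<cv::RotatedRect> kept;
    for (const auto& e : candidates) {
        float major = std::max(e.size.width, e.size.height);
        float minor = std::min(e.size.width, e.size.height);
        if (major < 20.0f || major > 200.0f) continue;  // implausibly small or large (pixels)
        if (minor / major < 0.3f) continue;             // too eccentric to be a cup rim
        kept.push_back(e);
    }
    return kept;
}
```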