Juan Pablo Weekly Status Report 3/27/21

I coded up a basic UI model that supports cup selection and manual marking/unmarking of cups, and that prevents the user from choosing cups that have already been made.
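As a rough illustration of that model (the names here are hypothetical, not our actual code), the core of it is just per-cup state plus a selection rule that rejects made cups:

```cpp
#include <array>

// Hypothetical sketch of the UI cup model: per-cup state plus a selection
// rule that refuses cups that have already been made.
struct Cup {
    bool made = false;      // cup has been made (sunk) and is off the table
    bool selected = false;  // cup is the current target
};

struct CupRack {
    std::array<Cup, 10> cups;  // standard 10-cup rack

    // Returns false (and changes nothing) if the cup is already made.
    bool select(int idx) {
        if (cups[idx].made) return false;
        for (auto& c : cups) c.selected = false;  // single selection
        cups[idx].selected = true;
        return true;
    }

    // Manual marking/unmarking from the UI; marking clears any selection.
    void setMade(int idx, bool made) {
        cups[idx].made = made;
        if (made) cups[idx].selected = false;
    }
};
```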

I’ve also made some progress on the game flow system that will keep track of the game state. Building these pieces is relatively simple; what I think will be more complicated is making sure the different systems can communicate and share data effectively, which we will start working on now.
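The communication question is still open, but one possible shape (this is my sketch, not the final design) is a central game state that the other subsystems subscribe to, so shared data flows through one place:

```cpp
#include <functional>
#include <vector>

// Sketch only: a central GameState that other subsystems (UI, detection,
// launcher control) subscribe to for updates.
enum class Phase { Calibration, CupSelection, Launching, GameOver };

class GameState {
public:
    using Listener = std::function<void(Phase)>;

    void subscribe(Listener l) { listeners_.push_back(std::move(l)); }

    void setPhase(Phase p) {
        phase_ = p;
        for (auto& l : listeners_) l(p);  // push the change to every subscriber
    }

    Phase phase() const { return phase_; }

private:
    Phase phase_ = Phase::Calibration;
    std::vector<Listener> listeners_;
};
```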

This week I’m going to continue working on the game state system and will look into the calibration map/image idea and how we can best integrate that into our UI.

JP Status Report 3/27/2021

I spent the first half of the week converting our build system from MSBuild to CMake. I had been testing our application on my laptop until the Jetson arrived, so MSBuild from Visual Studio sufficed at the time. CMake allows us to cross-compile for Windows x64 (laptop) and Linux ARM64 (Jetson Nano). This took a lot longer than I expected due to dependency issues, with OpenCV specifically, but now I can test the camera and ellipse detection on the Jetson. The application ran smoothly when I ran our testbench on it.

The second half of the week I spent enhancing the coordinate transformations so that I can send that data to the Arduino. Using the camera intrinsics and extrinsics, as well as the depth map data, I can map 2D image coordinates to 3D position coordinates so the launcher knows where to aim relative to the camera’s position. Below is a picture describing image projection from a 2D image to 3D space.

[Image: projecting a point from 2D image coordinates into 3D space]
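The math here is the standard pinhole back-projection; a minimal sketch (function and parameter names are mine, not our actual code), assuming a depth value z at pixel (u, v) and intrinsics fx, fy, cx, cy:

```cpp
struct Point3f { float x, y, z; };

// Standard pinhole back-projection: given pixel (u, v), the depth z at that
// pixel from the depth map, and the intrinsics (focal lengths fx, fy and
// principal point cx, cy), recover the 3D point in the camera frame.
Point3f backProject(float u, float v, float z,
                    float fx, float fy, float cx, float cy) {
    return {
        (u - cx) * z / fx,  // undo the horizontal projection
        (v - cy) * z / fy,  // undo the vertical projection
        z
    };
}
```

The extrinsics then rotate and translate that camera-frame point into whatever frame the launcher aims in.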

I also started to derive the calibration technique to help the UI pinpoint which cups in a given formation have been identified. We decided to use a map with dots along the four edges. These dots will connect to form horizontal and vertical lines that intersect. These intersections pinpoint predefined cup locations for specific formations. This is how our application will identify specific cups to select in the UI. Below is an example of our calibration map.
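As a sketch of how those edge dots could become candidate cup locations (an illustration under my own naming, not our final code): pair opposite-edge dots into lines and collect every crossing.

```cpp
#include <vector>

struct Pt { float x, y; };
struct Seg { Pt a, b; };  // a line through a pair of opposite-edge dots

// Intersection of the two infinite lines through segments h and v
// (no parallel check; the calibration lines are guaranteed to cross).
Pt intersect(const Seg& h, const Seg& v) {
    float d = (h.a.x - h.b.x) * (v.a.y - v.b.y)
            - (h.a.y - h.b.y) * (v.a.x - v.b.x);
    float t = ((h.a.x - v.a.x) * (v.a.y - v.b.y)
             - (h.a.y - v.a.y) * (v.a.x - v.b.x)) / d;
    return { h.a.x + t * (h.b.x - h.a.x), h.a.y + t * (h.b.y - h.a.y) };
}

// Every horizontal/vertical crossing is a candidate predefined cup location.
std::vector<Pt> gridFromEdgeDots(const std::vector<Seg>& horizontals,
                                 const std::vector<Seg>& verticals) {
    std::vector<Pt> grid;
    for (const auto& h : horizontals)
        for (const auto& v : verticals)
            grid.push_back(intersect(h, v));
    return grid;
}
```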

This coming week I plan to work almost exclusively on testing our detection algorithm with the Jetson Nano. The housing should be dry enough to mount the camera on, so I can do some real-time testing on the table we will be using. Previously I had been using a tripod to mount the camera, but now I will get a more accurate representation of the exact camera angle we will be taking images from. By the end of the coming week our application should be able to do the following (a rough sketch of this pipeline comes after the list):

  1. take images
  2. detect cups
  3. get 3D cup coordinates
  4. link detected cups to predefined cup targets for selection
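Here is a stubbed-out sketch of how those four steps compose; every name is a placeholder, not our actual code:

```cpp
#include <vector>

// Placeholder glue for the four steps above.
struct Image {};
struct Ellipse { float cx, cy; };             // detected cup rim center
struct Target { int cupId; float x, y, z; };  // predefined cup + 3D position

Image takeImage() { return {}; }                                     // step 1
std::vector<Ellipse> detectCups(const Image&) { return {}; }         // step 2
Target toTarget(const Ellipse& e) { return {0, e.cx, e.cy, 0.0f}; }  // steps 3-4

int main() {
    Image frame = takeImage();
    for (const Ellipse& e : detectCups(frame)) {
        Target t = toTarget(e);  // 3D coordinates linked to a selectable cup
        (void)t;                 // would be handed to the UI / launcher here
    }
}
```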

Team Status Report 3/27

This week JP and Logan linked up in person to begin the fabrication process. We laser-cut our external housing and modeled the internals. The external housing is cut from 1/4″ plywood with 8x8x21 inch external dimensions. We decided that fabrication and camera mounting would be easier if we constructed the housing as a rectangular box and built the internals around an inner circle that can rotate freely. Currently the housing is clamped while we give the glue time to dry properly.

We also discussed an algorithm for displaying cups in the proper place on our UI. We decided we need a simple calibration map/image so the camera can place detected cups into predefined cup locations for each formation. As seen in the image below, our calibration map lines cups up based on how far away a cup’s ellipse center is from the closest intersecting lines.
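A minimal sketch of that matching rule (hypothetical names, assuming the calibration-map intersections have already been computed in image coordinates): snap each detected ellipse center to its nearest intersection.

```cpp
#include <cstddef>
#include <limits>
#include <vector>

struct Pt { float x, y; };

// Snap a detected cup's ellipse center to the closest calibration-map
// intersection, i.e. the nearest predefined cup slot for this formation.
std::size_t nearestIntersection(const Pt& center,
                                const std::vector<Pt>& intersections) {
    std::size_t best = 0;
    float bestD2 = std::numeric_limits<float>::max();
    for (std::size_t i = 0; i < intersections.size(); ++i) {
        float dx = center.x - intersections[i].x;
        float dy = center.y - intersections[i].y;
        float d2 = dx * dx + dy * dy;  // squared distance ranks the same
        if (d2 < bestD2) { bestD2 = d2; best = i; }
    }
    return best;
}
```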

The codebase can now cross-compile for Windows x64 and Linux ARM64, which will make integration and testing easier since we use both personal laptops and the Jetson Nanos to run our application.