JP Weekly Status Report 4/3/2021

This week I finalized and rigorously tested the cup detection algorithm. With the camera about 3 feet high and angled down about 25 degrees, it can accurately detect the 10 cups in a pyramid formation in under 1 second. It then uses the data from the depth map to project the 2D image into real-world 3D coordinates. I placed the cups in predefined positions so I could test the accuracy of the generated coordinates; they were accurate to within +/- a few millimeters. This was a major milestone for this part of the project. I then cleaned up the repository so it is fit for repetitive testing and validation of cups in various formations.
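The depth-map projection step can be sketched with the standard pinhole back-projection. The intrinsic values (fx, fy, cx, cy) below are illustrative placeholders, not our camera's actual calibration:

```python
# Sketch of depth-map back-projection: a detected cup center in the image,
# plus its depth reading, becomes a 3D point in the camera frame.
# The intrinsics here (fx, fy, cx, cy) are made-up example values.

def pixel_to_3d(u, v, depth_mm, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth (mm) to camera-frame 3D (mm)."""
    x = (u - cx) * depth_mm / fx
    y = (v - cy) * depth_mm / fy
    return (x, y, depth_mm)

# Example: a cup center detected at pixel (400, 300), 900 mm from the camera.
point = pixel_to_3d(400, 300, 900.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(point)  # (120.0, 90.0, 900.0)
```

With known cup positions on the table, comparing these outputs against a tape-measured ground truth is how the +/- few millimeter accuracy figure can be checked.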

Here is a picture of one of my early testing setups:

I also started to put together the first prototype of our housing with the camera on it. Here is a picture of the housing so far:

Creating the calibration map to link the 3D coordinates to specific cups (as seen in last week’s status report) took a little longer than expected. I prioritized getting the cup detection algorithm to our MVP to make testing and demoing smooth and robust. This will also make the calibration much easier to test as I focus on that section of the project this coming week.

JP Status Report 3/27/2021

I spent the first half of the week converting our build system from MSBuild to CMake. I was testing our application on my laptop until the Jetson arrived, so MSBuild from Visual Studio sufficed at the time. CMake allows us to cross-compile for Windows x64 (laptop) and Linux ARM64 (Jetson Nano). This took a lot longer than I expected due to dependency issues, with OpenCV specifically. Now I can test the camera and ellipse detection on the Jetson; the application ran smoothly when I ran our testbench on it. I spent the second half of the week enhancing the coordinate transformations so that I can send that data to the Arduino. Using the camera intrinsics and extrinsics, as well as the depth map data, I can map 2D image coordinates to 3D position coordinates so the launcher knows where to aim relative to the camera’s position. Below is a picture describing image projection from a 2D image to a 3D space.
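The extrinsic part of that transformation can be sketched as a rotation by the camera's downward tilt followed by a translation to the launcher's mounting position. The 25-degree tilt matches the setup described in these reports; the translation offset is a hypothetical placeholder:

```python
import math

# Sketch of applying camera extrinsics: rotate a camera-frame point about the
# camera x-axis by the downward tilt, then translate by the camera-to-launcher
# mounting offset. The offset values are placeholders, not measured numbers.

def camera_to_launcher(p_cam, tilt_deg=25.0, offset=(0.0, 0.0, 0.0)):
    """Transform a camera-frame 3D point into the launcher frame."""
    t = math.radians(tilt_deg)
    x, y, z = p_cam
    # Rotation about the x-axis: y' = y*cos(t) - z*sin(t), z' = y*sin(t) + z*cos(t)
    yr = y * math.cos(t) - z * math.sin(t)
    zr = y * math.sin(t) + z * math.cos(t)
    ox, oy, oz = offset
    return (x + ox, yr + oy, zr + oz)

# A back-projected cup center, re-expressed in the launcher's frame.
aim_point = camera_to_launcher((120.0, 90.0, 900.0), tilt_deg=25.0)
```

In the real pipeline the rotation would come from the calibrated extrinsic matrix rather than a single tilt angle, but the idea is the same.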

 

I also started to derive the calibration technique that helps the UI pinpoint which cups in a given formation have been identified. We decided to use a map with dots along the four edges. These dots connect to form horizontal and vertical lines that intersect, and the intersections pinpoint predefined cup locations for specific formations. This is how our application will identify specific cups to select in the UI. Below is an example of our calibration map.
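The matching step this enables can be sketched as a nearest-intersection assignment: each detected ellipse center is linked to the closest grid intersection. The intersection coordinates below are made-up examples, not real formation data:

```python
# Sketch of intersection matching: assign a detected cup's ellipse center
# to the nearest intersection of the calibration map's horizontal and
# vertical lines. Grid coordinates here are illustrative only.

def nearest_intersection(center, intersections):
    """Return the index of the intersection closest to a cup's ellipse center."""
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(range(len(intersections)), key=lambda i: dist2(center, intersections[i]))

# Example: three predefined cup locations for some formation (pixel coords).
grid = [(100, 100), (200, 100), (150, 180)]
print(nearest_intersection((148, 175), grid))  # 2
```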

This coming week I plan to work almost exclusively on testing our detection algorithm with the Jetson Nano. The glue on the housing should be dry enough to mount the camera so I can do some real-time testing on the table we will be using. Previously I mounted the camera on a tripod, but now I will get a more accurate representation of the exact camera angle we will be taking images from. By the end of the coming week our application should be able to:

  1. take images
  2. detect cups
  3. get 3D cup coordinates
  4. link detected cups to predefined cup targets for selection
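The four steps above could be wired together roughly as follows. Every function body here is a placeholder stand-in for the real components described in these reports, not our actual implementation:

```python
# Rough end-to-end skeleton of the four-step pipeline listed above.
# All bodies are placeholders for the real capture/detection/projection code.

def capture_image():
    return "frame"           # placeholder for a camera capture

def detect_cups(frame):
    return [(400, 300)]      # placeholder: one detected ellipse center (px)

def to_3d(center):
    u, v = center
    return (u, v, 900.0)     # placeholder: depth-map lookup + projection

def link_to_target(point, targets):
    # Nearest predefined cup target wins.
    return min(targets, key=lambda t: sum((a - b) ** 2 for a, b in zip(point, t)))

targets = [(400.0, 300.0, 900.0), (100.0, 100.0, 900.0)]
frame = capture_image()
linked = [link_to_target(to_3d(c), targets) for c in detect_cups(frame)]
print(linked)  # [(400.0, 300.0, 900.0)]
```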

Team Status Report 3/27

This week JP and Logan linked up in person to begin the fabrication process. We laser cut our external housing and modeled the internals. The external housing is cut from 1/4″ plywood with external dimensions of 8x8x21 inches. We decided it would make fabrication and camera mounting easier to construct the housing as a rectangle and build the internals around an internal circle that can freely rotate. Currently it is clamped while we give the glue time to dry properly.

We also discussed an algorithm for displaying cups in the proper place on our UI. We decided we need a simple calibration map/image so we can place the cups in pre-defined locations for each formation. As seen in the image below, our calibration map lines up cups based on how far away a cup’s ellipse center is from the closest intersecting lines.

The codebase is now fit to cross-compile for Windows x64 and Linux ARM64, which will make integration and testing easier since we use both personal laptops and the Jetson Nano to run our application.