Erin’s Status Report for 3/16

The focus of my work this week was calibrating the height and angle at which we will mount the Jetson camera. Over the past two weeks, I spent a sizable amount of my time creating a procedure that any member of my group could easily follow to determine the best configuration for the Jetson camera. This week, I performed the tests and produced a preliminary result: I believe that mounting the camera four inches above floor level at a forty-five-degree angle will produce the highest quality results for our dirt detection component. Note that these results are subject to change, as our group may conduct further testing at a finer granularity. In addition, I still have to sync with the rest of my group members in person to discuss these findings further.

Aside from running the actual experiments to determine the optimal height and angle for the Jetson camera, I also incorporated our active illumination into the dirt detection module. We had previously received our LED component, but we had been running the dirt detection algorithms without it. Incorporating this element into the dirt detection system gives us a more holistic understanding of how well our computer vision algorithm performs with respect to our use case. As the test input images show, my setup was not perfect: the "background," or "flooring," is not an untextured, patternless white surface. I was unable to perfectly mimic our use case scenario, as we have not yet purchased the boards that we intend to use to demonstrate the dirt detection component of our product. Instead, I used paper napkins to simulate the white flooring required by our use case constraints. While imperfect, this configuration suffices for our testing.

Prior to running the Jetson camera mount experiment, I had been operating under the assumption that the outcome of this experiment would depend heavily on the outputs that the computer vision script generated. However, I realized that for certain input images, running the computer vision script was wholly unnecessary; the input image itself did not meet our standards, so that camera configuration should not have been considered, regardless of how well the computer vision script performed. For example, at a height of two inches and an angle of zero degrees, the camera was barely able to capture anything of substance, as shown in Figure 1 below. There is far too little workable data within the frame; it does not capture enough of the flooring over which the vacuum has just passed. As such, this input image alone rules out this height and angle as a candidate for our Jetson camera mount.

Figure 1: Camera Height (2in), Angle (0°)
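This screening of candidate frames was done by visual inspection, but to keep the procedure easy for any group member to reproduce, a simple automated check along the following lines could flag frames that show too little usable floor. This is only a hypothetical sketch: the helper name, brightness threshold, and minimum floor fraction are illustrative assumptions, not values from our actual experiment.

import cv2
import numpy as np

def enough_floor_in_frame(frame_bgr, brightness_thresh=170, min_fraction=0.5):
    """Return True if a large enough share of the frame looks like light flooring.

    brightness_thresh and min_fraction are placeholder values chosen for
    illustration; they would need to be tuned against real captures.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Fraction of pixels bright enough to plausibly be the white floor.
    floor_fraction = float(np.mean(gray >= brightness_thresh))
    return floor_fraction >= min_fraction

# Example: reject a candidate camera configuration before running dirt detection.
frame = cv2.imread("mount_test_2in_0deg.jpg")  # hypothetical capture filename
if frame is not None and not enough_floor_in_frame(frame):
    print("Configuration rejected: not enough floor visible in frame")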

I also spent a considerable amount of time refactoring and rewriting the computer vision script that I was using for dirt detection. I have produced a second algorithm that relies more heavily on OpenCV's built-in functions rather than preprocessing the inputs myself. While the output of this new algorithm on my test image (the chosen image from the Jetson camera mount experiment) does appear to be noisier than we would like, I do not consider this a substantial issue, because the input image itself was noisy: our use case calls for patternless, white flooring, but the napkins in the image were highly textured. In this scenario, the fact that the algorithm detected the napkin patterning is actually beneficial to our testing, a factor I had failed to consider the last couple of times I tried to re-tune the computer vision script.
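For context, the sketch below shows roughly what such an OpenCV-centric pass can look like: threshold the frame to isolate dark specks against the light flooring, then extract and size-filter contours. The OpenCV functions are real, but the specific pipeline, parameter values, and area bounds are assumptions for illustration rather than the exact script I am running.

import cv2
import numpy as np

def detect_dirt(frame_bgr, min_area=20, max_area=5000):
    """Find dark specks against light flooring using OpenCV built-ins.

    min_area and max_area (in pixels) are illustrative bounds separating
    dirt-sized contours from single-pixel noise and from large texture such
    as napkin folds; they are not tuned values from our tests.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Light blur so fine texture does not dominate the threshold.
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    # Adaptive threshold copes with uneven active (LED) illumination;
    # THRESH_BINARY_INV marks dark regions (candidate dirt) as foreground.
    mask = cv2.adaptiveThreshold(blurred, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                 cv2.THRESH_BINARY_INV, 31, 10)
    # Morphological opening removes isolated speckle before contour extraction.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    dirt = [c for c in contours if min_area <= cv2.contourArea(c) <= max_area]
    # Draw detections for visual inspection of the result.
    annotated = frame_bgr.copy()
    cv2.drawContours(annotated, dirt, -1, (0, 0, 255), 2)
    return dirt, annotated

On the napkin images, a pass like this will also pick up the napkin texture, which, as noted above, is acceptable and even useful for our current round of testing.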

I am slightly behind schedule with regard to the plan described in our Gantt chart. However, this issue can (and will) be mitigated by syncing with the rest of my group. In order to perform the ARKit Dirt Integration step of our scheduled plan, Nathalie and Harshul will need to have the AR component working with real-time updates and localization.

Within the next week, I hope to help Nathalie and Harshul with any areas of concern in the AR component. In addition, I plan to start designing the camera mount, and place an order for the Jetson camera extension cord, as we have decided that the Jetson will not be mounted very close to the camera.
