Nathalie’s Status Report for 4/19

This week’s tasks

I spent this week integrating our AR components: the image detection and tracking plus the offset for the back Jetson camera, which we demonstrated during our weekly meeting. This was a proof of concept that we could map the fixed offset between the front camera position and the back of the Jetson using the airplane image. The image can be swapped out later, but it serves as a reference point for now. I’ve also sketched out the acrylic pieces that need to be laser cut for the mounts.
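To make the offset idea concrete, here is a minimal sketch of the detection-plus-offset delegate logic. The offset values and class name are placeholders, not our measured calibration numbers:

```swift
import ARKit
import SceneKit

// Sketch: when the reference image (e.g. the airplane image) is detected,
// place a marker node at a fixed offset from it so we track the back of the
// Jetson rather than the image itself.
final class ImageOffsetTracker: NSObject, ARSCNViewDelegate {
    // Hypothetical fixed offset (in meters) from the detected image to the
    // point at the back of the Jetson; the real values come from measurement.
    let backOffset = SCNVector3(0.0, -0.05, 0.12)

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        print("Detected reference image: \(imageAnchor.referenceImage.name ?? "unnamed")")

        // The anchor node tracks the image; a child at `backOffset` tracks the
        // offset point relative to that image.
        let marker = SCNNode(geometry: SCNSphere(radius: 0.01))
        marker.position = backOffset
        node.addChildNode(marker)
    }
}
```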

On top of that, I’ve been working on finding the optimal way to draw a path. Our initial proof of concept used the SCNLine module, which essentially draws a line on the plane. After further research, I found another module we could use to indicate the covered area: SCNPath. The declarations are relatively similar, but the two differ in their UI and in how they manage the drawn geometry in memory. Specifically, SCNLine provides more granular control over individual segments and draws its points as a single line, whereas SCNPath draws separate segments, which could make coloring easier (the parallel task I’m also working on). In the images below, you can see the difference between the radius and width parameters and the visual result each gives on screen. I prefer the SCNPath visuals, but its backend and line management are different.

Figure 1: SCNLine with a radius of 0.05
Figure 2: SCNLine with a radius of 0.01
Figure 3: SCNPath with a width of 0.02
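To make the comparison concrete, here is a rough sketch of how the two declarations look. Both come from open-source SceneKit packages, so the exact initializer signatures may differ slightly from the versions in our project:

```swift
import SceneKit
import SCNLine  // third-party package; module name depends on how it is added
import SCNPath  // third-party package; module name depends on how it is added

// The same set of sample points, drawn two ways.
let points = [SCNVector3(0.0, 0.0, 0.0),
              SCNVector3(0.0, 0.0, -0.5),
              SCNVector3(0.3, 0.0, -0.8)]

// SCNLine: one tube-like geometry drawn through every point, sized by `radius`.
let lineNode = SCNLineNode(with: points, radius: 0.01)

// SCNPath: a flat ribbon laid along the same points, sized by `width`.
let pathNode = SCNPathNode(path: points, width: 0.02)
```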

I’ve also worked on coloring the path to indicate its level of dirtiness. To accomplish this, I experimented with various attributes of SCNPath and SCNLine and found that they lack some of the flexibility needed to retroactively modify attributes like color based on real-time data. By changing the materials of these node objects, we are able to change the color of the path. These colors matter because they are the user’s indication of the varying levels of dirtiness. I can easily change the color of the whole path, but doing so in a patched way (recoloring individual line segments, or drawing separate lines in different colors) has been more challenging than I expected.
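The color change itself is just a material update on the node’s geometry. A minimal sketch, assuming a hypothetical dirtiness value normalized to [0, 1] (our real mapping from the sensor data isn’t finalized):

```swift
import SceneKit
import UIKit

// Map a hypothetical dirtiness reading in [0, 1] to a display color.
func color(forDirtiness level: Float) -> UIColor {
    switch level {
    case ..<0.33: return .green   // mostly clean
    case ..<0.66: return .yellow  // somewhat dirty
    default:      return .red     // very dirty
    }
}

// Recolor an existing path/line node by updating its geometry's material.
func recolor(pathNode: SCNNode, dirtiness: Float) {
    pathNode.geometry?.firstMaterial?.diffuse.contents = color(forDirtiness: dirtiness)
}
```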

Green color changes in SCNPath

As shown below, I’ve been able to edit the code so that it colors different segments and captures the segmented floor. Unfortunately, there is some gapping between the segments when turning, which is visible in the images below, but the overall effect still serves our use case. I’m going to look into how I can mitigate those gaps going forward; on the positive side, the coloring does not seem to affect latency.
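The patched approach roughly looks like the sketch below: start a new path node whenever the color changes. The SCNPathNode initializer comes from the SCNPath package, and the logic here is a simplified stand-in for what is in our code; because each segment is built independently, the gaps tend to show up where segments meet on turns.

```swift
import SceneKit
import UIKit

final class SegmentedPathDrawer {
    private(set) var segments: [SCNNode] = []
    private var currentPoints: [SCNVector3] = []
    private var currentColor: UIColor = .green
    let root = SCNNode()  // attach this to the scene

    // Append a new sample; if the color changes, close out the current segment
    // and start a new one that begins at the previous segment's last point.
    func addPoint(_ point: SCNVector3, color: UIColor) {
        if color != currentColor, currentPoints.count >= 2 {
            flushSegment()
            currentPoints = [currentPoints.last ?? point]
        }
        currentColor = color
        currentPoints.append(point)
    }

    private func flushSegment() {
        // SCNPathNode is from the SCNPath package; depending on the version,
        // the material may need to be passed at init instead of set afterward.
        let node = SCNPathNode(path: currentPoints, width: 0.02)
        node.geometry?.firstMaterial?.diffuse.contents = currentColor
        root.addChildNode(node)
        segments.append(node)
    }
}
```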

Coloring with SCNLine
Coloring with SCNLine (2)
Coloring with SCNPath

In addition to path coloring, I worked closely with Erin and Harshul on hardware integration, designing our final mount for the vacuum and actually putting the pieces together. We worked a lot in TechSpark this week and needed each of our subsystem pieces to make progress, since we are in the refinement/mounting/integration stage of our process. I also fixed some bugs in the plane-freezing logic and refined the UI to reflect the work we are doing now rather than what was part of our initial prototype.

Next Week’s Tasks, Schedule & Challenges

Harshul and I have been working with Erin to make sure the Bluetooth component is making progress. Integration has been really difficult for us because of last week’s setback with the broken Jetson, but now that it’s resolved we are troubleshooting and making progress. I need to polish up the path coloring logic and connect it to the actual messages being received over Bluetooth. This is hard, especially because it depends on the Bluetooth integration actually performing, and the connection has not been as reliable as we had hoped. We got something working this week that needs to be fully fleshed out and refined, so our next steps are defining specific test cases to test the accuracy of our mapped mount system.
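The shape I have in mind for that connection is a CoreBluetooth delegate callback that parses the dirtiness reading and hands it to the AR layer for recoloring. A minimal sketch, with a hypothetical characteristic UUID and a single-byte message format, since our actual protocol with the Jetson isn’t pinned down yet:

```swift
import Foundation
import CoreBluetooth

// Hypothetical names: the UUID, class, and callback are placeholders for
// whatever protocol we settle on with the Jetson side.
final class DirtinessReceiver: NSObject, CBPeripheralDelegate {
    let dirtinessCharacteristicUUID = CBUUID(string: "FFE1")  // placeholder UUID
    var onDirtiness: ((Float) -> Void)?

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        // Assume a single byte per message: dirtiness 0-255.
        guard error == nil,
              characteristic.uuid == dirtinessCharacteristicUUID,
              let byte = characteristic.value?.first else { return }
        let dirtiness = Float(byte) / 255.0
        // Push onto the main thread so the AR layer can recolor the path node
        // (e.g. via recolor(pathNode:dirtiness:) from the earlier sketch).
        DispatchQueue.main.async { self.onDirtiness?(dirtiness) }
    }
}
```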

In addition to the technical challenges we face with Bluetooth integration and hardware mounting, there are several other factors to consider as we move forward. We have planned the testing and validation of the integrated system, but we still need to actually perform that testing to make sure the system performs as we initially set out in our use case and technical requirements. This includes testing the reliability of the Bluetooth connection under various dirt conditions and scenarios to ensure completeness, and making sure it can perform at a reasonable speed. To get this done, we need to collaborate and communicate as a team to troubleshoot any obstacles that arise, because we each had a hand in developing part of a subsystem, so everyone’s expertise is required.

Given the time constraints leading up to our final demo, it’s essential to prioritize tasks and allocate resources efficiently. This may involve quickly making tough decisions about which features are essential for the core functionality of our system. This is evident in our hardware mount decisions, which are designed specifically for the demo rather than as an “industry standard” prototype. We have backup Bluetooth communication protocols and hardware configurations that may offer better performance or reliability, even if they are not optimal in terms of design and technical specifics.
