Nathalie’s Status Report for 3/9

This week I spent a lot of time working on the design report: performing experiments to determine the best ways to detect planes, and elaborating on the use case requirements and the technical design requirements. This involved writing descriptions for the ethical, social, and environmental considerations, thinking about how our tool interacts with the world around it. Specifically, for the use case requirements, I delved deeper into the main categories of mapping coverage, tracking coverage, and dirt detection cleanliness, and added a section on battery life after discussing that need as a group. We wanted to make sure that the vacuum would be a usable product, and that the Jetson would have sufficient battery life for it to be operational.

Experimenting with ARKit plane detection

We solidified our metrics for each of the technical design requirements through experimentation. By downloading the ARKit plane detection demo onto my iPhone, I was able to see what plane detection looks like in different room environments. I tested the augmented reality plane detection algorithm in several rooms: the ECE capstone room, my living room, a stairwell, and a kitchen. Testing in different environments let me observe the accuracy of the plane detection amidst the different obstacles and objects present in each room. Not only did the app detect the surface area and corners of planes, it also labelled them: Wall, Floor, Table, Ceiling. Most of the random objects in a room are mapped with an area and labelled Unknown, which is acceptable for our purposes.
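To give a sense of what this looks like in code, here is a minimal Swift sketch of the kind of session the demo app runs (the class name and view setup are placeholders, not our actual code): plane detection is enabled in the world-tracking configuration, and each detected ARPlaneAnchor carries its classification label.

```swift
import ARKit
import SceneKit
import UIKit

// Minimal sketch (not our final code): run a world-tracking session with plane
// detection enabled and read each detected plane's classification label.
class PlaneDetectionViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self

        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        // Plane classification (Wall/Floor/Table/Ceiling/...) needs a supported device.
        print("Classification supported:", ARPlaneAnchor.isClassificationSupported)
        sceneView.session.run(config)
    }

    // Called once for each newly detected plane anchor.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        // classification is .wall, .floor, .ceiling, .table, .seat, .window, .door, or .none
        print("Detected plane \(anchor.identifier):", plane.classification)
    }
}
```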

[Screenshots: ARKit plane detection results, with an initial room scan (left) and without one (right)]

Since the ARKit plane detection has memory, I realized how much the initial mapping/scanning of the room matters for the accuracy of the edge detection. In the pictures on the left, I did an initial zoomed-out scan of the room before narrowing in on a specific corner, whereas on the right-hand side I did not map the room first. We can see the difference in accuracy along the edges between the floor and the wall: the left picture is much more accurate. Hence, we have accounted for user error in our quantitative error margins of ± 100%. We also tested this difference on various iPhones, particularly the iPhone 13 Pro and the iPhone 14 Pro Max. The iPhone 14 Pro Max was more accurate because of an updated LiDAR sensor.
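Extending the sketch above, the way we watched a plane's boundary refine over the course of a scan can be approximated by logging updates to each anchor (again, a hedged illustration rather than our production code):

```swift
// Extending the sketch above: ARKit keeps re-estimating each plane's boundary as it
// sees more of the room, which is what makes the initial scan matter so much.
extension PlaneDetectionViewController {
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Plane \(anchor.identifier) updated:",
              plane.geometry.boundaryVertices.count, "boundary vertices")
    }
}
```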

I also did more research into related work, comparing and contrasting how different existing products act as partial solutions to our problem. While there are a lot of potential AR applications, this is a relatively new field, so much of the development has yet to fully materialize, and it mostly involves expensive products like the Apple Vision Pro and the Meta Quest. We are trying to accomplish similar ideas in a more accessible way.

Much of our augmented reality experimentation is on track, and our next steps involve seeing whether we can track objects and map them onto the detected planes. In addition, we need to figure out how the hardware is going to be placed (specifically the camera) so that we can set up the software within a solidified hardware environment. Much of our post-break work is going to involve implementation and making sure that we hit our roadblocks early rather than late.
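For the object-tracking step, one option we may try is raycasting a screen point onto the detected plane geometry and anchoring it there; the sketch below (with a made-up helper name and parameters) shows the idea:

```swift
// Hypothetical sketch for the next step: project a screen point (e.g., where a dirt
// spot appears in the camera image) onto the detected floor plane and anchor it there.
func anchorPoint(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    guard let query = sceneView.raycastQuery(from: screenPoint,
                                             allowing: .existingPlaneGeometry,
                                             alignment: .horizontal),
          let result = sceneView.session.raycast(query).first else { return }
    // The result's world transform pins the point to the mapped plane.
    sceneView.session.add(anchor: ARAnchor(transform: result.worldTransform))
}
```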
