This week I delivered our group's design presentation and continued working on our algorithms for detecting people in infrared imaging data. Overall, I am on track on this portion of the project: next week I plan to test the accuracy of the algorithms we have so far against publicly available IR imaging datasets.
This week I also began thinking about how our system will represent, store, and keep track of the robot's surroundings (some of the more detailed aspects of the path-planning portion of our design). I think we are somewhat behind on this portion of the project: it was pointed out to us in the design review that we have not considered some of the important details of path planning. To help move things along, I will likely take up part of this portion of the project in addition to the IR data processing. Next week I will work on a detailed design of how our system will take in information about its environment from the Lidar scanner, store that information, and use it to navigate (including the exact policy by which it will navigate), as well as what additional functionality we may want to include beyond simply exploring a building's floors and rooms.
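One possible starting point for the environment representation is a 2D occupancy grid updated from individual Lidar rays. The sketch below is only an illustration of the idea, not our actual design; the grid size, cell resolution, and the `integrate_scan` helper are all hypothetical placeholders:

```python
import math

# Hypothetical parameters: a 20x20 grid where each cell covers 0.25 m.
GRID_SIZE = 20
CELL_M = 0.25

FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def make_grid():
    """Start with every cell unknown until a Lidar ray observes it."""
    return [[UNKNOWN] * GRID_SIZE for _ in range(GRID_SIZE)]

def integrate_scan(grid, robot_xy, angle_rad, range_m):
    """Mark cells along one Lidar ray as free and its endpoint as occupied.

    robot_xy is the robot's position in meters; angle_rad is the ray
    direction; range_m is the measured distance to the obstacle.
    """
    steps = int(range_m / CELL_M)
    for i in range(steps + 1):
        x = robot_xy[0] + math.cos(angle_rad) * i * CELL_M
        y = robot_xy[1] + math.sin(angle_rad) * i * CELL_M
        cx, cy = int(x / CELL_M), int(y / CELL_M)
        if not (0 <= cx < GRID_SIZE and 0 <= cy < GRID_SIZE):
            return  # ray left the mapped area
        # Cells before the endpoint were traversed, so they are free;
        # the endpoint itself is where the Lidar return came from.
        grid[cy][cx] = OCCUPIED if i == steps else FREE
```

A navigation policy (e.g. frontier-based exploration) could then query this grid for the nearest boundary between free and unknown cells, which is one common way to drive floor/room exploration.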