Final Report
Final report, last updated May 5, 2020. Team_B2
Here is a link to our final presentation, given to Section 2 earlier this week: https://docs.google.com/presentation/d/1yigIsbeaICcAaOPwGWH7EM3ge098cm2YGi6YGCCXt0w/edit#slide=id.g83f711c7c0_0_246
This week was focused on 1) integration of the program in real time and 2) optimizations to the navigation and path planning. We connected all the components of the project into a single multithreaded program joining the Roomba, the lidar sensor, SLAM, and the navigation system. This entailed writing control flow to make sure the program status was being updated as expected and that data was being passed smoothly between each thread. One by one, we were able to add each thread…
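As a rough illustration of that hand-off pattern, the minimal sketch below passes lidar scans from a sensor thread to a SLAM thread through a thread-safe queue and tracks program status with a shared event. The injected callables (`read_scan`, `update_map`) are placeholders, not our actual module code.

```python
import threading
import queue

scan_queue = queue.Queue()      # lidar scans handed from the sensor thread to SLAM
done = threading.Event()        # set when the program's end condition is reached

def lidar_thread(read_scan):
    """Producer: pushes each new scan onto the queue for the SLAM thread."""
    while not done.is_set():
        scan_queue.put(read_scan())

def slam_thread(update_map):
    """Consumer: blocks until a scan arrives, then runs the SLAM update on it."""
    while not done.is_set():
        try:
            scan = scan_queue.get(timeout=1.0)
        except queue.Empty:
            continue
        update_map(scan)
```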
This week, we focused on integrating our separate components together. The majority of the work we're doing at this point is integration through pair programming. The part that I programmed this week was the control flow of the three threads (obstacle detection, SLAM/path planning, path execution) running at the same time. For the end condition, we decided that the program would be complete when the robot is enclosed by walls on all sides and has mapped…
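One way to frame that end condition on an occupancy grid is to check that no free cell still borders unexplored space, i.e. the explored region is sealed off by walls. The sketch below is only an illustration of that idea, with an assumed cell encoding (0 = free, 1 = wall, -1 = unexplored), not the exact check in our program.

```python
def map_is_enclosed(grid):
    """Return True if no free cell is adjacent to an unexplored cell."""
    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1:
                    return False          # a frontier remains, keep exploring
    return True
```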
This week, we spent a lot of time working together and integrating the system. All our work was done together over Zoom calls. We hit several milestones:
– We integrated the two main parts of the project we had been working on thus far. I pulled in the navigation/path planning code that Shanel and Alex had, and we merged it with the Roomba movement/SLAM code I had on the Raspberry Pi.
– We met the milestone of getting the SLAM…
This week, I worked almost entirely in meetings with Shanel and Aditi. We integrated all the subsystems into one central program. This program runs each module in a separate thread:
– Main thread for control flow and navigation
– SLAM thread for reading LIDAR data and doing SLAM stitching
– Movement thread for giving commands to the robot
– Obstacle thread to interrupt movement when an obstacle is bumped
The SLAM thread is constantly running and updating the map, but we don't do additional navigation processing…
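The skeleton below sketches that four-thread layout with Python's threading module. The loop bodies here are stand-ins (they just sleep) for the real module code, so treat the structure, not the function contents, as representative.

```python
import threading
import time

stop = threading.Event()          # tells every thread to shut down
obstacle_hit = threading.Event()  # set by the obstacle thread to interrupt movement

def slam_loop(stop):              # placeholder: would read LIDAR and stitch the map
    while not stop.is_set():
        time.sleep(0.1)

def movement_loop(stop, obstacle_hit):   # placeholder: would send drive commands,
    while not stop.is_set():             # pausing whenever obstacle_hit is set
        if obstacle_hit.is_set():
            obstacle_hit.clear()         # real code re-plans around the obstacle
        time.sleep(0.1)

def obstacle_loop(stop, obstacle_hit):   # placeholder: would watch the bump sensor
    while not stop.is_set():
        time.sleep(0.1)

threads = [
    threading.Thread(target=slam_loop, args=(stop,), daemon=True),
    threading.Thread(target=movement_loop, args=(stop, obstacle_hit), daemon=True),
    threading.Thread(target=obstacle_loop, args=(stop, obstacle_hit), daemon=True),
]
for t in threads:
    t.start()
# The main thread's control flow and navigation would run here until the map is
# complete, then signal shutdown.
stop.set()
```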
I dedicated my efforts this past week to the software side of our project. First, I implemented A* search to generate a path for the robot to a previously selected destination. I used the Manhattan heuristic so we could have a simple representation of the robot's bearing as one of: North, South, East, and West. The path is represented as a list of points for the robot to navigate to next. As an MVP, we made the…
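For reference, a self-contained sketch of grid A* with a Manhattan-distance heuristic, returning the path as a list of (row, col) points, is shown below. The grid encoding (0 = free, 1 = obstacle) and four-connected moves are assumptions for illustration, not the exact project code.

```python
import heapq

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def astar(grid, start, goal):
    """grid: 2D list where 0 = free cell, 1 = obstacle; start/goal: (row, col)."""
    open_set = [(manhattan(start, goal), 0, start)]   # (f, g, cell)
    came_from = {}
    best_g = {start: 0}
    while open_set:
        _, g, cell = heapq.heappop(open_set)
        if cell == goal:
            path = [cell]
            while cell in came_from:                  # walk back to the start
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]                         # list of points, start to goal
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # N/S/E/W moves only
            nxt = (cell[0] + dr, cell[1] + dc)
            if not (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])):
                continue
            if grid[nxt[0]][nxt[1]] == 1:
                continue
            ng = g + 1
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                came_from[nxt] = cell
                heapq.heappush(open_set, (ng + manhattan(nxt, goal), ng, nxt))
    return None                                       # no path found
```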
In the first half of this week, I wrote a script that took commands in real time to move the robot, simultaneously supplied the corresponding odometry measurements to SLAM, and at the end of a 30-second time frame saved the generated map. The position of the Roomba is displayed as a series of points. Here, the contours of the room in the area that the robot managed to explore, including the bottom-left corner and the walls…
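A loop of roughly that shape is sketched below. The `robot`, `lidar`, and `slam` objects and their method names are hypothetical wrappers around the serial interface, the lidar driver, and the SLAM library, included only to show how the commands, odometry, and scans flow through each iteration.

```python
import time

def drive_and_map(robot, lidar, slam, get_command, duration=30.0):
    """Drive the Roomba from live user commands while feeding SLAM for `duration`
    seconds, then return the final map and the trajectory of estimated positions."""
    start = time.time()
    trajectory = []                          # Roomba positions, plotted as points later
    while time.time() - start < duration:
        cmd = get_command()                  # real-time movement command from the user
        robot.drive(cmd)                     # send the drive command over serial
        odometry = robot.read_odometry()     # wheel-encoder pose change since last step
        scan = lidar.read_scan()             # latest lidar scan
        slam.update(scan, odometry)          # odometry + scan go into the SLAM update
        trajectory.append(slam.get_position())
    return slam.get_map(), trajectory        # the map gets saved at the end of the run
```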
One major risk uncovered recently is the unreliability of the hardware connection to the Roomba. We had previously experienced occasional issues with the serial connection port or with the Roomba not responding to its provided interface commands, but those issues were resolved with a power cycle. On Friday, the issue persisted through multiple power cycles, software changes, connection configurations, and everything else we could think of. We eventually decided to try again later. When implementing the navigation module, we…