Final Report
Final report, last updated May 5, 2020. Team_B2
Here is a link to our final presentation, given to Section 2 earlier this week: https://docs.google.com/presentation/d/1yigIsbeaICcAaOPwGWH7EM3ge098cm2YGi6YGCCXt0w/edit#slide=id.g83f711c7c0_0_246
This week was focused on (1) integration of the program in real time and (2) optimizations to the navigation and path planning. We connected all the components of the project into a single multithreaded program joining the Roomba, the lidar sensor, SLAM, and the navigation system. This entailed writing control flow to make sure the program status was being updated as expected and data was being passed smoothly between each thread. One by one, we were able to add each thread…
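For illustration, here is a minimal sketch of the kind of thread wiring this entailed, with the sensor, SLAM/planning, and drive threads passing data through queues. The function names (read_lidar_scan, update_slam, plan_path, drive_along) are hypothetical stand-ins for our components, not the actual code.

```python
import queue
import threading

scan_queue = queue.Queue(maxsize=1)   # newest lidar scan for the SLAM thread
path_queue = queue.Queue(maxsize=1)   # newest planned path for the executor
shutdown = threading.Event()          # set once mapping is complete

def publish_latest(q, item):
    # Keep only the most recent item so a slow consumer never works on stale data.
    try:
        q.get_nowait()
    except queue.Empty:
        pass
    q.put(item)

def lidar_thread():
    while not shutdown.is_set():
        publish_latest(scan_queue, read_lidar_scan())   # hypothetical sensor read

def slam_thread():
    while not shutdown.is_set():
        grid = update_slam(scan_queue.get())            # hypothetical SLAM update
        publish_latest(path_queue, plan_path(grid))     # hypothetical planner call

def execution_thread():
    while not shutdown.is_set():
        drive_along(path_queue.get())                   # hypothetical Roomba commands

for target in (lidar_thread, slam_thread, execution_thread):
    threading.Thread(target=target, daemon=True).start()
```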
This week, we focused on integrating our separate components. The majority of the work we’re doing at this point is integration through pair programming. The part I programmed this week was the control flow that lets the three threads (obstacle detection, SLAM/path planning, path execution) run at the same time. For the end condition, we decided that the program would be complete when the robot is enclosed by walls on all sides and has mapped…
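One way such an "enclosed by walls" end condition can be tested on an occupancy grid (a sketch, not necessarily our exact check): flood-fill the free space from the robot's cell and declare mapping complete only if the fill never reaches the map edge or an unexplored cell.

```python
# Cell states in a hypothetical occupancy grid.
FREE, WALL, UNKNOWN = 0, 1, 2

def mapping_complete(grid, start):
    """grid: 2D list of FREE/WALL/UNKNOWN cells; start: (row, col) of the robot."""
    rows, cols = len(grid), len(grid[0])
    seen, stack = {start}, [start]
    while stack:
        r, c = stack.pop()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if not (0 <= nr < rows and 0 <= nc < cols):
                return False              # reached the map edge: not enclosed
            if grid[nr][nc] == UNKNOWN:
                return False              # an unexplored frontier remains
            if grid[nr][nc] == FREE and (nr, nc) not in seen:
                seen.add((nr, nc))
                stack.append((nr, nc))
    return True                           # free space fully bounded by walls
```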
This week, we spent a lot of time working together and integrating the system. All our work was done together over Zoom calls. We hit several milestones:
– We integrated the two main parts of the project we had been working on thus far. I pulled in the navigation/path planning code that Shanel and Alex had, and we merged it with the Roomba movement/SLAM code I had on the Raspberry Pi.
– We met the milestone of getting the SLAM…
I dedicated my efforts this past week to the software side of our project. First, I implemented A* search to generate a path for the robot to a previously selected destination. I used the Manhattan heuristic so we could have a simple representation of the robot's bearing as one of North, South, East, and West. The path is represented as a list of points for the robot to navigate to next. As an MVP, we made the…
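A sketch of that planner on a 4-connected grid (matching the North/South/East/West movement model); the boolean `blocked` grid and unit step cost here are illustrative assumptions:

```python
import heapq

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def astar(blocked, start, goal):
    """Return a list of (row, col) points from start to goal, or None."""
    rows, cols = len(blocked), len(blocked[0])
    frontier = [(manhattan(start, goal), 0, start)]   # (f, g, cell)
    came_from, g = {start: None}, {start: 0}
    while frontier:
        _, cost, cur = heapq.heappop(frontier)
        if cur == goal:
            path = []
            while cur is not None:        # walk parent links back to start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and not blocked[nr][nc]:
                new_cost = cost + 1       # every move costs one grid cell
                if nxt not in g or new_cost < g[nxt]:
                    g[nxt] = new_cost
                    came_from[nxt] = cur
                    heapq.heappush(
                        frontier, (new_cost + manhattan(nxt, goal), new_cost, nxt))
    return None                           # goal unreachable
```

Because moves are restricted to the four compass directions, the Manhattan distance never overestimates the remaining cost, so the returned path is optimal.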
For the demo, we plan to exhibit a full circle of data transfer (not necessarily in real time). It will start with the robot moving based on instructions we give it. It will take in odometry and lidar scan data that feed into BreezySLAM. BreezySLAM will give us back the map it generates as a bytearray. We then feed this map as a PGM file into our navigation algorithm, which generates a path and set of instructions back…
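A condensed sketch of that loop using BreezySLAM's API (RMHC_SLAM, getpos, getmap); the Laser parameters and the scan_stream() generator below are placeholders, not our real calibration or driver:

```python
from breezyslam.algorithms import RMHC_SLAM
from breezyslam.sensors import Laser

MAP_SIZE_PIXELS, MAP_SIZE_METERS = 500, 10

# Illustrative lidar parameters, not our actual sensor calibration.
laser = Laser(scan_size=360, scan_rate_hz=5.5,
              detection_angle_degrees=360, distance_no_detection_mm=12000)
slam = RMHC_SLAM(laser, MAP_SIZE_PIXELS, MAP_SIZE_METERS)
mapbytes = bytearray(MAP_SIZE_PIXELS * MAP_SIZE_PIXELS)

# scan_stream() is a hypothetical generator of (scan distances in mm, odometry
# pose change) pairs coming from the lidar and Roomba threads.
for scan_mm, pose_change in scan_stream():
    slam.update(scan_mm, pose_change)
    x_mm, y_mm, theta_deg = slam.getpos()
    slam.getmap(mapbytes)                 # map comes back as a bytearray

# Dump the bytearray as a binary PGM for the navigation algorithm to read.
with open('map.pgm', 'wb') as f:
    f.write(b'P5\n%d %d\n255\n' % (MAP_SIZE_PIXELS, MAP_SIZE_PIXELS))
    f.write(mapbytes)
```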
This week I migrated our web application from plain JavaScript to React. I initially built it using Cloudinary to store pictures and videos transmitted from the Pi, but ultimately decided to use our Amazon credits for EC2 and S3 instead. Now, our web application runs on an EC2 instance. The Pi will transmit the map to the EC2 instance, where the web app will look for the images and videos to display. I also wrote our updated…
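On the Pi side, the transfer could be as simple as a boto3 upload that the React app then reads; the bucket and key names below are hypothetical:

```python
import boto3

# Credentials come from the Pi's local AWS configuration.
s3 = boto3.client('s3')

# Hypothetical bucket/key; the web app polls this fixed key for the latest map.
s3.upload_file('map.pgm', 'our-roomba-maps', 'latest/map.pgm',
               ExtraArgs={'ContentType': 'image/x-portable-graymap'})
```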
This week we were able to start connecting some of the individual parts of our project. We’ve started linking the frontend and backend of our web application and exporting the Roomba’s odometry data to our processing and navigation algorithm. We ran into quite a bit of trouble installing Cartographer ROS on the Pi because there are many unresolved dependencies that are poorly documented. We were aware of Cartographer’s package problems before spring break, and spent an additional…
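For the odometry export, the underlying arithmetic is standard differential-drive dead reckoning: convert the wheel-encoder deltas to millimeters, then derive forward travel and heading change. A sketch, using the Create 2 Open Interface spec constants (treat them as illustrative); the output tuple matches the pose-change format BreezySLAM's update() takes in the demo plan above:

```python
import math

# Create 2 Open Interface spec values, used here illustratively.
TICKS_PER_REV = 508.8       # encoder counts per wheel revolution
WHEEL_DIAMETER_MM = 72.0
AXLE_LENGTH_MM = 235.0      # distance between the two drive wheels
MM_PER_TICK = math.pi * WHEEL_DIAMETER_MM / TICKS_PER_REV

def pose_change(d_left_ticks, d_right_ticks, dt_seconds):
    """Turn encoder deltas into (dxy_mm, dtheta_degrees, dt_seconds)."""
    left_mm = d_left_ticks * MM_PER_TICK
    right_mm = d_right_ticks * MM_PER_TICK
    dxy_mm = (left_mm + right_mm) / 2.0                     # forward travel
    dtheta_deg = math.degrees((right_mm - left_mm) / AXLE_LENGTH_MM)
    return dxy_mm, dtheta_deg, dt_seconds
```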