Shanel’s status report 4/5
For the demo, we plan to exhibit a full cycle of data transfer (not necessarily in real time). It will start with the robot moving based on instructions we give it. It will take in odometry and lidar scan data, which will feed into BreezySLAM. BreezySLAM will give us back a final bytearray of points representing the map. We then feed this map as a PGM file into our navigation algorithm, which generates a path and a set of instructions back to the Roomba.
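The map-to-PGM handoff between BreezySLAM and the navigation code can be sketched in a few lines. This is a minimal sketch, assuming the bytearray BreezySLAM returns is a row-major square grayscale grid (which matches its documented `getmap()` output); the function name `map_to_pgm` and the tiny 8x8 dummy map are my own illustration, not our actual code:

```python
def map_to_pgm(map_bytes, size_pixels, path):
    """Write a square grayscale occupancy map (e.g. the bytearray filled
    by BreezySLAM's getmap()) out as a binary PGM (P5) file."""
    header = f"P5\n{size_pixels} {size_pixels}\n255\n".encode("ascii")
    with open(path, "wb") as f:
        f.write(header)        # PGM header: magic, dimensions, max gray value
        f.write(bytes(map_bytes))  # one byte per pixel, row-major

# Illustrative usage with a dummy mid-gray ("unknown") map;
# a real SLAM map would be something like 500x500.
MAP_SIZE_PIXELS = 8
dummy = bytearray([127] * (MAP_SIZE_PIXELS ** 2))
map_to_pgm(dummy, MAP_SIZE_PIXELS, "map.pgm")
```

Since PGM is just a three-line ASCII header plus raw bytes, the navigation side can read it back with any image library (or by hand) without depending on BreezySLAM.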
This week, I worked with the team to program the first half of that process. Aditi was controlling the hardware, while Alex and I debugged remotely. This covered collecting Roomba odometry and lidar scan data, and passing both into BreezySLAM to generate a map. This was our first time testing it on continuous live data, and the SLAM algorithm needed tweaking. We eventually settled on manually passing in odometry data and adjusting the minimum number of samples required before a map is generated.
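For reference, "manually passing in odometry" here means supplying BreezySLAM's `update()` with a `(dxy_mm, dtheta_degrees, dt_seconds)` pose-change tuple instead of letting it rely on scan matching alone. A minimal sketch of converting per-wheel travel into that tuple, assuming a differential-drive model; the helper name `pose_change` and the `AXLE_MM` constant are mine (the Roomba's wheel base is roughly 235 mm, but it should be measured, not trusted):

```python
import math

AXLE_MM = 235.0  # assumed wheel-base; measure on the actual robot

def pose_change(left_mm, right_mm, dt_seconds):
    """Convert per-wheel travel distances from odometry into the
    (dxy_mm, dtheta_degrees, dt_seconds) tuple that BreezySLAM's
    update() accepts as its pose_change argument."""
    dxy_mm = (left_mm + right_mm) / 2.0                         # forward travel of center
    dtheta_deg = math.degrees((right_mm - left_mm) / AXLE_MM)   # heading change
    return dxy_mm, dtheta_deg, dt_seconds

# Straight-line move: both wheels travel 100 mm in 0.1 s
print(pose_change(100.0, 100.0, 0.1))  # → (100.0, 0.0, 0.1)
```

Each lidar scan would then be fed in as `slam.update(scan_mm, pose_change(...))`, which is what lets the map stay consistent even when consecutive scans look alike.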
Though I am conscious of the time left, I think we will be able to finish the project in time. However, I am not sure we will be able to meet the quality bar set by the original project requirements. I think we work best during the Zoom sessions, when we can give each other immediate feedback. There have been numerous times when we planned to implement something or use a specific package, only to find it incompatible.