My main focus this week was on learning Gazebo and ROS. I got a simulated robot that uses the same iRobot base and a similar depth camera to move around via keyboard commands published over ROS. I also got gmapping working, which lets the robot map its surroundings and localize itself within that map. This is useful because the sensor readings and motion commands in ROS are agnostic to whether they come from the simulator or the physical robot, so porting code that works in the simulator onto the real robot should be a relatively easy process. Another benefit of the simulator is that its sensor models and physics closely approximate the real hardware, which means we can debug our robot code efficiently even without access to the robot.
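As a toy illustration of the mapping idea behind gmapping (which itself is a full particle-filter SLAM system, not this), here is a minimal sketch of folding a single range reading into a sparse occupancy grid. The function name, grid representation, and cell size are my own choices for illustration:

```python
import math

def integrate_ray(grid, x0, y0, angle, dist, cell=0.5):
    """Fold one range reading into a sparse occupancy grid.

    grid  : dict mapping (cx, cy) cell indices to 0 (free) or 1 (occupied);
            cells absent from the dict are treated as unknown.
    x0, y0: sensor position in meters; angle: beam heading in radians;
            dist: measured range in meters; cell: grid resolution in meters.
    """
    step = cell / 2.0                       # sample the beam at half-cell spacing
    for i in range(int(dist / step)):
        d = i * step
        cx = math.floor((x0 + d * math.cos(angle)) / cell)
        cy = math.floor((y0 + d * math.sin(angle)) / cell)
        grid[(cx, cy)] = 0                  # beam passed through -> free
    ex = math.floor((x0 + dist * math.cos(angle)) / cell)
    ey = math.floor((y0 + dist * math.sin(angle)) / cell)
    grid[(ex, ey)] = 1                      # beam ended here -> occupied
    return grid
```

Calling this for every beam of every depth-camera scan, with poses supplied by the localization side, builds up a map; gmapping does the same accumulation but also estimates the pose trajectory jointly with the map.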
I also worked on ordering parts, detailing risk factors, and creating a detailed block diagram for the design presentation.
Next week I hope to get a decent path-planning algorithm working in the simulator and to have the simulated robot detect and move to objects with markers on them.
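As a starting point for the planner, a minimal grid A* sketch (4-connected moves, Manhattan heuristic) could look like the following. This is an illustrative baseline under assumed inputs (an occupancy grid of strings, `'#'` marking obstacles), not the final planner:

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid. grid is a list of equal-length strings
    where '#' is an obstacle; start and goal are (row, col) tuples.
    Returns the cell path from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    gcost = {start: 0}          # best known cost-to-reach per cell
    parent = {start: None}      # back-pointers for path reconstruction
    heap = [(h(start), start)]  # priority queue ordered by f = g + h
    closed = set()
    while heap:
        _, node = heapq.heappop(heap)
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        if node in closed:
            continue            # stale queue entry; already expanded
        closed.add(node)
        r, c = node
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != '#':
                ng = gcost[node] + 1
                if ng < gcost.get(nb, float('inf')):
                    gcost[nb] = ng
                    parent[nb] = node
                    heapq.heappush(heap, (ng + h(nb), nb))
    return None
```

In practice the grid would come from the gmapping map and the waypoints would be turned into velocity commands, but the search itself stays the same.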