This week the main things I worked on were setting up the new L515 Lidar scanner and running slam_toolbox and Nav2 in simulation. I was able to get the data from the Lidar scanner displayed on my laptop. I spent a lot of time debugging the Lidar scanner software, since the current versions of the Intel RealSense software are not compatible with the L515, so I had to find and install an older version. For simulation, I was able to run a Turtlebot3 simulation in Gazebo on the Raspberry Pi under ROS2 Jazzy. I worked on setting up slam_toolbox and Nav2 in this simulation but have not yet finished, as I am still working through errors caused by the simulated robot's odometry not being published in the reference frame that slam_toolbox expects (see the sketches below).
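As a sanity check that the L515 is actually streaming, something like the following pyrealsense2 snippet can be used. This is a minimal sketch rather than the exact code I ran, and it assumes an older pyrealsense2/librealsense release is installed, since newer releases dropped L500-series support.

```python
# Minimal sketch: read one depth frame from the L515 and print a distance.
# Assumption: an older pyrealsense2 release that still supports the L515.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# 1024x768 @ 30 fps is the L515's native depth mode.
config.enable_stream(rs.stream.depth, 1024, 768, rs.format.z16, 30)

pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    if depth:
        # Distance in meters at the center pixel.
        print(depth.get_distance(512, 384))
finally:
    pipeline.stop()
```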
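The odometry frame errors come down to slam_toolbox's frame parameters needing to match the frames the simulation actually publishes. Below is a minimal launch-file sketch of the kind of configuration involved; the frame names and scan topic are assumptions based on the standard Turtlebot3 setup, and can be verified against the simulation's TF tree.

```python
# Sketch of a slam_toolbox launch that pins the TF frame names to the ones
# the Turtlebot3 simulation publishes. Frame names below are assumptions
# for a standard Turtlebot3 setup; check them with `ros2 run tf2_tools view_frames`.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='slam_toolbox',
            executable='async_slam_toolbox_node',
            name='slam_toolbox',
            output='screen',
            parameters=[{
                'use_sim_time': True,            # required when running under Gazebo
                'odom_frame': 'odom',            # frame the robot's odometry is published in
                'map_frame': 'map',
                'base_frame': 'base_footprint',  # Turtlebot3's root link
                'scan_topic': '/scan',
            }],
        ),
    ])
```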
I am currently behind schedule: by this point I had wanted to have the SLAM and path planning subsystems fully working, leaving only integration, testing, and improvement. To catch up, I plan to continue working on the project over Thanksgiving break during the time I am on campus. By the end of next week I hope to have the SLAM and path planning subsystems fully working, with the Lidar data and odometry processed and fed into both algorithms.
Over the course of the semester, I have learned a lot about setting up a Raspberry Pi, working in the ROS2 development environment, and setting up a Lidar scanner. I did not know much about any of these beforehand, so I had to learn them to do my parts of the project. The main tools and strategies I used to acquire this knowledge were reading the documentation for these components, watching YouTube tutorial videos, and asking AI models for setup guides and debugging help. I also learned by trying different setups and methods and seeing what worked and what caused problems.
