Soren’s Status Report for November 1

This week, I mostly worked on learning how to connect the thermal camera we are using to the Raspberry Pi so that we can collect thermal imaging data in conditions similar to those our system will actually operate in. I am hoping that next week I will have successfully connected the camera to the Pi (the interface module needed for this should arrive next week) and collected a good amount of data for testing our vision algorithm. I am currently behind on this part of the project because I did not realize we would need to order another module to connect the camera, but if I can finish this part of the vision component next week, I will be back on track.
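
As a rough sketch of what the data collection could look like once the camera is connected, the snippet below reads frames over I2C. It assumes a camera like the MLX90640 and Adafruit's CircuitPython driver, which are placeholders since our exact camera and interface module may differ.

    # Hypothetical frame-capture sketch: assumes an I2C thermal camera
    # (e.g. MLX90640) and the adafruit_mlx90640 driver; our actual camera
    # and interface module may require a different library.
    import time
    import board
    import busio
    import numpy as np
    import adafruit_mlx90640

    i2c = busio.I2C(board.SCL, board.SDA, frequency=400000)
    mlx = adafruit_mlx90640.MLX90640(i2c)
    mlx.refresh_rate = adafruit_mlx90640.RefreshRate.REFRESH_2_HZ

    frame = [0.0] * 768  # the MLX90640 returns 32x24 = 768 temperatures per frame

    while True:
        try:
            mlx.getFrame(frame)  # fill the buffer with temperatures in deg C
        except ValueError:
            continue             # occasional I2C read glitch; just retry
        image = np.array(frame).reshape((24, 32))
        np.save(f"thermal_{int(time.time())}.npy", image)  # save for vision tests
        time.sleep(1.0)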

Andy’s Status Report for November 1

This week, I adjusted the powering plan and the motor control plan. I am now using only 18650 batteries to power the whole robot. Powering the RPi5 is actually not an easy problem: its standard supply current is 5A, and although the minimum requirement is 3A, any instability at that level can cause the RPi5 to black out. Unfortunately, most power banks only supply 3A. As a result, I decided to order a UPS board to power the RPi5. I also decided to use an L298N as the motor controller for better stability.
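
To sanity-check the L298N plan, a minimal driving test from the Pi could look like the sketch below. The GPIO pin numbers are placeholders chosen for illustration (the real wiring will be decided during assembly), and it assumes the gpiozero library is installed on the RPi5.

    # Hypothetical L298N motor test: IN1/IN2 and IN3/IN4 set direction,
    # ENA/ENB take PWM for speed. Pin numbers below are placeholders,
    # not our final wiring.
    from time import sleep
    from gpiozero import Motor

    left = Motor(forward=17, backward=27, enable=12, pwm=True)   # IN1, IN2, ENA
    right = Motor(forward=23, backward=24, enable=13, pwm=True)  # IN3, IN4, ENB

    left.forward(speed=0.6)   # 60% duty cycle on the enable pin
    right.forward(speed=0.6)
    sleep(2)                  # drive forward for two seconds
    left.stop()
    right.stop()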

Now I have fixed the problems in the design of the motor system and am ready to assemble the robot. I am behind schedule, but I will put in a lot of work on the project tomorrow and should be able to make up a significant part of the lost progress. With a bit more work next week, our group should have a basic working robot model ready for the interim demo.

Team Status Report for November 1

The main risk at this point is that we have not yet tested and integrated our subsystems. Our live demo is a little over a week away, and we want a mostly functional initial version of the full system working by then. To manage this risk, we plan to start testing and integrating our subsystems as soon as possible (over the weekend and early next week) so that we have time to deal with unexpected issues before the demo.

Since last week, there have been several minor design modifications to the hardware and software we will be using. We ordered an additional part for interfacing between the Raspberry Pi and the thermal camera, plus a few more parts for assembling the physical robot. For software, we changed the SLAM and local path planning packages we will be using, since our previously selected software was not compatible with current, maintained versions of ROS. We will now be using SLAM Toolbox instead of Hector SLAM, and Nav2 instead of TEB for local path planning.

Jeremy’s Status Report for November 1

This week I worked on setting up ROS, SLAM, and the local path planning algorithm on the Raspberry Pi. I successfully installed ROS2 Jazzy Jalisco on the Raspberry Pi. For SLAM and local path planning, I ran into the problem that our previously chosen packages, Hector SLAM and TEB, only support ROS1, which has been deprecated and thus cannot be used. So I found new software that works with ROS2 Jazzy Jalisco: for SLAM I will be using SLAM Toolbox, and for local path planning I will be using Nav2. I have begun setting up this new software on the Raspberry Pi.
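
As a rough idea of how these will be brought up together, a top-level ROS2 launch file can simply include the stock launch files from slam_toolbox and nav2_bringup. The sketch below shows only that structure; the actual parameter YAML files still need to be written, so treat the arguments as placeholders.

    # Hypothetical top-level launch file for ROS2 Jazzy: includes SLAM Toolbox's
    # online async launch and Nav2's navigation launch. Parameters are still
    # to be tuned, so only the structure is shown here.
    import os
    from ament_index_python.packages import get_package_share_directory
    from launch import LaunchDescription
    from launch.actions import IncludeLaunchDescription
    from launch.launch_description_sources import PythonLaunchDescriptionSource

    def generate_launch_description():
        slam = IncludeLaunchDescription(
            PythonLaunchDescriptionSource(os.path.join(
                get_package_share_directory('slam_toolbox'),
                'launch', 'online_async_launch.py')),
            launch_arguments={'use_sim_time': 'false'}.items(),
        )
        nav2 = IncludeLaunchDescription(
            PythonLaunchDescriptionSource(os.path.join(
                get_package_share_directory('nav2_bringup'),
                'launch', 'navigation_launch.py')),
            launch_arguments={'use_sim_time': 'false'}.items(),
        )
        return LaunchDescription([slam, nav2])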

I am still a little behind schedule since I had also wanted to connect and test the Lidar scanner with the SLAM algorithm by this point. To get back on schedule, I plan to finish setting up the new SLAM and local path planning software and to run the Lidar scanner test over the weekend, so that I can spend next week on the next steps. By the end of next week, in addition to setting up SLAM and local path planning and testing the Lidar scanner, I also plan to finish integrating these with the robot controls so that we can do a full test of the system before the live demo.