Kevin’s Status Report for Nov 3. 2024
This week, I worked with Nick to get ROS 2 and the LiDAR working. We managed to get ROS 2 Humble running on the Jetson. We had previously planned to use ROS Melodic, but we ran into issues getting it to run; it turns out Melodic is a ROS 1 distribution meant for Ubuntu 18.04 rather than Ubuntu 22.04, which is the Linux distribution on our Jetson Orin Nano. We then tried using a GitHub repository to install ROS 2 Humble, but it overcomplicated things by using Docker containers that we did not think we needed. In the end, we got ROS 2 working by following the installation guide on the official ROS 2 documentation site.

After that, getting the RP LiDAR to work also cost us a lot of time. While trying to get the RP LiDAR to “launch,” we kept running into issues. Working through them slowly, we found there was a permission issue on the USB port, and then later an issue with the baud rate we were using. After this hard battle, we managed to get ROS and the LiDAR working together and visualized the scan in RViz.
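For reference, the fixes looked roughly like the following. This is a sketch, not our exact commands: the device path `/dev/ttyUSB0`, the `rplidar_ros` launch-file name, and the baud rate are assumptions that depend on the specific RPLiDAR model and driver version (check the driver's README).

```shell
# Hypothetical sketch -- device path and baud rate depend on the setup.
# Quick fix for the USB permission issue:
sudo chmod 666 /dev/ttyUSB0
# More permanent fix: add the user to the group that owns the serial device:
sudo usermod -aG dialout $USER

# Launch the RPLiDAR node with an explicit baud rate
# (e.g. an A1 typically expects 115200; S-series models use higher rates):
ros2 launch rplidar_ros rplidar.launch.py serial_port:=/dev/ttyUSB0 serial_baudrate:=115200

# Then visualize the /scan topic:
rviz2
```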
Here is an image I took of Nick looking at the visualization in RViz!
One potential problem we may run into is the minimum range of the LiDAR. We found that if an object was too close to the LiDAR, it could not be detected. However, this might be okay, because the radius that cannot be detected seems to match the radius of our robot body. We will therefore need to mount the LiDAR in the middle of the robot rather than at one edge.
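One way to guard against this blind radius in software is to drop returns below the sensor's minimum range when processing a scan, so they are never mistaken for obstacles. A minimal sketch (the 0.15 m minimum range is an assumed value; the real figure comes from the LiDAR's datasheet):

```python
import math

MIN_RANGE_M = 0.15  # assumed minimum usable range; check the LiDAR datasheet


def filter_scan(ranges, min_range=MIN_RANGE_M):
    """Replace returns closer than the minimum range with NaN so that
    downstream mapping code ignores them instead of treating them as walls."""
    return [r if r >= min_range else math.nan for r in ranges]


scan = [0.05, 0.3, 1.2, 0.1, 2.5]
cleaned = filter_scan(scan)
# the 0.05 m and 0.1 m readings fall inside the blind radius and are dropped
```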
In terms of schedule, I think we have caught right back up and everything is looking good! I believe Nick is almost done making a map from the point cloud data, and my path planning algorithm is also pretty much done. The next steps are mainly optimization and integration.
Kevin’s Status Report for Oct 26. 2024
This week, I personally managed to get the Jetson to work by upgrading the firmware and flashing the SD card with the right version of JetPack. Getting it to connect to the Wi-Fi was a big issue, but we managed to solve it. After that, I worked on simulating a path planning algorithm, as Professor Bain had suggested. I have attached a GIF I generated for a room with a table, showing how a robot might go about covering the room. I used a DFS + backtracking coverage algorithm, as Professor Bain had suggested, but I personally think it is too slow and can be further optimized.
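The DFS + backtracking coverage idea can be sketched as follows. This is an illustrative version, not our exact simulation code; the room layout and the 4-directional move set are assumptions for the example.

```python
def coverage_dfs(grid, start):
    """DFS coverage: visit every traversable (1) cell; backtracking happens
    when the recursion unwinds, so the robot physically retraces its steps
    back through already-covered cells."""
    rows, cols = len(grid), len(grid[0])
    visited = set()
    path = []  # order in which cells are traversed (backtracked cells repeat)

    def dfs(r, c):
        visited.add((r, c))
        path.append((r, c))
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 1 and (nr, nc) not in visited):
                dfs(nr, nc)
                path.append((r, c))  # physically backtrack to this cell

    dfs(*start)
    return path


room = [
    [1, 1, 1],
    [1, 0, 1],  # the 0 is an obstacle, e.g. a table leg
    [1, 1, 1],
]
path = coverage_dfs(room, (0, 0))
```

The repeated entries in `path` are exactly the backtracking overhead that makes plain DFS coverage slow, which is the inefficiency mentioned above.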
I also decided how we should represent the map: as a grid of 1s and 0s, where a 1 is a traversable grid square and a 0 is a non-traversable one. Below is an example:
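As a minimal sketch of this representation (the particular room layout here is made up for illustration):

```python
# 1 = traversable floor, 0 = non-traversable (e.g. a table base or wall)
room = [
    [1, 1, 1, 1],
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
]


def traversable_cells(grid):
    """Count the cells the robot must eventually cover."""
    return sum(cell for row in grid for cell in row)
```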
A potential issue that this simulation uncovered is that our robot cannot be too small. If the grid becomes much larger than 20×20, an inefficient algorithm could take the robot a significant amount of time to traverse the entire room. We had originally planned a smaller form factor, but I now think we should pick a slightly larger one.
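A rough back-of-the-envelope estimate makes the trade-off concrete. All of the numbers below (cell size, robot speed, free-space fraction, backtracking overhead) are assumed placeholder values, not measurements:

```python
def coverage_time_s(rows, cols, free_fraction, cell_m, speed_mps, revisit_factor):
    """Rough traversal-time estimate: number of free cells times the distance
    per cell, inflated by a revisit factor to account for DFS backtracking."""
    free_cells = rows * cols * free_fraction
    distance_m = free_cells * cell_m * revisit_factor
    return distance_m / speed_mps


# 20x20 grid, 90% free space, 0.3 m cells, 0.2 m/s, ~2x revisits from DFS
t = coverage_time_s(20, 20, 0.9, 0.3, 0.2, 2.0)  # about 18 minutes
```

Halving the robot (and thus the cell) size quadruples the number of cells for the same room, so traversal time grows quickly as the form factor shrinks.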
My progress is currently on schedule. I think I just need to implement intermediate pathfinding between squares, so the robot takes the most efficient route to unvisited squares, while keeping most of the coverage algorithm the same.
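One standard way to do that intermediate step is a BFS from the robot's current cell to the nearest unvisited traversable cell, which gives a shortest path on the grid. A sketch under the same 1/0 grid representation (the example layout and visited set are made up for illustration):

```python
from collections import deque


def path_to_nearest_unvisited(grid, start, visited):
    """BFS from the current cell to the closest traversable cell not yet
    visited; returns the cell-by-cell shortest path including both ends."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) not in visited and (r, c) != start:
            # reconstruct the path by walking parents back to start
            path = [(r, c)]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1]
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 1 and (nr, nc) not in parent):
                parent[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # every reachable cell is already covered


room = [
    [1, 1, 1],
    [0, 0, 1],
    [1, 1, 1],
]
visited = {(0, 0), (0, 1), (0, 2), (1, 2)}
path = path_to_nearest_unvisited(room, (0, 0), visited)
```

Because BFS explores cells in order of distance, the first unvisited cell it dequeues is guaranteed to be the closest one, so the coverage loop can simply follow this path and resume covering from its endpoint.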
I hope to develop a more efficient path coverage algorithm by next week and succeed in making a room map out of the point cloud from our LiDAR.
Kevin’s Status Report for Oct 20. 2024
This week, we got the parts needed for the Jetson (SD card + network adapter). We finally got the Jetson to turn on: it turned out the monitor at my house was seemingly not a supported resolution, so we tried it out on the ECE lab cluster monitors instead. We did, however, run into an issue where the UEFI version was not what was expected: we were expecting 36.xx but it was 3.xx. As a result, I have reflashed the SD card with an older version of JetPack and am currently re-installing ROS on our Jetson.
In terms of schedule, if I am able to successfully get a path searching algorithm done, we should be on track. Hopefully we can also get the LiDAR completely up and running.
Kevin’s Status Report for Oct 6. 2024
This week, we got the LiDAR, and we are planning to meet up tomorrow to test it. I spent a lot of time looking into existing implementations using the LiDAR from the catalog and found a couple of YouTube videos we could reference while building our product.
In terms of schedule, we are slowly catching back up so it’s all good.
Kevin’s Status Report for Sep 29. 2024
This week, I was sick and had to make a business trip to India (I am currently on the flight back), so I was unable to make much progress.
In terms of schedule, we are falling behind our Gantt chart, but I believe we can easily make up the lost progress if we really focus next week. We’ve made a purchase order for a 2D LiDAR, which means I’ll be able to start playing with it next week and hopefully have a working piece of software up by next weekend.
Kevin’s Status Report for Sep 21. 2024
This week, I mainly looked more in-depth into potential libraries compatible with the LiDAR we are considering (the UniTree L1 LiDAR).
Here is the link to the repo that I have found: https://github.com/unitreerobotics/point_lio_unilidar?tab=readme-ov-file
I’ve played around with the sample point cloud data in the repository and discussed with Matthew the idea of using IMUs on our robot, since that seems like something that could massively enhance the efficacy of the LiDAR.
In terms of progress, I believe we are on schedule. I need to get a working sample by tomorrow so we can submit a purchase request for the LiDAR before Tuesday, when the purchase can be made. The main deliverable I hope to complete in the next week is to receive the LiDAR and get a rudimentary map of a small room.