Keshav Sangam’s Status Report for 3/27/2022

This week I worked on SLAM for the robot. I installed the Hector SLAM package on ROS Melodic. There are a few obstacles to testing its efficacy; notably, the map building relies on the assumption that the LIDAR is held at a constant height while maneuvering through the environment. The next step is to build a mount on the Roomba for all the components so that we can actually test Hector SLAM. On top of this, I have looked into IMUs for the potential option of sensor fusion for localization. By feeding the iRobot’s wheel encoders and a 6-DoF IMU into a fusion algorithm such as an Extended Kalman Filter (EKF) or Unscented Kalman Filter (UKF), we could significantly improve the robot’s localization accuracy. However, bumps or valleys on the driving surface may cause localization errors to propagate through the SLAM algorithm due to the constant-height assumption mentioned before. We will have to measure the current localization accuracy once the mount is working in order to decide whether the (E/U)KF + IMU is worth it.
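To make the fusion idea concrete, here is a minimal EKF sketch; in practice we would more likely use an off-the-shelf node such as robot_localization’s ekf_localization_node rather than rolling our own. It predicts a planar pose [x, y, theta] from encoder velocity and turn rate, and corrects the heading with an IMU yaw estimate. Every covariance below is a placeholder, not a tuned value.

    # Minimal 2D pose EKF sketch: predict from wheel-encoder odometry,
    # correct heading with an IMU yaw measurement. Noise values are
    # placeholders; angle wrap-around of the yaw innovation is ignored
    # for brevity.
    import numpy as np

    class PoseEKF:
        def __init__(self):
            self.x = np.zeros(3)                  # state: [x (m), y (m), theta (rad)]
            self.P = np.eye(3) * 0.01             # state covariance
            self.Q = np.diag([0.02, 0.02, 0.01])  # process noise (placeholder)
            self.R = np.array([[0.05]])           # IMU yaw noise (placeholder)

        def predict(self, v, omega, dt):
            """Propagate the pose using encoder velocity v and turn rate omega."""
            x, y, th = self.x
            self.x = np.array([x + v * dt * np.cos(th),
                               y + v * dt * np.sin(th),
                               th + omega * dt])
            # Jacobian of the motion model with respect to the state
            F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                          [0.0, 1.0,  v * dt * np.cos(th)],
                          [0.0, 0.0,  1.0]])
            self.P = F @ self.P @ F.T + self.Q

        def update_yaw(self, yaw_meas):
            """Correct the heading with an absolute yaw estimate from the IMU."""
            H = np.array([[0.0, 0.0, 1.0]])       # we observe theta only
            innov = np.array([yaw_meas - self.x[2]])
            S = H @ self.P @ H.T + self.R
            K = self.P @ H.T @ np.linalg.inv(S)
            self.x = self.x + K @ innov
            self.P = (np.eye(3) - K @ H) @ self.P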

Raymond Xiao’s Status Report for 03/27/22

This week, I helped set up more packages on the NVIDIA Jetson. We are currently using ROS Melodic and are working on installing OpenCV and Python so that we can use the pycreate2 module (https://github.com/MomsFriendlyRobotCompany/pycreate2) on the Jetson. We have been testing Python code on our local machines, and it has been working well, but for the final system the code will need to run on the Jetson, which has a different OS and different dependencies. We ran into some issues installing everything (we specifically need OpenCV 4.2.0) and are currently working on fixing them. OpenCV 4.2.0 is required because it contains the ArUco module; its markers will stand in for humans in our tests. We chose this form of identification since it is fast, rotation invariant, and also provides camera pose estimation. I think I am on track with my work. For next week, I plan to help finish installing everything we need on the Jetson so that we have a stable environment and can start software development in earnest.
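Once OpenCV 4.2.0 is installed, a short script like the following should serve as a sanity check for marker detection and pose estimation. This is a sketch under assumptions: the camera intrinsics, distortion coefficients, dictionary choice, marker side length, and input image are all placeholders until we calibrate the Kinect and print our actual targets.

    # Sketch: detect ArUco markers and estimate their poses with the
    # cv2.aruco module (shipped in opencv-contrib for OpenCV 4.2).
    import cv2
    import numpy as np

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    params = cv2.aruco.DetectorParameters_create()

    # Placeholder intrinsics and distortion; real values will come from
    # calibrating the Kinect.
    camera_matrix = np.array([[600.0, 0.0, 320.0],
                              [0.0, 600.0, 240.0],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.zeros(5)

    frame = cv2.imread("test_frame.png")  # stand-in for a live Kinect frame
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary, parameters=params)

    if ids is not None:
        # 0.15 m side length is an assumed size for our printed markers
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, 0.15, camera_matrix, dist_coeffs)
        for marker_id, tvec in zip(ids.ravel(), tvecs):
            print("marker %d at translation %s (m)" % (marker_id, tvec.ravel()))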

Team Status Report for 3/19

As a team, we feel we have made good progress this past week (since the iRobot Create 2 arrived). We were able to upload simple code to the iRobot and have it move forward and backward, with some rotation as well. We were also able to get SLAM up and running. Here’s a video:

This coming week, we plan on getting the Kinect to interface with the NVIDIA Jetson Xavier. We have decided to use the Kinect for object recognition instead of Bluetooth beacons for path planning. We aim to have this set up and verified by the end of next week.

Jai Madisetty’s Status Report for 3/19

This week, our main goal was to get the three main components (iRobot Create 2, LIDAR sensor, and NVIDIA Jetson Xavier) interfacing with each other. My main focus was working with Keshav to get SLAM running on the Jetson. We got the LIDAR sensor interfacing with the Jetson and then used the pair to create a partial map of one of the lab benches. We still have to do more testing, but it appears to be working properly. Scheduling-wise, I may be a bit behind, as I should be close to done with path planning by now. This week, I plan to work with the Kinect to get path planning working with CV and object recognition.

Raymond Xiao’s Status Report for 03/19/2022

This week, I focused on reading the iRobot Create 2 Open Interface specification, which explains how to program the robot. I also looked further into pycreate2 (https://pypi.org/project/pycreate2/), a package specifically for interfacing with the Create 2 over USB. I wrote some test code using this library on my local machine and managed to get the robot to move in a simple sequence. For example, the Drive command takes four data bytes: the first two specify the velocity (in mm/s) and the last two specify the turning radius (in mm). To drive forward at 100 mm/s, you send 100 as the two velocity bytes and 0x7FFF, the special “drive straight” value, as the radius bytes. In terms of scheduling, I think I am on track. For this upcoming week, I plan to write code that also reads the Roomba’s sensors, since that will be necessary for other parts of the design (such as path planning).
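To make that framing concrete, here is a minimal sketch of the Drive command (opcode 137 in the Create 2 Open Interface) packed as raw bytes. Using pyserial directly rather than pycreate2, along with the port path, is an assumption purely for illustration.

    # Sketch of the Drive command framing described above. Opcode 137 is
    # followed by two big-endian signed 16-bit values: velocity (mm/s)
    # and radius (mm); a radius of 0x7FFF means "drive straight".
    import struct
    import serial  # pyserial

    def drive(ser, velocity_mm_s, radius_mm=0x7FFF):
        # '>Bhh' = one opcode byte, then two big-endian signed shorts
        ser.write(struct.pack('>Bhh', 137, velocity_mm_s, radius_mm))

    # The port path is an assumption; the Create 2 cable shows up as a
    # USB serial device. 115200 baud is the Create 2 default.
    ser = serial.Serial('/dev/ttyUSB0', baudrate=115200)
    ser.write(bytes([128, 131]))  # START (128), then SAFE mode (131)
    drive(ser, 100)               # forward at 100 mm/s, straight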

Jai Madisetty’s Status Report for 2/26

This week, we mainly worked to finalize some design specs and confirm which components we do and do not need. We’ve decided to use CV to detect targets placed throughout our test environment. We scrapped our idea of using sensors to detect Bluetooth beacons, since we are no longer dealing with smoke-filled environments.

In addition, when we tried to interface the iRobot with the Jetson, the iRobot gave us an error signal and did not power on. As a result, we ordered a newer version of the iRobot Create 2, and we are waiting on its arrival before starting on the programming.