All of us have been to the lab every day this week and spent at least 20 hours debugging GPS-based localization, with no success. We collected sensor data at various locations on campus, including every sidewalk of the CFA lawn, the roof of the East Campus Garage, and multiple sidewalks around the flagpole. The coordinates we see from both the GPS fix and the filtered UTM location are reasonable, but the robot has difficulty orienting itself, and we have yet to isolate the cause. As a mitigation strategy, we are currently testing localization without GPS, using only the IMU and wheel encoders.
Changes to System Design
We might not use GPS in our design, depending on further localization testing
Our ongoing battle with the campus WiFi continues, so we have decided NOT to have the robot connected to the internet during delivery. When the robot loses WiFi connectivity, the ROS core stops working even though the robot makes no external communication, because the core is identified by the robot’s CMU-DEVICE IP address. This means that we cannot use our emergency operator features in most demos.
Schedule Changes
We are working to get our MVP running consistently: it completed deliveries in several past runs, but it does not currently work reliably.
Increased the GPS fix covariance in an attempt to smooth our Kalman filter output; we think the IMU should be relied on more heavily than the GPS since it has less jitter.
Ran many trials with the team at various locations on campus, collecting sensor data.
Example of some localization data collected. The red line is our filtered localization, and the blue line is the raw GPS fix.
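The covariance inflation mentioned above boils down to scaling the reported GPS position variances before the fix enters the filter, so the filter weights the odometry more. A minimal sketch of the idea; the function name and the scale factor are illustrative, not our tuned values:

```python
import numpy as np

def inflate_gps_covariance(cov, scale=10.0):
    """Scale a GPS position covariance so the Kalman filter trusts
    the (less jittery) IMU/encoder odometry more.

    cov: 3x3 position covariance from the GPS fix (m^2).
    scale: inflation factor (illustrative, not our tuned value).
    """
    cov = np.asarray(cov, dtype=float)
    inflated = cov.copy()
    # Inflate only the diagonal variances; keep any correlations as-is.
    np.fill_diagonal(inflated, np.diag(cov) * scale)
    return inflated

raw = np.diag([0.5, 0.5, 2.0])        # typical-looking GPS fix covariance
print(inflate_gps_covariance(raw))    # diagonal becomes [5.0, 5.0, 20.0]
```

A larger reported variance directly shrinks the Kalman gain applied to GPS updates, which is why this smooths the filtered track without touching the IMU pipeline.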
We have spent a long time debugging GPS-based localization. One risk is that these issues never get resolved. We will be field-testing the localization within the next day. If it doesn’t work with the current setup, we will switch to a single Kalman filter that takes in all inputs and localizes within the UTM frame.
Another risk is that the perception system may be pushed out due to the delay in localization. Most of the code has been written except for the data association part, which shouldn’t take too long.
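The remaining data-association step can be as simple as gated nearest-neighbor matching between existing pedestrian tracks and new detections. A sketch under that assumption; the function name and gate distance are hypothetical, not our final code:

```python
import numpy as np

def associate(tracks, detections, gate=1.5):
    """Greedy nearest-neighbor association of pedestrian detections
    to existing tracks (hypothetical helper, not our final code).

    tracks, detections: (N,2) and (M,2) arrays of x,y positions (m).
    gate: maximum association distance in metres (illustrative value).
    Returns a list of (track_idx, detection_idx) pairs.
    """
    tracks = np.asarray(tracks, dtype=float)
    detections = np.asarray(detections, dtype=float)
    # Pairwise Euclidean distances between every track and detection.
    d = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
    pairs, used_t, used_d = [], set(), set()
    # Take the globally closest pair first, then the next closest, etc.
    for idx in np.argsort(d, axis=None):
        t, m = np.unravel_index(idx, d.shape)
        if t in used_t or m in used_d or d[t, m] > gate:
            continue
        pairs.append((t, m))
        used_t.add(t)
        used_d.add(m)
    return pairs
```

Greedy matching is not globally optimal (the Hungarian algorithm would be), but for a handful of well-separated pedestrians it behaves the same and is simpler to debug.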
Changes to System Design
After evaluating ORBSLAM2, we decided to use the GPS+IMU+wheel encoder localization method. The reasoning is that the non-visual method is getting good enough results and we do not think that it is worth the time needed to integrate another odometry source.
Schedule Changes
We have a lot of work to squeeze into this last week.
Data association on pedestrians
Verifying localization
Testing the local planner with the new localization
All the required metrics testing
Progress Pictures/Videos
Some of the issues we are facing with a two-level state estimation system. Sometimes the local/global odometry gets misaligned (green/red – global/local odometry, blue – GPS odometry).
We fixed this by verifying the data inputs and correcting the magnetic declination and yaw offset parameters.
This is a better run; as you can see, the raw GPS data aligns well with the integrated odometry. We hope to get another validation run like this and perhaps log waypoints.
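The declination and yaw offset fix amounts to adding calibrated angle offsets to the IMU heading and re-wrapping the result so the local and global odometry frames agree. A minimal sketch, where the parameter values are placeholders and the sign convention depends on the IMU driver:

```python
import math

def corrected_yaw(imu_yaw, declination_rad, yaw_offset_rad):
    """Apply the magnetic declination and mounting yaw offset to a raw
    IMU heading (all angles in radians).  Placeholder sign convention;
    the actual signs depend on the IMU driver and location.
    """
    yaw = imu_yaw + declination_rad + yaw_offset_rad
    # Wrap to (-pi, pi] so downstream odometry frames stay aligned.
    return math.atan2(math.sin(yaw), math.cos(yaw))
```

An unwrapped or mis-signed heading produces exactly the local/global misalignment shown above, since the two odometry levels then disagree on which way the robot is facing.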
Here’s a video of the constant-velocity Kalman filter tracker that we will use on pedestrians: MATLAB KF Tracker
Here’s a video of the same algorithm tracking me in 3D (red is filtered, blue is raw detections): Tracking in 3D Real World
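The constant-velocity tracker in the videos can be sketched in a few lines of Python rather than MATLAB. This is a standard linear Kalman filter with a position-only measurement; the noise values and timestep here are illustrative, not our tuned parameters:

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal 2D constant-velocity Kalman filter (illustrative
    noise values, not our tuned parameters)."""

    def __init__(self, x0, y0, dt=0.1, q=0.1, r=0.5):
        self.x = np.array([x0, y0, 0.0, 0.0])   # state: [x, y, vx, vy]
        self.P = np.eye(4)                      # state covariance
        self.F = np.eye(4)                      # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4))               # we only measure position
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = q * np.eye(4)                  # process noise
        self.R = r * np.eye(2)                  # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        # Standard KF update with a position measurement z = [x, y].
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

Feeding noisy detections through predict/update yields the smoothed red track in the video; the velocity states also give us a short-horizon prediction of where a pedestrian is headed.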
Successfully ran ORBSLAM2 on the Xavier. While the performance was better than RTABMAP, we decided that it was not worth the time to integrate it with the rest of our localization sensors since we are getting good enough localization results already.
Wrote the robot heartbeat node
Progress Timeline
Our schedule is tight, but we’re confident that we will finish this week.
Next Week Deliverables
Finish integration of the high-level state machines with the robot local planner
Assist with integration of collision avoidance
Benchmarks of the final system
Create the final presentation slides with my team and practice presenting
RTABMAP has poor performance. It is jittery and loses tracking very easily. Our mitigation strategy is to switch to ORBSLAM, and if that does not work then we will do localization without vision (using IMU, GPS, and wheel encoders, yet to be tested outdoors).
Changes to System Design
We may be transitioning to GPS+IMU+wheel encoder localization method since visual methods lose tracking easily. Sebastian will evaluate ORBSLAM to determine if it is a good replacement.
Schedule Changes
We were slightly delayed by last week’s mistake with our motor controller. We discovered the Roboclaw had its LiPo voltage cutoff set to 21 V, and we set it back to the 3-cell range. Now we have an extra Roboclaw (backup purposes only ;))!
We have focused on getting basic local planner functionality and finishing the robot state machine and groundstation state machine interfaces. Open items include pedestrian tracking, pedestrian avoidance, and mapping.
We are confident this can be done in 3 weeks with proper teamwork and execution. The difficult part of hardware integration is done, and we can focus on abstracted software.
Progress Pictures/Videos
We did small scale testing that demonstrated our robot’s ability to localize and provide controls to reach its waypoints.
Created a test of both the robot state machine and ground station order planner running on their respective machines. Fixed minor bugs revealed by the test.
Finished setting up ssh keys on our machines for the Xavier and disabled password authentication. The Xavier is now much less likely to get hacked.
Worked in the lab with my team to finish setting up the robot for the midterm demo, which involved taking various mechanical measurements to define coordinate transforms, as well as testing the robot in various places on campus to profile the SLAM performance.
Worked in the lab with my team to begin integration of the IMU and GPS to the localization code.
Progress Timeline
Our progress is on schedule
Next Week Deliverables
Begin integrating the robot planning code with the higher-level robot state machine to allow the ground station to send orders to the robot planner.
Attempt to switch from RTABMAP to ORBSLAM2 for faster mapping (RTABMAP loses tracking well below our desired speeds). If this does not improve SLAM performance, then we will be doing SLAM without vision (still using IMU, wheel encoders, and GPS).
Write the robot heartbeat node to aid in the debugging of integration tasks.
Localization was not functional for this trial. We were testing different speeds, and soon after this trial we found that the chosen speed was too fast for RTABMAP.
We got hacked on 3/27/2021 – most likely due to improper ssh configuration and usage. As a result, we had to wipe the Xavier, set up our environment, and reinstall all our dependencies which took a significant amount of time.
Current Risks and Mitigation
Because of the lack of on-campus WiFi connectivity, we’ve decided to move the roscore to run on board the robot so it can run autonomously even when the ground station gets temporarily disconnected.
Worst case, we can narrow down our delivery area to a selected subset of campus roads that are flat and have consistent wifi connectivity.
Set up ssh keys for the Xavier on two of our three laptops. Once they’re set up on all three, we will disable password authentication for ssh so we don’t get hacked again.
Worked with Advaith to get the robot driving in some spots on campus with a human operator.
Determined that we cannot change our network architecture since a hotspot does not work with ROS. Our intended solution is to have our robot be the network core instead of the ground station, and allow the robot to continue delivering during short network disconnects.
Progress Timeline
Our progress is on schedule
Next Week Deliverables
Write integration test between the robot state machine and the ground station
Implement the robot heartbeat package – need to synchronize sensor readings coming in at different rates
Assist with path planning
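For the heartbeat’s rate synchronization, ROS’s message_filters ApproximateTimeSynchronizer performs this kind of pairing; a ROS-free sketch of the same idea, where the function name and tolerance value are hypothetical:

```python
def pair_by_time(stream_a, stream_b, slop=0.05):
    """Pair timestamped readings from two sensors arriving at different
    rates, matching each A-reading to the closest B-reading within
    `slop` seconds (the tolerance value is illustrative).

    stream_a, stream_b: time-sorted lists of (timestamp, value).
    Returns a list of (value_a, value_b) pairs.
    """
    pairs, j = [], 0
    for ta, va in stream_a:
        # Advance j while the next B-reading is at least as close to ta.
        while (j + 1 < len(stream_b)
               and abs(stream_b[j + 1][0] - ta) <= abs(stream_b[j][0] - ta)):
            j += 1
        if stream_b and abs(stream_b[j][0] - ta) <= slop:
            pairs.append((va, stream_b[j][1]))
    return pairs
```

Because both streams are time-sorted, the inner pointer only moves forward, so the pairing is linear in the number of readings; readings with no partner within the tolerance are simply dropped.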
Progress Videos (WordPress not allowing video files)
We found the source of the Jetson Xavier reboot bug that we believed to be related to the RealSense D435i. It was actually an issue with a particular USB-C adapter that we were using. We have other adapters that work without issue and will use those.
Current Risks and Mitigation
We found that the CMU WiFi is not reliable enough for a mobile robot to use the network outside (latency and disconnect issues). We are going to place a hotspot onboard the robot and use a different ROS network design.
RTABMAP can lose tracking if it sees mostly sidewalk, as there aren’t many features, so we will have to adjust the camera to ensure that RTABMAP can track consistently. Worst case, we use only GPS to localize the robot.
Changes to System Design
We have switched to larger caster wheels to fit the new mechanical design
We’ll be changing the network design to move off the CMU campus WiFi.