Over the past week, I worked with the team to fix the robot localization integration with GPS, IMU, and wheel odometry. Much of this involved field testing, recording rosbags of sensor data, and scouring the ROS forums for clues as to why the global and local odometry didn’t line up.
I generalized the global planner to take in text files of UTM offsets for nodes and edges between those nodes, build a graph from those files, and then output the UTM offsets of the waypoints. This allows the local planner to receive waypoints directly in the UTM frame.
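For context, the graph construction itself is straightforward. Below is a minimal sketch of that flow, assuming whitespace-separated node and edge files; the file format, names, and the networkx dependency are illustrative assumptions rather than what the planner actually uses.

```python
# Sketch of building the waypoint graph from UTM-offset text files.
# File formats, names, and the networkx dependency are illustrative.
import math
import networkx as nx

def load_graph(node_file, edge_file):
    graph = nx.Graph()
    with open(node_file) as f:
        for line in f:
            name, east, north = line.split()      # node id + UTM offsets (m)
            graph.add_node(name, pos=(float(east), float(north)))
    with open(edge_file) as f:
        for line in f:
            a, b = line.split()                   # edge between two node ids
            pa, pb = graph.nodes[a]["pos"], graph.nodes[b]["pos"]
            graph.add_edge(a, b, weight=math.dist(pa, pb))
    return graph

def waypoints_utm(graph, start, goal):
    """UTM offsets of the waypoints along the shortest path from start to goal."""
    path = nx.shortest_path(graph, start, goal, weight="weight")
    return [graph.nodes[n]["pos"] for n in path]
```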
I worked with Advaith on pedestrian state estimation.
Progress:
Currently I am on track.
Next week’s deliverables:
Fix the localization once and for all
Integrate the global planner with the working system
We have spent a long time debugging the GPS-based localization. One risk is that these issues never get resolved. We will be field testing the localization within the next day; if it doesn’t work with the current setup, we will switch to a single Kalman filter that takes in all sensor inputs and localizes within the UTM frame.
Another risk is that the perception system may be pushed out due to the delay in localization. Most of the code has been written except for the data association part, which shouldn’t take too long.
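For reference, the remaining data-association step amounts to matching each new pedestrian detection to an existing track. Below is a minimal sketch of one standard approach (Hungarian assignment with a gating radius); the array shapes, gate value, and use of scipy are illustrative assumptions, not our final implementation.

```python
# Sketch of detection-to-track association using the Hungarian method.
# Inputs are numpy arrays of 3D positions; the 1.5 m gate is a placeholder.
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(track_positions, detections, gate=1.5):
    """Return (matched (track, detection) index pairs, unmatched detection indices)."""
    if len(track_positions) == 0 or len(detections) == 0:
        return [], list(range(len(detections)))
    # Pairwise Euclidean distances between every track and every detection.
    cost = np.linalg.norm(track_positions[:, None, :] - detections[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)
    pairs = [(r, c) for r, c in zip(rows, cols) if cost[r, c] < gate]
    matched = {c for _, c in pairs}
    unmatched = [c for c in range(len(detections)) if c not in matched]
    return pairs, unmatched
```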
Changes to System Design
After evaluating ORBSLAM2, we decided to use the GPS+IMU+wheel encoder localization method. The reasoning is that the non-visual method is getting good enough results and we do not think that it is worth the time needed to integrate another odometry source.
Schedule Changes
We have a lot of work to squeeze into this last week:
Data association on pedestrians
Verifying localization
Testing the local planner with the new localization
All the required metrics testing
Progress Pictures/Videos
Below are some of the issues we are facing with a two-level state estimation system: sometimes the local/global odometry gets misaligned (green/red – global/local odometry, blue – GPS odometry).
We fixed this by verifying the data inputs and correcting the magnetic declination and yaw offset parameters.
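For reference, the relationship we had to get right is roughly that the heading used to align GPS fixes with the odometry frame is the raw IMU yaw plus the local magnetic declination plus a fixed yaw offset for the sensor mounting. A small illustrative sketch of that correction (not the actual filter code):

```python
# Illustrative heading correction; this is the conceptual relationship only,
# not the localization package's actual implementation.
import math

def corrected_yaw(imu_yaw, magnetic_declination, yaw_offset):
    """All angles in radians; the result is wrapped to (-pi, pi]."""
    yaw = imu_yaw + magnetic_declination + yaw_offset
    return math.atan2(math.sin(yaw), math.cos(yaw))
```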
This is a better run; as you can see, the raw GPS data aligns well with the integrated odometry. We hope to get another validation run like this and perhaps log waypoints.
Here’s a video of a constant-velocity Kalman filter tracker that we will use on pedestrians: MATLAB KF Tracker
Here’s a video of the same algorithm tracking me in 3D (red is filtered, blue is raw detections): Tracking in 3D Real World
Integrated Kalman filter to work in 3D and tested on live data: Kalman 3D
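As background on how the tracker works, the constant-velocity model is a standard predict/update Kalman filter over a position-and-velocity state. The sketch below is an illustrative Python version of that model (our prototype was in MATLAB, and the noise values here are placeholders rather than our tuned parameters):

```python
# Minimal constant-velocity Kalman filter over [x y z vx vy vz].
# Process/measurement noise values are placeholders, not tuned parameters.
import numpy as np

class CVKalman:
    def __init__(self, dt=0.1):
        self.x = np.zeros(6)                                 # state estimate
        self.P = np.eye(6)                                   # state covariance
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)                      # position += velocity * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])    # we only measure position
        self.Q = 0.01 * np.eye(6)                            # process noise
        self.R = 0.1 * np.eye(3)                             # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:3]

    def update(self, z):
        """z is the raw 3D detection; returns the filtered position."""
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]
```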
Debugged the GPS localization on campus and took datasets for analysis
Below left are the GPS readings integrated with IMU/wheel odometry; on the right is the raw GPS data. This is only one trial, so we will need to retry it when weather permits.
Modified the local planner with a new get_next_subgoal function
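For context, the new get_next_subgoal function’s job is to hand the controller the next waypoint and advance past waypoints the robot has already reached. The sketch below shows that advance-when-close behavior in simplified form; the tolerance and waypoint representation are illustrative, not the actual implementation.

```python
# Simplified sketch of the subgoal-advancing logic; the real planner's state,
# tolerance, and waypoint types differ.
import math

def get_next_subgoal(waypoints, current_index, robot_xy, tolerance=0.5):
    """Return (subgoal, new_index), skipping waypoints already within tolerance (m)."""
    while (current_index < len(waypoints) - 1
           and math.dist(robot_xy, waypoints[current_index]) < tolerance):
        current_index += 1
    return waypoints[current_index], current_index
```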
Progress:
We have a lot of integration to do during this last week
We need to get the localization up and running soon and drive a few basic waypoint routes on campus
We must pray to the robot gods for a smooth finish for this project
Successfully ran ORBSLAM2 on the Xavier. While its performance was better than RTABMAP’s, we decided that it was not worth the time to integrate it with the rest of our localization sensors since we are already getting good enough localization results.
Wrote the robot heartbeat node
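For reference, the heartbeat node is essentially a fixed-rate publisher that other nodes and the ground station can monitor to confirm the robot stack is alive. A minimal rospy sketch is below; the topic name, message type, and rate are illustrative choices rather than the node’s actual interface.

```python
#!/usr/bin/env python
# Minimal heartbeat publisher sketch; topic name, message type, and rate
# are illustrative, not the actual node's interface.
import rospy
from std_msgs.msg import Header

def main():
    rospy.init_node("robot_heartbeat")
    pub = rospy.Publisher("/robot/heartbeat", Header, queue_size=1)
    rate = rospy.Rate(1)  # 1 Hz
    seq = 0
    while not rospy.is_shutdown():
        pub.publish(Header(seq=seq, stamp=rospy.Time.now(), frame_id="robot"))
        seq += 1
        rate.sleep()

if __name__ == "__main__":
    main()
```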
Progress Timeline
Our schedule is tight but we’re confident that we will finish this week
Next Week Deliverables
Finish integration of the high-level state machines with the robot local planner
Assist with integration of collision avoidance
Benchmarks of the final system
Create the final presentation slides with my team and practice presenting
Integrated the local planner to test basic waypoint functionality
Took measurements to determine sensor placement and added the corresponding transforms for localization
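For context, those measurements feed directly into static transforms between the robot base and each sensor frame. The sketch below shows how one measured offset could be published with tf2 in Python; the frame names and offset values are placeholders, not our actual measurements.

```python
# Sketch of publishing a measured sensor mount as a static transform.
# Frame names and offsets are placeholders, not the measured values.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

def publish_gps_mount():
    rospy.init_node("sensor_static_tf")
    broadcaster = tf2_ros.StaticTransformBroadcaster()
    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = "base_link"
    t.child_frame_id = "gps"
    t.transform.translation.x = 0.20   # forward offset from base (m), placeholder
    t.transform.translation.z = 0.55   # height above base (m), placeholder
    t.transform.rotation.w = 1.0       # identity rotation
    broadcaster.sendTransform(t)
    rospy.spin()

if __name__ == "__main__":
    publish_gps_mount()
```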
Progress:
We have accomplished our interim demo goal of getting the robot to follow a waypoint.
We have extended this to having the robot follow multiple waypoints in different shapes/orders, which serves as a small-scale test of following waypoints on campus.
Next we must have it localize within the map and execute multiple waypoints.
Over the past week, I mainly worked with the team to integrate all the submodules on the robot so that it can autonomously navigate through a specified series of waypoints for the interim demo.
Much of my work involved writing the launch files and making sure the nodes interfaced correctly.
I also added a feature to the GUI that allows the emergency operator to log the robot’s current location as an (x, y) offset in meters from the starting point and display it as a marker in rviz.
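For reference, that GUI feature reduces to publishing a visualization_msgs/Marker at the logged offset so rviz can render it. A simplified sketch is below; the topic, frame id, and marker styling are assumptions for illustration.

```python
# Simplified sketch of the "log current location" action: publish a sphere
# Marker at the robot's (x, y) offset from the start. Topic, frame id, and
# styling are illustrative.
import rospy
from visualization_msgs.msg import Marker

def log_location_marker(pub, x, y, marker_id):
    m = Marker()
    m.header.frame_id = "odom"
    m.header.stamp = rospy.Time.now()
    m.id = marker_id
    m.type = Marker.SPHERE
    m.action = Marker.ADD
    m.pose.position.x = x
    m.pose.position.y = y
    m.pose.orientation.w = 1.0
    m.scale.x = m.scale.y = m.scale.z = 0.3
    m.color.r, m.color.g, m.color.a = 1.0, 0.2, 1.0
    pub.publish(m)

# Example usage from the GUI callback:
# pub = rospy.Publisher("/logged_locations", Marker, queue_size=10)
# log_location_marker(pub, x, y, marker_id=0)
```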
Progress:
Currently I am on track.
Next week’s deliverables:
Bound the multiorder algorithm’s batch size, since its runtime grows exponentially with the batch size
Work with team to integrate GPS and ORB-SLAM as a replacement for RTABMAP
RTABMAP has poor performance: it is jittery and loses tracking very easily. Our mitigation strategy is to switch to ORBSLAM2, and if that does not work then we will do localization without vision (using IMU, GPS, and wheel encoders, which is yet to be tested outdoors).
Changes to System Design
We may be transitioning to GPS+IMU+wheel encoder localization method since visual methods lose tracking easily. Sebastian will evaluate ORBSLAM to determine if it is a good replacement.
Schedule Changes
We were slightly delayed by last week’s mistake with our motor controller: we discovered that the Roboclaw’s LiPo voltage cutoff was set to 21 V, and we set it back to the 3-cell range. Now we have an extra Roboclaw (backup purposes only ;))!
We have focused on getting basic local planner functionality and finishing the robot state machine and groundstation state machine interfaces. Open items include pedestrian tracking, pedestrian avoidance, and mapping.
We are confident this can be done in 3 weeks with proper teamwork and execution. The difficult part of hardware integration is done, and we can now focus on the higher-level software.
Progress Pictures/Videos
We did small scale testing that demonstrated our robot’s ability to localize and provide controls to reach its waypoints.
Created a test of both the robot state machine and ground station order planner running on their respective machines. Fixed minor bugs revealed by the test.
Finished setting up ssh keys on our machines for the Xavier and disabled password authentication. The Xavier is now much less likely to get hacked.
Worked in the lab with my team to finish setting up the robot for the midterm demo, which involved taking various mechanical measurements to define coordinate transforms, as well as testing the robot in various places on campus to profile the SLAM performance.
Worked in the lab with my team to begin integration of the IMU and GPS to the localization code.
Progress Timeline
Our progress is on schedule
Next Week Deliverables
Begin integrating the robot planning code with the higher-level robot state machine to allow the ground station to send orders to the robot planner.
Attempt to switch from RTABMAP to ORBSLAM2 for faster mapping (RTABMAP loses tracking well below our desired speeds). If this does not improve SLAM performance, then we will fall back to localization without vision (using IMU, wheel encoders, and GPS).
Write the robot heartbeat node to aid in the debugging of integration tasks.
Localization was not functional for this trial. We were testing different speeds, and soon after this trial we found that the chosen speed was too fast for RTABMAP.
We got hacked on 3/27/2021, most likely due to improper ssh configuration and usage. As a result, we had to wipe the Xavier, set up our environment again, and reinstall all our dependencies, which took a significant amount of time.
Current Risks and Mitigation
Because of the lack of on-campus wifi connectivity, we’ve decided to run the ROS master (roscore) on board the robot so that it can continue operating autonomously even when the ground station temporarily disconnects.
Worst case, we can narrow down our delivery area to a selected subset of campus roads that are flat and have consistent wifi connectivity.