Integrated the local planner to test basic waypoint functionality
Took measurements to determine sensor placement and added the corresponding coordinate transforms for localization
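For illustration, here is a minimal sketch of how a measured sensor offset can be published as a static transform with tf2. The frame names and offsets are placeholders, not our actual measurements:

```python
#!/usr/bin/env python
# Sketch: publish a measured sensor offset as a static transform.
# Frame names and offsets are placeholders, not our real measurements.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

if __name__ == "__main__":
    rospy.init_node("sensor_static_tf")
    broadcaster = tf2_ros.StaticTransformBroadcaster()

    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = "base_link"    # robot body frame
    t.child_frame_id = "camera_link"   # sensor frame
    t.transform.translation.x = 0.20   # 20 cm forward of base (example)
    t.transform.translation.y = 0.0
    t.transform.translation.z = 0.45   # 45 cm above base (example)
    t.transform.rotation.w = 1.0       # identity rotation (no mounting tilt)

    broadcaster.sendTransform(t)
    rospy.spin()  # keep the node alive so the transform stays latched
```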
Progress:
We have accomplished our interim demo goal of getting the robot to follow a waypoint.
We have extended this to having the robot follow multiple waypoints in different shapes and orders, as sketched below. This serves as a small-scale test of following waypoints on campus.
Next, we must have it localize within the map and execute multiple waypoints.
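As a reference for how the waypoint following works at the code level, here is a minimal sketch that feeds a list of waypoints to the navigation stack one at a time. It assumes a standard move_base setup; the coordinates are placeholders:

```python
#!/usr/bin/env python
# Sketch: send a list of waypoints to move_base one at a time.
# Assumes a standard move_base setup; coordinates are placeholders.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

WAYPOINTS = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.0, 0.0)]  # example square

def send_waypoints():
    rospy.init_node("waypoint_runner")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    for x, y in WAYPOINTS:
        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = "map"
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = x
        goal.target_pose.pose.position.y = y
        goal.target_pose.pose.orientation.w = 1.0  # arbitrary fixed heading
        client.send_goal(goal)
        client.wait_for_result()  # block until this waypoint is reached

if __name__ == "__main__":
    send_waypoints()
```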
Over the past week, I mainly worked with the team to integrate all the submodules on the robot so that it can autonomously navigate through a specified series of waypoints for the interim demo.
Much of my work involved writing the launch files and making sure the nodes interfaced correctly.
I also added a feature to the GUI that allows the emergency operator to log the robot’s current location as an (x, y) offset in meters from the starting point and display it as a marker in rviz.
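The marker half of that feature boils down to publishing a visualization_msgs/Marker. Here is a minimal sketch; the topic and frame names are illustrative:

```python
#!/usr/bin/env python
# Sketch: publish a logged (x, y) offset as a marker for rviz.
# Topic and frame names are illustrative.
import rospy
from visualization_msgs.msg import Marker

def make_marker(x, y):
    m = Marker()
    m.header.frame_id = "map"
    m.header.stamp = rospy.Time.now()
    m.ns = "logged_locations"
    m.id = 0
    m.type = Marker.SPHERE
    m.action = Marker.ADD
    m.pose.position.x = x            # offset from start, in meters
    m.pose.position.y = y
    m.pose.orientation.w = 1.0
    m.scale.x = m.scale.y = m.scale.z = 0.3
    m.color.r = 1.0                  # opaque red sphere
    m.color.a = 1.0
    return m

if __name__ == "__main__":
    rospy.init_node("location_logger")
    # latch=True so rviz still receives the marker if it connects late
    pub = rospy.Publisher("logged_location", Marker, queue_size=1, latch=True)
    pub.publish(make_marker(2.5, -1.0))  # example offset
    rospy.spin()
```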
Progress:
Currently, I am on track.
Next week’s deliverables:
Bound the multiorder algorithm’s batch size, since its complexity grows exponentially with the number of orders in a batch (see the sketch after this list)
Work with the team to integrate GPS and ORB-SLAM as a replacement for RTABMAP
RTABMAP has poor performance: it is jittery and loses tracking very easily. Our mitigation strategy is to switch to ORB-SLAM, and if that does not work, we will do localization without vision (using the IMU, GPS, and wheel encoders, which have yet to be tested outdoors).
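To illustrate why the batch size must be bounded: brute-forcing the best visiting order within a batch of k orders means checking O(k!) candidate orderings, so capping k keeps each solve cheap. A rough sketch, where route_cost and the integer "orders" are hypothetical stand-ins for the real planner’s types:

```python
# Sketch: why the multiorder batch size must be bounded.
# Brute-forcing the best visiting order is O(k!) in batch size k,
# so we cap k and solve each small batch exactly.
# route_cost and the integer "orders" are hypothetical placeholders.
from itertools import permutations

MAX_BATCH = 4  # cap: 4! = 24 candidate orderings per batch

def route_cost(route):
    # Placeholder: the real planner would use travel time or distance.
    return sum(abs(a - b) for a, b in zip(route, route[1:]))

def plan_in_batches(orders):
    """Split orders into bounded batches and brute-force each one."""
    plan = []
    for i in range(0, len(orders), MAX_BATCH):
        batch = orders[i:i + MAX_BATCH]
        best = min(permutations(batch), key=route_cost)
        plan.extend(best)
    return plan

print(plan_in_batches([5, 1, 9, 3, 7, 2]))  # two batches of at most 4
```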
Changes to System Design
We may transition to a GPS + IMU + wheel-encoder localization method, since visual methods lose tracking easily. Sebastian will evaluate ORB-SLAM to determine whether it is a good replacement.
Schedule Changes
We were slightly delayed by our mistake with the motor controller last week. We discovered the Roboclaw had its LiPo voltage cutoff set to 21 V, and we set it back to the 3-cell range. Now we have an extra Roboclaw (for backup purposes only ;))!
We have focused on getting basic local planner functionality working and finishing the interfaces between the robot state machine and the ground station state machine. Open items include pedestrian tracking, pedestrian avoidance, and mapping.
We are confident this can be done in three weeks with proper teamwork and execution. The difficult part of hardware integration is done, and we can now focus on the higher-level software.
Progress Pictures/Videos
We did small-scale testing that demonstrated our robot’s ability to localize and issue control commands to reach its waypoints.
Created a test of both the robot state machine and ground station order planner running on their respective machines. Fixed minor bugs revealed by the test.
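For reference, a hedged sketch of the shape such a cross-machine check can take; the topic names and the use of std_msgs/String are hypothetical stand-ins for our custom messages:

```python
#!/usr/bin/env python
# Sketch of a cross-machine integration check: the ground station side
# publishes a test order and waits for the robot FSM to acknowledge it.
# Topic names and std_msgs/String are hypothetical stand-ins for our
# custom messages.
import rospy
from std_msgs.msg import String

def test_order_roundtrip():
    rospy.init_node("integration_check")
    pub = rospy.Publisher("/ground_station/order", String, queue_size=1)
    rospy.sleep(1.0)  # let connections establish before publishing
    pub.publish(String(data="test-order-1"))
    ack = rospy.wait_for_message("/robot/fsm_status", String, timeout=10.0)
    assert "test-order-1" in ack.data, "robot FSM did not acknowledge order"
    print("round trip OK")

if __name__ == "__main__":
    test_order_roundtrip()
```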
Finished setting up ssh keys on our machines for the Xavier and disabled password authentication. The Xavier is now much less likely to get hacked.
Worked in the lab with my team to finish setting up the robot for the midterm demo, which involved taking various mechanical measurements to define coordinate transforms, as well as testing the robot in various places on campus to profile the SLAM performance.
Worked in the lab with my team to begin integration of the IMU and GPS to the localization code.
Progress Timeline
Our progress is on schedule
Next Week Deliverables
Begin integrating the robot planning code with the higher-level robot state machine to allow the ground station to send orders to the robot planner.
Attempt to switch from RTABMAP to ORB-SLAM2 for faster mapping (RTABMAP loses tracking well below our desired speeds). If this does not improve SLAM performance, we will do SLAM without vision (still using the IMU, wheel encoders, and GPS).
Write the robot heartbeat node to aid in the debugging of integration tasks.
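A minimal sketch of what that heartbeat node could look like: publish a timestamped message at a fixed rate so the ground station can detect drops. The topic name and rate are illustrative:

```python
#!/usr/bin/env python
# Sketch of the robot heartbeat node: publish a timestamped "alive"
# message at a fixed rate so the ground station can detect drops.
# Topic name and rate are illustrative.
import rospy
from std_msgs.msg import Header

if __name__ == "__main__":
    rospy.init_node("robot_heartbeat")
    pub = rospy.Publisher("/robot/heartbeat", Header, queue_size=1)
    rate = rospy.Rate(2)  # 2 Hz heartbeat
    seq = 0
    while not rospy.is_shutdown():
        msg = Header()
        msg.seq = seq                 # monotonically increasing counter
        msg.stamp = rospy.Time.now()  # lets the receiver measure latency
        pub.publish(msg)
        seq += 1
        rate.sleep()
```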
Localization was not functional for this trial. We were testing different speeds, and soon after this trial we found that the chosen speed was too fast for RTABMAP.
We got hacked on 3/27/2021 – most likely due to improper ssh configuration and usage. As a result, we had to wipe the Xavier, set up our environment, and reinstall all our dependencies which took a significant amount of time.
Current Risks and Mitigation
Because of the lack of on-campus WiFi connectivity, we’ve decided to move the roscore to run on board the robot so it can run autonomously even when the ground station gets temporarily disconnected.
Worst case, we can narrow our delivery area to a selected subset of campus roads that are flat and have consistent WiFi connectivity.
Over the past week, I primarily worked to recover from the hack, working with the team to reflash the Xavier and reinstall its dependencies.
I consolidated the custom messages between the ground station and the robot FSM into a single custom-message ROS package for all components to use, which makes integration much easier.
I worked on the GUI based on my teammates’ usage feedback, combining the operator and user GUI into one.
My teammates ran the operator GUI and used it to remote-control the robot successfully, showing that the GUI, joystick code, and camera feed work in production.
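For context, the joystick path reduces to mapping /joy axes into velocity commands. A minimal sketch, where the axis indices and scale factors are illustrative rather than our tuned values:

```python
#!/usr/bin/env python
# Sketch: map /joy axes to /cmd_vel for remote control.
# Axis indices and scale factors are illustrative, not our tuned values.
import rospy
from sensor_msgs.msg import Joy
from geometry_msgs.msg import Twist

LINEAR_SCALE = 0.5   # m/s at full stick deflection (example)
ANGULAR_SCALE = 1.0  # rad/s at full stick deflection (example)

class JoyTeleop(object):
    def __init__(self):
        self.pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("joy", Joy, self.on_joy)

    def on_joy(self, msg):
        cmd = Twist()
        cmd.linear.x = LINEAR_SCALE * msg.axes[1]    # left stick up/down
        cmd.angular.z = ANGULAR_SCALE * msg.axes[0]  # left stick left/right
        self.pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("joy_teleop_sketch")
    JoyTeleop()
    rospy.spin()
```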
Progress:
Currently, I am on track.
Next week’s deliverables:
Double-check the ground station FSM based on my teammates’ robot FSM code/tests
Help test the robot and help with mapping/vision, now that it can drive around.
Set up ssh keys for the Xavier on two of our three laptops. Once they are set up on all three, we will disable password authentication for ssh so we don’t get hacked again.
Worked with Advaith to get the robot driving in some spots on campus with a human operator
Determined that we cannot change our network architecture as planned, since a hotspot does not work with ROS. Our intended solution is to make the robot the network core instead of the ground station, and to allow the robot to continue delivering during short network disconnects.
Progress Timeline
Our progress is on schedule
Next Week Deliverables
Write integration test between the robot state machine and the ground station
Implement the robot heartbeat package; we need to synchronize sensor readings that arrive at different rates (see the sketch after this list)
Assist with path planning
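A minimal sketch of one way to handle the different-rates problem, using message_filters’ approximate time synchronization; the topics, message types, and slop value are illustrative:

```python
#!/usr/bin/env python
# Sketch: approximately synchronize sensor streams arriving at
# different rates before packaging them into a heartbeat/status message.
# Topic names, message types, and the slop value are illustrative.
import rospy
import message_filters
from sensor_msgs.msg import Imu, NavSatFix

def on_synced(imu_msg, gps_msg):
    # Called with the closest-in-time pair of IMU and GPS readings.
    rospy.loginfo("synced pair: imu %.3f / gps %.3f",
                  imu_msg.header.stamp.to_sec(),
                  gps_msg.header.stamp.to_sec())

if __name__ == "__main__":
    rospy.init_node("sensor_sync_sketch")
    imu_sub = message_filters.Subscriber("imu/data", Imu)
    gps_sub = message_filters.Subscriber("gps/fix", NavSatFix)
    sync = message_filters.ApproximateTimeSynchronizer(
        [imu_sub, gps_sub], queue_size=10, slop=0.1)  # 100 ms tolerance
    sync.registerCallback(on_synced)
    rospy.spin()
```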
Progress Videos (WordPress not allowing video files)
We found the source of the Jetson Xavier reboot bug that we believed to be related to the RealSense D435i. It was actually an issue with a particular USB-C adapter that we were using. We have other adapters that work without issue and will use those.
Current Risks and Mitigation
We found that the CMU WiFi is not reliable enough for a mobile robot to use outdoors (latency and disconnect issues). We are going to place a hotspot onboard the robot and use a different ROS network design.
RTABMAP can lose tracking when it sees mostly sidewalk, as there aren’t many visual features, so we will have to adjust the camera to ensure that RTABMAP can track consistently. Worst case, we will use only GPS to localize the robot.
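One cheap way to check whether a camera angle gives enough texture (not something we have implemented, just a sketch) is to count ORB features in sample frames with OpenCV; the threshold below is an arbitrary example:

```python
# Sketch: profile how feature-rich a camera view is before trusting
# visual tracking on it. The threshold is an arbitrary example value.
import cv2

MIN_FEATURES = 150  # below this, tracking is likely to struggle (example)

def frame_is_trackable(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    keypoints = cv2.ORB_create(nfeatures=500).detect(gray, None)
    return len(keypoints) >= MIN_FEATURES

cap = cv2.VideoCapture(0)  # e.g. the onboard camera
ok, frame = cap.read()
if ok:
    print("trackable" if frame_is_trackable(frame) else "feature-poor view")
cap.release()
```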
Changes to System Design
We have switched to larger caster wheels to fit the new mechanical design
We’ll be changing the network design away from the CMU campus WiFi.
Got the Roboclaw running the motor and reading encoder values (see the sketch below)
Shortly after, our Roboclaw stopped working. I suspect that its USB port is damaged, or that we accidentally damaged the board during testing; we were also unable to connect to the configuration software to update the firmware or debug it (which is why I suspect a USB port issue). We have ordered a new one, arriving 3/31.
To prevent any damage to the new Roboclaw, we are strictly following the manual’s operating instructions.
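For reference, the driver-level calls for running the motor and reading the encoder look roughly like the sketch below, assuming BasicMicro’s roboclaw_3.py Python library; the serial port, packet-serial address, and speed are examples:

```python
# Sketch: basic Roboclaw bring-up, assuming BasicMicro's Python library
# (roboclaw_3.py). Port, packet-serial address, and speed are examples.
import time
from roboclaw_3 import Roboclaw

ADDRESS = 0x80  # default packet-serial address
rc = Roboclaw("/dev/ttyACM0", 115200)
rc.Open()

rc.ForwardM1(ADDRESS, 32)   # ~25% of full speed (value range is 0-127)
time.sleep(2.0)
rc.ForwardM1(ADDRESS, 0)    # stop the motor

ok, ticks, status = rc.ReadEncM1(ADDRESS)  # (success, count, status bits)
if ok:
    print("M1 encoder ticks:", ticks)
```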
I finished the mechanical part of the robot, including the sensor mounts, wiring, electronics mounting, etc.
Wrote a ROS node that can perform object detection using the YOLO CNN
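One way to implement the detection core is OpenCV’s DNN module with Darknet weights; here is a sketch with placeholder paths and thresholds (our actual node may differ, and wraps this in ROS image callbacks):

```python
# Sketch of a YOLO detection core: run a Darknet model through
# OpenCV's DNN module. Paths and thresholds are placeholders.
# (Box decoding and non-max suppression omitted for brevity.)
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
layer_names = net.getUnconnectedOutLayersNames()

def detect(frame_bgr, conf_threshold=0.5):
    blob = cv2.dnn.blobFromImage(frame_bgr, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    detections = []
    for output in net.forward(layer_names):
        for row in output:  # row: [cx, cy, w, h, objectness, class scores...]
            scores = row[5:]
            class_id = int(np.argmax(scores))
            if scores[class_id] > conf_threshold:
                detections.append((class_id, float(scores[class_id])))
    return detections
```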
Tested the RTABMAP algorithm outdoors using the robot’s onboard compute. The frame rate is fine; however, there are some tracking issues when it views sidewalks without many features. We believe this problem might improve once we incorporate the IMU and GPS sensors as well. We will retest once we have the motor controller, so we can drive around.
Schedule Progress
We are slightly delayed on our first milestone (the joystick moving the robot) because of issues with the Roboclaw motor controller. However, we tested on the previous unit before it failed, so we are confident we can integrate the replacement quickly.
Deliverables for Next Week
Bring in all the sensors for state estimation and make sure the visual SLAM works properly.
Finish the pedestrian detection and tracking pipeline.