All of us have been to the lab every day this week, and each spent at least 20 hours debugging GPS-based localization, so far without success. We collected sensor data at various locations on campus, including every sidewalk of the CFA lawn, the roof of the East Campus Garage, and multiple sidewalks around the flagpole. The coordinates we see from both the GPS fix and the filtered UTM location are reasonable, but the robot has difficulty orienting itself, and we have yet to isolate the cause. As a mitigation, we are currently testing localization without GPS, using only the IMU and wheel encoders.
Changes to System Design
We might not use GPS in our design, depending on further localization testing
Our ongoing battle with campus wifi continues, so we have decided NOT to have the robot connected to the internet during delivery. When the robot loses wifi connectivity, the ROS core stops working even though the robot makes no external communication, because the core is bound to the robot’s CMU-DEVICE IP address. This means we cannot use our emergency operator features in most demos.
Schedule Changes
We are working to get our MVP running consistently: it completed several successful deliveries in the past, but it does not currently work reliably.
We have spent a long time debugging GPS-based localization, and one risk is that these issues never get resolved. We will be field testing the localization within the next day. If it doesn’t work with the current setup, we will switch to a single Kalman filter that takes in all inputs and localizes within the UTM frame, sketched below.
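As a rough illustration of the single-filter approach (a sketch only; the `utm` package and the state layout below are our assumptions, not finalized code):

```python
import numpy as np
import utm  # pip install utm

def gps_fix_to_utm(lat_deg, lon_deg):
    """Convert a raw GPS fix to a UTM (easting, northing) measurement in meters."""
    easting, northing, _zone, _band = utm.from_latlon(lat_deg, lon_deg)
    return np.array([easting, northing])

# Single-filter idea: one state vector [x, y, yaw, v] in the UTM frame, and
# every sensor becomes a partial measurement of that one state:
#   GPS fix        -> (x, y)  via gps_fix_to_utm()
#   IMU            -> yaw
#   wheel encoders -> v
# so there is no separate local/global odometry pair to fall out of alignment.
```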
Another risk is that the perception system may be pushed back by the localization delay. Most of that code has been written except for the data association part, which shouldn’t take too long.
Changes to System Design
After evaluating ORBSLAM2, we decided to use the GPS+IMU+wheel encoder localization method. The non-visual method is already producing good enough results, and we do not think it is worth the time needed to integrate another odometry source.
Schedule Changes
We have a lot of work to squeeze into this last week:
Data association on pedestrians
Verifying localization
Testing the local planner with the new localization
All the required metrics testing
Progress Pictures/Videos
Below is one of the issues we are facing with a two-level state estimation system: sometimes the local and global odometry become misaligned (green/red: global/local odometry, blue: GPS odometry).
We fixed this by verifying the data inputs and correcting the magnetic declination and yaw offset parameters.
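For reference, the correction those two parameters apply to the raw IMU heading is straightforward. A minimal sketch, assuming the ENU convention (yaw measured counterclockwise from east); the function and argument names here are ours, for illustration:

```python
import math

def corrected_heading(imu_yaw_rad, magnetic_declination_rad, yaw_offset_rad):
    """Correct a magnetometer-based IMU heading into the world (ENU) frame.

    imu_yaw_rad: raw yaw reported by the IMU (relative to magnetic north)
    magnetic_declination_rad: local offset between magnetic and true north
    yaw_offset_rad: fixed offset if the IMU does not read zero facing east
    """
    yaw = imu_yaw_rad + magnetic_declination_rad + yaw_offset_rad
    # Normalize to (-pi, pi] so downstream filters see a consistent range.
    return math.atan2(math.sin(yaw), math.cos(yaw))
```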
This is a better run: as you can see, the raw GPS data aligns well with all of the integrated odometry. We hope to get another validation run like this and perhaps log waypoints.
Here’s a video of the constant-velocity Kalman filter tracker that we will use on pedestrians: MATLAB KF Tracker
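Our prototype is in MATLAB, but the filter itself is small. Here is a minimal numpy version of the same constant-velocity Kalman filter (the noise values are illustrative defaults, not our tuned parameters):

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal constant-velocity Kalman filter for one tracked pedestrian.

    State is [x, y, vx, vy]; measurements are raw (x, y) detections.
    """

    def __init__(self, x0, y0, dt=0.1, process_var=1.0, meas_var=0.25):
        self.x = np.array([x0, y0, 0.0, 0.0])           # state estimate
        self.P = np.eye(4) * 10.0                       # state covariance
        self.F = np.array([[1, 0, dt, 0],               # constant-velocity model
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],                # we observe position only
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * process_var                # process noise
        self.R = np.eye(2) * meas_var                   # measurement noise

    def predict(self):
        """Propagate the track forward one time step."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        """Fuse a raw (x, y) detection into the track."""
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```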
Here’s a video of the same algorithm tracking me in 3D (red is filtered, blue is raw detections): Tracking in 3D Real World
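The piece still to be written is data association: deciding which raw detection belongs to which track before each filter update. A reasonable sketch of what we have in mind, assuming SciPy’s Hungarian solver and a simple distance gate (both our choices here, not finalized):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(track_positions, detections, gate=1.5):
    """Match detections to tracks by minimizing total Euclidean distance.

    track_positions: (T, 2) array of predicted track positions (meters)
    detections:      (D, 2) array of raw detections (meters)
    gate:            maximum distance for a valid match (meters)
    Returns a list of (track_index, detection_index) pairs.
    """
    tracks = np.atleast_2d(np.asarray(track_positions, dtype=float))
    dets = np.atleast_2d(np.asarray(detections, dtype=float))
    if tracks.size == 0 or dets.size == 0:
        return []
    # Pairwise distances: cost[t, d] = distance from track t to detection d.
    cost = np.linalg.norm(tracks[:, None, :] - dets[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)
    # Reject matches beyond the gate; unmatched detections spawn new tracks,
    # and unmatched tracks coast on the filter's prediction.
    return [(t, d) for t, d in zip(rows, cols) if cost[t, d] <= gate]
```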
RTABMAP has poor performance. It is jittery and loses tracking very easily. Our mitigation strategy is to switch to ORBSLAM, and if that does not work then we will do localization without vision (using IMU, GPS, and wheel encoders, yet to be tested outdoors).
Changes to System Design
We may be transitioning to the GPS+IMU+wheel encoder localization method, since visual methods lose tracking easily. Sebastian will evaluate ORBSLAM to determine if it is a good replacement.
Schedule Changes
We were slightly delayed by last week’s mistake with our motor controller. We discovered the Roboclaw’s LiPo voltage cutoff was set to 21V, and we set it back to the 3-cell range. Now we have an extra Roboclaw (backup purposes only ;))!
We have focused on getting basic local planner functionality and finishing the robot state machine and groundstation state machine interfaces. Open items include pedestrian tracking, pedestrian avoidance, and mapping.
We are confident this can be done in 3 weeks with proper teamwork and execution. The difficult part of hardware integration is done, and we can focus on abstracted software.
Progress Pictures/Videos
We did small-scale testing that demonstrated our robot’s ability to localize and generate controls to reach its waypoints.
We got hacked on 3/27/2021, most likely due to improper ssh configuration and usage. As a result, we had to wipe the Xavier, set up our environment again, and reinstall all our dependencies, which took a significant amount of time.
Current Risks and Mitigation
Because of the unreliable on-campus wifi connectivity, we’ve decided to run roscore on-board the robot, so it can operate autonomously even when the ground station gets temporarily disconnected (a sketch of the resulting configuration is below).
Worst case, we can narrow down our delivery area to a selected subset of campus roads that are flat and have consistent wifi connectivity.
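What the on-board-master setup looks like in practice (the hostname is a placeholder; these are normally shell exports in ~/.bashrc, shown as Python for concreteness):

```python
# Illustrative only: with roscore running on the robot, every machine points
# ROS_MASTER_URI at the robot, so on-board nodes keep running even if the
# ground station drops off the network. "robot.local" is a placeholder.
import os
import socket

os.environ["ROS_MASTER_URI"] = "http://robot.local:11311"
os.environ["ROS_HOSTNAME"] = socket.gethostname()  # how peers reach this machine
```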
We found the source of the Jetson Xavier reboot bug that we believed to be related to the RealSense D435i. It was actually an issue with a particular USB-C adapter that we were using. We have other adapters that work without issue and will use those.
Current Risks and Mitigation
We found that the CMU WiFi is not reliable enough for a mobile robot to use outdoors (latency and disconnect issues). We are going to place a hotspot onboard the robot and use a different ROS network design.
RTABMAP can lose tracking if it sees mostly sidewalk, as there aren’t many features, so we will have to adjust the camera to ensure that RTABMAP can track consistently. Worst case, we use only GPS to localize the robot.
Changes to System Design
We have switched to larger caster wheels to fit the new mechanical design.
We’ll be moving the network design off of the CMU campus WiFi.
We received many parts this week, and we spent the majority of time testing them individually. We tested the motors, motor controller, robot/groundstation networking, and collision avoidance algorithm in simulation.
Current Risks and Mitigation
The Jetson Xavier is having issues with the RealSense D435i. It will reboot after running for what seems like an arbitrary amount of time. Additionally, the framerate is still lower than expected, likely due to using a default configuration.
We’ve seen other people run RTABMAP with the same camera at reasonable framerates, so we are going through forums looking for solutions.
The motor controller disconnects easily when using a USB connection for serial.
To fix this, we are moving towards a wired UART connection to the motor controller.
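For concreteness, here is a minimal sketch of driving the Roboclaw over a raw UART with pyserial. The device path, baud rate, and address are assumptions (the Jetson’s UARTs typically appear as /dev/ttyTHS*, and 0x80 is the Roboclaw’s default packet-serial address); the checksum is the CRC-16/CCITT the Roboclaw manual specifies.

```python
import serial  # pyserial

def crc16(data: bytes) -> int:
    """CRC-16/CCITT (poly 0x1021, init 0) used by Roboclaw packet serial."""
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if (crc & 0x8000) else (crc << 1)
            crc &= 0xFFFF
    return crc

def drive_m1_forward(port: serial.Serial, speed: int, address: int = 0x80):
    """Command 0 (Drive Forward M1), speed 0-127; returns True on ACK."""
    packet = bytes([address, 0, speed & 0x7F])
    crc = crc16(packet)
    port.write(packet + bytes([crc >> 8, crc & 0xFF]))
    return port.read(1) == b"\xff"  # Roboclaw ACKs valid packets with 0xFF

# Assumed wiring: Roboclaw S1/S2 pins to a Jetson UART exposed as /dev/ttyTHS1.
uart = serial.Serial("/dev/ttyTHS1", baudrate=38400, timeout=0.1)
drive_m1_forward(uart, 32)  # roughly 25% forward on motor 1
```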
Changes to System Design
Interface between Xavier and motor controller changes from USB Serial to UART.
This week was spent choosing parts and finalizing the design.
Current Risks and Mitigation
We iterated on the design several times and found that motors capable of carrying the 2 kg payload would require higher-end motor controllers and would violate the budget.
We decided to relax the requirement to travel on campus slopes and focus on flat areas only. This way, we could lower our motor specs and meet the budget constraint.
Torque per motor = 0.5 · F · r = 0.5 · (m·g·k) · r = 6.585 kg·cm, where m is the loaded robot mass, k the rolling-resistance coefficient, and r the wheel radius (the 0.5 splits the load between the two drive motors).
RPM = 60·v / (2π·r) = 139.2 rpm, where v is the target speed.
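As a sanity check, the same arithmetic in a few lines of Python. The input values below are illustrative stand-ins, not our actual design parameters, so the printed numbers will only match the figures above for the real inputs:

```python
import math

# Illustrative inputs -- NOT our final design values.
m = 25.0      # loaded robot mass, kg (hypothetical)
k = 0.05      # rolling-resistance coefficient (hypothetical)
r_cm = 10.0   # wheel radius, cm (hypothetical)
v = 1.5       # target speed, m/s (hypothetical)

# Torque per motor, splitting the rolling-resistance force across two motors.
# Expressed in kgf*cm, so g is folded into the unit (1 kgf = 9.81 N).
torque_per_motor = 0.5 * m * k * r_cm
print(f"torque per motor: {torque_per_motor:.3f} kgf*cm")

# Wheel speed needed to hit the target velocity.
rpm = (60 * v) / (2 * math.pi * (r_cm / 100))
print(f"wheel speed: {rpm:.1f} RPM")
```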
After finishing an initial design of a custom chassis, we decided to keep it simple and use a premade chassis. This will save a bit of money and a lot of time that can be spent polishing our software.
Our TA brought up that visual SLAM may not work well outdoors. We mitigated this by adding an LED ring light to our design (to be presented in the design presentation).
Changes to System Design
We’ve decided on using a premade chassis.
Our campus travel requirement has been relaxed, as specified in the section above.
Schedule Changes
No changes to our schedule this week.
Progress Pictures
This is Advaith’s CAD rendering of the new robot model, using a premade Prowler chassis, a wooden platform, and a plastic holding device.
In our initial testing, running SLAM on the Xavier with the ZED Mini gave a low framerate (about 10 fps). Advaith found that many people have this problem with the default ROS wrapper on the Xavier and had to make modifications.
We are at risk of going over budget due to the high cost of high-power motors and controllers, plus our custom frame. Our mitigation strategy is to look into prebuilt bases and potentially relax our requirements to allow for less powerful hardware.
We did calculations to determine our motor and battery requirements, but the calculations rest on some assumptions. We used a safety factor of 1.2; if that turns out not to be enough overhead, we would mitigate by relaxing our speed, weight, and battery-duration requirements.
Changes to System Design
Decided on depth cameras over lidar for vision due to budget constraints; we have an Intel RealSense already.
There is a risk that the WiFi on campus will not be consistent enough for our latency requirements. One contingency plan is to place a cellphone with a hotspot onboard the robot if the campus WiFi is not satisfactory. This likely has implications for our ROS network, though, since ROS traditionally runs within a single local area network.
Another risk is that localization algorithms may not work on campus. A mitigation plan is to test building a map on campus using an Intel RealSense before ordering parts, so we can finalize the correct sensor modalities. Our contingency plan is to use fiducial markers.
Changes to System Design
Changed our design from 2 motors per driver to 1 motor per driver based on feedback from Prof. Kim. Using multiple motors on a driver would make accounting for any minor differences between motors very difficult.
Decided on WiFi over LTE as our low-level communication protocol, based on Prof. Kim’s suggestion that it is simpler to integrate into our robot.