Team Status Report for 5/8/2021

Accomplishments

  • We are filming the video together as a team.
  • All of our members had other final projects, so we only met a few times this week, to film and to plan for next week.

Team Status Report for 5/1/2021

  • Current Risks and Mitigation
    • All of us have been to the lab every day this week and have spent at least 20+ hours debugging GPS-based localization, with no success. We collected sensor data at various locations on campus, including every sidewalk of the CFA lawn, the roof of the East Campus Garage, and multiple sidewalks around the flagpole. The coordinates we see from both the GPS fix and the filtered UTM location are reasonable, but the robot has difficulty orienting itself, and we have yet to isolate the cause. We are currently testing a mitigation strategy where we localize without GPS, using only the IMU and wheel encoders (a sketch of this fallback appears at the end of this report).
  • Changes to System Design
    • We might not use GPS in our design, depending on further localization testing
    • Our ongoing battle with the campus WiFi continues, so we have decided NOT to have the robot connected to the internet during delivery. When the robot loses WiFi connectivity, the ROS core stops working even though the robot makes no external communication, because the core is identified by the robot’s CMU-DEVICE IP address. This means that we cannot use our emergency operator features in most demos.
  • Schedule Changes
    • We are working to get our MVP running consistently: we had several runs in the past where it delivered successfully, but it does not currently work reliably.
  • Progress Pictures/Videos
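
As a concrete illustration of the GPS-free fallback, here is a minimal dead-reckoning sketch in Python. It assumes the IMU supplies an absolute, magnetometer-corrected yaw and the wheel encoders supply the distance traveled between updates; the class and names are illustrative, not our actual node.

```python
import math

class DeadReckoner:
    """Track pose without GPS by fusing IMU yaw with wheel-encoder distance."""

    def __init__(self, x=0.0, y=0.0):
        self.x = x
        self.y = y

    def update(self, imu_yaw, encoder_distance):
        """Advance the pose by the encoder distance along the IMU heading.

        imu_yaw: absolute heading in radians from the IMU
        encoder_distance: meters traveled since the last update
        """
        self.x += encoder_distance * math.cos(imu_yaw)
        self.y += encoder_distance * math.sin(imu_yaw)
        return self.x, self.y

# Feed each synchronized IMU/encoder sample into the estimator.
dr = DeadReckoner()
for yaw, dist in [(0.0, 1.0), (math.pi / 2, 0.5)]:
    print(dr.update(yaw, dist))  # (1.0, 0.0) then (1.0, 0.5)
```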

Advaith’s Status Report for 5/1/2021

  • Accomplishments for the week:
    • We have been field testing all week, trying to get consistent deliveries across campus. This is our biggest problem now: the GPS sensor is erratic, which causes our heading estimate to jump, which drives the robot into the grass, where localization gets even worse.
    • Our biggest problem is localization. GPS has been inconsistent, but it has surprisingly worked well a few times. I still think it is worth pursuing, but we are working on some mitigation plans as a backup.
      • Jumps in the GPS cause the controls to go haywire. We mitigated this by moving things into a continuous odom frame, but we found little difference. We think the error lies elsewhere and need to go through the recorded datasets to discern it (a sketch of a simple GPS jump gate appears at the end of this report).
      • Mitigation plan 1: try sidewalk following using some basic feedforward into the controls. We have yet to try this.
      • We tried the roof of the East Campus Garage, and had some GPS errors there. We may try again after analyzing the data from that run.
    • Worked with Michael on the data association; it works well with multiple pedestrians. The perception part of our project is now done.
  • Progress:
    • Not much progress since last week; we are still field testing and working out the kinks in our local planner.
  • Next week’s deliverables:
    • Make sure the robot drives properly
      • Localization
        • Mitigation plan: do everything in the odom frame and hardcode the map
      • Sidewalk following
    • Final slides
    • Final video
    • Final report
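
For reference, here is a minimal sketch of gating out GPS jumps before they reach the controls: any fix that lands implausibly far from the last accepted fix is dropped, letting odometry coast through the outlier. The 3 m threshold is an illustrative value, not something we have tuned.

```python
import math

class GpsJumpGate:
    """Reject GPS fixes that jump implausibly far from the last accepted fix."""

    def __init__(self, max_jump_m=3.0):
        self.max_jump_m = max_jump_m  # illustrative threshold; tune on real data
        self.last = None              # last accepted (easting, northing) in UTM

    def accept(self, easting, northing):
        """Return True if the fix is plausible, False if it should be dropped."""
        if self.last is None:
            self.last = (easting, northing)
            return True
        jump = math.hypot(easting - self.last[0], northing - self.last[1])
        if jump > self.max_jump_m:
            return False              # outlier: keep coasting on odometry
        self.last = (easting, northing)
        return True
```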

Michael’s Status Report for 5/1/2021

  • Accomplishments for the week:
    • Worked with Advaith to write a data association program which can track multiple pedestrians at once.
    • Wrote a simple P controller which can drive to waypoints using the robot’s heading (a sketch appears at the end of this report). It worked relatively well for one day, but then the robot started randomly driving off into the grass, away from the waypoint, more and more often. We are debugging this right now.
    • Tested the robot with the team.
  • Progress:
    • We’re focusing on our localization mitigation plans.
  • Next week’s deliverables:
    • Final presentation
    • The robot should be able to drive using either GPS localization or pure distance-based odometry.
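
For reference, a minimal sketch of the waypoint P controller described above, assuming a planar (x, y, yaw) pose; the gain and speed limit are illustrative placeholders, not our tuned values.

```python
import math

def p_control(pose, waypoint, kp_ang=1.5, v_max=0.5):
    """One control step of a proportional heading controller toward a waypoint.

    pose: (x, y, yaw) in a fixed frame, yaw in radians
    waypoint: (x, y) goal in the same frame
    Returns (linear_velocity, angular_velocity).
    """
    x, y, yaw = pose
    gx, gy = waypoint
    desired = math.atan2(gy - y, gx - x)
    # Wrap the heading error into [-pi, pi] so the robot turns the short way.
    error = math.atan2(math.sin(desired - yaw), math.cos(desired - yaw))
    angular = kp_ang * error
    # Slow down when badly misaligned so the robot turns in place first.
    linear = v_max * max(0.0, math.cos(error))
    return linear, angular
```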

Sebastian’s Status Report for 5/1/2021

  • Accomplishments
    • Wrote local planner subgoal interpolation
    • Increased the GPS fix covariance in an attempt to smooth our Kalman filter output; we think our IMU should be relied on more heavily than the GPS since it has less jitter (a sketch of the effect appears at the end of this report).
    • Ran many trials of the robot at various locations on campus, collecting sensor data with the team.
    • [Image] Example of some of the localization data collected: the red line is our filtered localization, and the blue line is the raw GPS fix.
  • Progress Timeline
    • Our progress has not regressed
  • Next Week Deliverables
    • Functioning localization
    • Final Presentation
    • A delivery robot
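
To illustrate why inflating the GPS covariance smooths the filter output, here is a scalar Kalman measurement update: the larger the measurement noise r, the smaller the gain, and the less a jittery fix pulls the estimate. All numbers are illustrative.

```python
def kalman_update(x, p, z, r):
    """Scalar Kalman measurement update; larger r means the fix is trusted less."""
    k = p / (p + r)          # Kalman gain shrinks as measurement noise r grows
    x_new = x + k * (z - x)  # estimate is pulled toward the measurement by k
    p_new = (1.0 - k) * p
    return x_new, p_new

# A jittery GPS fix 2 m from the current estimate (state variance p = 1):
print(kalman_update(x=0.0, p=1.0, z=2.0, r=0.5))   # small r: estimate jumps ~1.33 m
print(kalman_update(x=0.0, p=1.0, z=2.0, r=10.0))  # inflated r: moves only ~0.18 m
```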

Michael’s Status Report for 4/24/21

  • Accomplishments for the week:
    • Over the past week, I worked with the team to fix the robot localization integration with GPS, IMU, and wheel odometry. Much of this involved field testing, recording rosbags of sensor data, and scouring the ROS forums for clues as to why the global odometry and local odometry didn’t line up.
    • I generalized the global planner to take in text files of UTM offsets for nodes and of edges between the nodes, build a graph from those files, and then output the UTM offsets for the waypoints. This allows the local planner to get waypoints directly in the UTM frame (a sketch of this graph construction appears at the end of this report).
    • I worked with Advaith on pedestrian state estimation.
  • Progress:
    • Currently I am on-track.
  • Next week’s deliverables:
    • Fix the localization once and for all
    • Integrate the global planner with the working system
    • Test the robot against our initial metrics
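
A minimal sketch of that graph construction and search. The file formats here (one "name easting northing" node per line, one "name_a name_b" edge per line) and the Dijkstra search are assumptions for illustration; the real planner’s formats may differ.

```python
import heapq
import math

def load_graph(nodes_path, edges_path):
    """Build a graph from text files of UTM offsets (assumed formats above)."""
    nodes = {}
    with open(nodes_path) as f:
        for line in f:
            name, easting, northing = line.split()
            nodes[name] = (float(easting), float(northing))
    adj = {name: [] for name in nodes}
    with open(edges_path) as f:
        for line in f:
            a, b = line.split()
            cost = math.dist(nodes[a], nodes[b])  # edge cost = straight-line distance
            adj[a].append((b, cost))
            adj[b].append((a, cost))
    return nodes, adj

def waypoints_between(nodes, adj, start, goal):
    """Dijkstra over the graph; returns the path as UTM-offset waypoints."""
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return [nodes[n] for n in path]
        if node in visited:
            continue
        visited.add(node)
        for nxt, step in adj[node]:
            if nxt not in visited:
                heapq.heappush(frontier, (cost + step, nxt, path + [nxt]))
    return None  # goal unreachable
```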

Team Status Report for 4/24/2021

  • Current Risks and Mitigation
    • We have spent a long time debugging the GPS-based localization. One risk is that these issues never get resolved. We will be field testing the localization within the next day; if it doesn’t work with the current setup, we will switch to a single Kalman filter that takes in all inputs and localizes within the UTM frame.
    • Another risk is that the perception system may be pushed back due to the delay in localization. Most of that code has been written except for the data association part, which shouldn’t take too long.
  • Changes to System Design
    • After evaluating ORBSLAM2, we decided to use the GPS+IMU+wheel encoder localization method. The reasoning is that the non-visual method is getting good enough results and we do not think that it is worth the time needed to integrate another odometry source.
  • Schedule Changes
    • We have a lot of work to squeeze into this last week.
      • Data association on pedestrians
      • Verifying localization
      • Testing the local planner with the new localization
      • All the required metrics testing
  • Progress Pictures/Videos
    • Some of the issues we are facing with a two-level state estimation system. Sometimes the local/global odometry gets misaligned (green/red – global/local odometry, blue – GPS odometry).
      • We fixed this by verifying the data inputs and fixing magnetic declination and yaw offset parameters
    • This is a better run; as you can see, the raw GPS data aligns well with the integrated odometry. We hope to get another validation run like this and perhaps log waypoints.
    • Here’s a video of the constant-velocity Kalman filter tracker that we will use on pedestrians: MATLAB KF Tracker (a minimal sketch of the filter appears at the end of this report)
    • Here’s a video of the same algorithm tracking me in 3D (red is filtered, blue is raw detections): Tracking in 3D Real World
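
The tracker in the videos is implemented in MATLAB; for reference, here is a minimal 2D constant-velocity Kalman filter sketch in Python. The process and measurement noise values are illustrative, not our tuned parameters.

```python
import numpy as np

class ConstantVelocityKF:
    """2D constant-velocity Kalman filter; the state is [x, y, vx, vy]."""

    def __init__(self, q=0.1, r=0.5):
        self.x = np.zeros(4)        # state estimate
        self.P = np.eye(4) * 10.0   # state covariance (uncertain start)
        self.Q = np.eye(4) * q      # process noise (illustrative)
        self.R = np.eye(2) * r      # measurement noise (illustrative)
        self.H = np.array([[1.0, 0.0, 0.0, 0.0],
                           [0.0, 1.0, 0.0, 0.0]])  # we observe position only

    def predict(self, dt):
        """Propagate the state assuming constant velocity over dt seconds."""
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt      # position integrates velocity
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z):
        """Correct the state with a measured (x, y) detection."""
        y = np.asarray(z, dtype=float) - self.H @ self.x  # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```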

Advaith’s Status Report for 4/24/21

  • Accomplishments for the week:
    • Integrated the Kalman filter to work in 3D and tested it on live data: Kalman 3D
    • Debugged the GPS localization on campus and took datasets for analysis
    • Below left are the GPS readings integrated with IMU/wheel odometry; on the right is the raw GPS data. This is only one trial, so we need to retry it when weather permits.
    • Modified the local planner with a new get_next_subgoal function
  • Progress:
    • We have a lot of integration to do during this last week
    • We need to get the localization up and running soon and drive a few basic waypoint routes on campus
    • We must pray to the robot gods for a smooth finish for this project
  • Next week’s deliverables:
    • Robot drives itself with global waypoints
    • Pedestrian data association and tracking (a sketch of one association approach appears at the end of this report)
    • Integrate robot state machine and local planner
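
For reference, a minimal sketch of one way to do the pedestrian data association: match detections to existing tracks with the Hungarian algorithm over pairwise distances, dropping matches beyond a distance gate. The gate value is illustrative, and this is not necessarily the exact method in our code.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(tracks, detections, gate_m=1.0):
    """Match detections to tracks by minimizing total pairwise distance.

    tracks, detections: lists of (x, y) positions.
    Returns (track_index, detection_index) pairs; pairs farther apart than
    gate_m (an illustrative gate) are discarded as missed or new targets.
    """
    if len(tracks) == 0 or len(detections) == 0:
        return []
    tracks = np.asarray(tracks, dtype=float)
    detections = np.asarray(detections, dtype=float)
    # Cost matrix of Euclidean distances between every track and detection.
    cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)  # Hungarian algorithm
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate_m]
```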