Status Report (5/04)

Group Update

Amukta’s Update

  • Worked on final report
  • Set up robot wireless visualization (integration with web app)
  • Worked on wireless tele-op commands on the RPi

Kanupriyaa’s Update

I focused on determining which grid resolution gives the best mapping for the ICP. I also looked into pose graph optimization, as mentioned in the design report, to see whether it could improve the ICP further, but the implementation time is too large to finish before the final demo, so I have decided not to pursue it. Amukta and I also met to test the robot wirelessly by SSHing into the Pi and simulating the final demonstration.
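
As a rough illustration of the resolution trade-off, scan points can be binned into a square occupancy grid whose cell size is the resolution; the sketch below is illustrative only (function name, map size, and numbers are placeholders, not our actual implementation):

```python
import numpy as np

def points_to_grid(points, resolution, map_size):
    """Bin 2-D scan points (in metres) into a square occupancy grid.

    resolution : cell edge length in metres (e.g. 0.05)
    map_size   : edge length of the square map in metres
    """
    n = int(map_size / resolution)          # cells per side
    grid = np.zeros((n, n), dtype=np.uint8)
    # shift points so the map is centred on the origin
    idx = np.floor((points + map_size / 2) / resolution).astype(int)
    # keep only points that fall inside the map
    ok = np.all((idx >= 0) & (idx < n), axis=1)
    grid[idx[ok, 1], idx[ok, 0]] = 1        # mark cells as occupied
    return grid

# a coarser resolution merges nearby points into a single cell
pts = np.array([[0.0, 0.0], [0.02, 0.01], [1.0, 1.0]])
print(points_to_grid(pts, 0.05, 4.0).sum())  # → 2 occupied cells
```

At a finer resolution, the first two points would fall into separate cells, so the choice of cell size directly controls how much sensor noise is smoothed away versus how much wall detail survives.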

Tiffany’s Update

  • Recalculated the Twist-message-to-motor-command conversions using the updated wheel and robot dimensions, and fixed more bugs in the robot’s turning / angular velocity conversions.
  • Redid the setup for SSHing into the Raspberry Pi with X11 forwarding (I had to reimage my laptop recently due to software issues).
  • Integrated SLAM and motor control.
  • Tested the teleoperation functions and tuned the allowed min/max linear and angular velocities on the UI for easier control.
  • Miscellaneous: worked on the poster and final report, and cleaned up the CAD file to add screenshots to the final report.
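
For reference, the Twist-to-wheel-speed conversion for a differential drive base can be sketched as below; the wheel radius and wheel base here are placeholder numbers, not the robot’s measured dimensions:

```python
def twist_to_wheel_speeds(linear_x, angular_z, wheel_radius=0.03, wheel_base=0.15):
    """Convert a Twist command to left/right wheel angular velocities (rad/s).

    linear_x  : forward speed in m/s (Twist.linear.x)
    angular_z : turn rate in rad/s   (Twist.angular.z)
    wheel_radius and wheel_base are placeholder dimensions.
    """
    v_left = linear_x - angular_z * wheel_base / 2.0   # m/s at the left wheel
    v_right = linear_x + angular_z * wheel_base / 2.0  # m/s at the right wheel
    return v_left / wheel_radius, v_right / wheel_radius

# pure rotation: the wheels spin equally fast in opposite directions
left, right = twist_to_wheel_speeds(0.0, 1.0)  # left ≈ -2.5, right ≈ +2.5 rad/s
```

Because the wheel base appears directly in the angular term, stale dimensions bias every turn, which is why recalculating these conversions after the hardware changes mattered.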

Status Report (4/27)

Group Update

  • Overall, we are working on reducing the error in the two major components (encoder and SLAM), so that move_base can successfully move the robot.
  • We will be spending the first half of the upcoming week on the final presentation and poster.
  • Our next goal is to test with tele-op, and then integrate move_base so that the robot is autonomous.

Amukta’s Update

  • I converted the visualization web app to a grid (from the scatterplot), and changed its endpoints to work with the map the ICP algorithm produced.
  • ICP now produces a higher-resolution map and includes empty hallway spaces. The focus for next week will be to reduce the “drift” (accumulated error) in the scans.
  • For next week, I want to connect the physical robot to the visualization app, so that we can test with tele-op (rather than wired).
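
The payload the grid endpoints exchange can be sketched roughly as follows; the endpoint name and field layout are assumptions, not the app’s actual API:

```python
import json

def map_payload(grid):
    """Serialize an occupancy grid (0 = free, 1 = occupied) as the JSON
    body a hypothetical /map endpoint might return to the web app,
    which redraws the grid on each poll."""
    return json.dumps({
        "rows": len(grid),
        "cols": len(grid[0]) if grid else 0,
        "cells": grid,
    })

print(map_payload([[0, 1], [1, 0]]))
```

Sending explicit row/column counts lets the front end size its grid canvas before iterating the cells, which is simpler than inferring dimensions from scatterplot coordinates.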

Kanupriyaa’s Update

Since ICP was completed last week, I focused on improving the response time and the precision of the grid mapping this week. The software now updates in real time, and the map size can be adjusted to match the square area being mapped. For next week I will improve the ICP by adding empty corridors to the map.

Tiffany’s Update

Hardware

  • Made changes to the robot’s CAD to achieve a more stable base
  • Laser-cut the components out of the final material instead of the cardboard used for prototyping, and assembled the final robot
  • Soldered and rewired the robot so that it is now fully wireless (except when we need a keyboard, mouse, and monitor to view RViz)

Software

  • Fixed bug with ROS parameters not being set properly, which had caused the odometry to be incorrect.
  • Attempted to tune the PID parameters again by plotting expected velocity against odometry-detected velocity, but discovered that motor speed PID control was not working as intended: the encoder-reading ROS node was missing some encoder ticks. The ticks were counted serially with publishing the encoder values to ROS, which introduces a delay and caps the odometry-detected velocity based on the publishing rate. As a result, tuning the I or D gains caused the motors to always ramp gradually toward 100% duty cycle.
  • Attempted to fix the above problem by adding threads to my original Python script, but this did not help much since CPython’s global interpreter lock prevents true parallel execution.
  • Fixed the above problem by rewriting the encoder-reading script in C++
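
The saturation effect can be sanity-checked with a quick back-of-the-envelope calculation; every number below is a placeholder, not our robot’s actual spec:

```python
import math

def max_measurable_speed(ticks_per_rev, max_ticks_per_cycle, publish_rate_hz,
                         wheel_radius):
    """Upper bound on the odometry-detected wheel speed when ticks are
    counted serially with publishing: at most `max_ticks_per_cycle` ticks
    register between publishes, so the measurement saturates at this value
    even if the wheel actually spins faster."""
    revs_per_sec = max_ticks_per_cycle * publish_rate_hz / ticks_per_rev
    return 2 * math.pi * wheel_radius * revs_per_sec

# e.g. 20 ticks/rev, 5 ticks caught per 10 Hz publish cycle, 3 cm wheels
print(max_measurable_speed(20, 5, 10, 0.03))  # ≈ 0.47 m/s cap
```

Any velocity setpoint above this cap leaves a permanently positive error, so the integral term keeps winding the duty cycle up toward 100%, matching the ramping behaviour observed above.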

Tiffany’s Summary

SLAM and the robot base can now be operated remotely from my laptop through a single SSH window.

Status Report (4/6)

Group Update

This week we had midpoint demos, so the focus was mainly on integrating the Python scripts into ROS and testing ROS topics. On the SLAM side, we were able to display static maps in RViz. There is still some localization error in the SLAM code: moving the RPLidar causes the entire map to shift across the screen instead of new scan data being added in the correct positions relative to the known areas of the map. On the motor control side, the on-screen joystick control, odometry, and motor control were integrated, so we will be able to mount the RPLidar on the base and tele-op the robot around for testing. For our midpoint demo, we showed these two parts of the system separately, since they have not yet been integrated.
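
For context on the map-shifting symptom: each new scan must be transformed into the map frame using the current pose estimate before being merged, so any error in that estimate displaces the entire scan. A minimal sketch (not our actual SLAM code):

```python
import numpy as np

def scan_to_map_frame(scan_xy, pose):
    """Transform 2-D scan points from the sensor frame into the map frame.

    pose = (x, y, theta) is the robot's estimated pose in the map. If the
    estimate is wrong, the whole scan lands in the wrong place, which
    looks like the map "shifting" rather than growing.
    """
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    rotation = np.array([[c, -s], [s, c]])
    return scan_xy @ rotation.T + np.array([x, y])

# a point 1 m ahead of a robot facing +y lands at (0, 1) in the map
print(scan_to_map_frame(np.array([[1.0, 0.0]]), (0.0, 0.0, np.pi / 2)))
```

Reducing the localization error therefore fixes the shifting directly: once the pose estimate is right, new points line up with the known parts of the map.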

Amukta’s Update

I continued to work on ICP with Kanupriyaa.

Kanupriyaa’s Update

I was unwell this week and fell somewhat behind schedule. I was supposed to complete the testing for ICP and finish its mapping portion. I am still working on both and believe I will need all of next week to do so, so I am one week behind schedule.

Tiffany’s Update

I wrapped up my motor control pipeline and its migration to ROS, and tested the motor control ROS nodes (encoder input node, motor controller output node, and motor speed PID / differential drive control node) on the robot and in RViz. I also added an on-screen joystick control node from an existing ROS package so that I could demo and test motor control by simulating the motor commands that will eventually come from the SLAM and path planning portions of the ROS network.
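
The speed-control node’s PID loop can be sketched roughly as follows; the gains and clamp range are placeholders, not the values tuned on the robot:

```python
class PID:
    """Minimal PID speed controller producing a duty-cycle command."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        """One control step: setpoint and measured are wheel speeds, dt in s."""
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # clamp to the motor driver's duty-cycle range
        return max(-100.0, min(100.0, out))

pid = PID(kp=10.0, ki=1.0, kd=0.0)
duty = pid.update(setpoint=1.0, measured=0.2, dt=0.1)  # duty-cycle command
```

Because the integral term accumulates any persistent error, a measurement that saturates below the setpoint (as with the missed encoder ticks) will wind this output up to the clamp, which is the behaviour seen during tuning.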