Found a bug with George: the transform from the map frame to the odom frame was being published by two sources. This is incorrect, as AMCL (localization) alone should be responsible for publishing that transform.
Fixing this bug also fixed initial pose estimation, allowing the robot to start anywhere in the room (previously it had to start at the specific spot where the SLAM map began).
Localization still drifts, which is a cause for concern.
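For reference, the map-to-odom correction that localization publishes can be derived from its map-to-base estimate and the odometry's odom-to-base estimate. A minimal SE(2) sketch in plain Python (this is illustrative math, not our actual ROS code):

```python
import math

def compose(a, b):
    """Compose two SE(2) transforms (a * b), each given as (x, y, theta)."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + math.cos(ath) * bx - math.sin(ath) * by,
            ay + math.sin(ath) * bx + math.cos(ath) * by,
            ath + bth)

def invert(t):
    """Invert an SE(2) transform (x, y, theta)."""
    x, y, th = t
    return (-math.cos(th) * x - math.sin(th) * y,
            math.sin(th) * x - math.cos(th) * y,
            -th)

def map_to_odom(map_base, odom_base):
    """The correction localization should own:
    map->odom = map->base composed with inverse(odom->base).
    If two nodes publish this transform they fight, and the pose jumps."""
    return compose(map_base, invert(odom_base))
```

Because only one node owns this correction, a second publisher makes the robot's pose in the map frame oscillate between the two estimates, which is exactly the symptom we saw.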
Lowered the navigation control loop update rates. This was necessary because the hardware we are using is not powerful enough to handle update rates fast enough for smooth movement.
Set up the Jetson AGX with a USB Wi-Fi adapter for SSH. Although the AGX is a bit faster than the NX, it often freezes during development over SSH. We decided that a better development experience is more valuable at the moment, so we will continue using the NX.
Worked with George to implement a simple program that lets the Roomba follow the movement of an ArUco tag.
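The follower is essentially a proportional controller on the tag's position in the camera image. A simplified sketch of the idea (the gains, limits, and the tag-area-as-distance proxy here are illustrative values, not our exact numbers):

```python
def follow_cmd(tag_cx, img_width, tag_area, target_area,
               k_ang=2.0, k_lin=0.5, max_lin=0.3, max_ang=1.0):
    """Map an ArUco detection to (linear, angular) velocity commands.
    Turn toward the tag's horizontal offset; drive forward or back until
    the tag's apparent area matches a target (a rough distance proxy)."""
    # Normalized horizontal error in [-1, 1]; positive = tag right of center.
    err_x = (tag_cx - img_width / 2) / (img_width / 2)
    angular = max(-max_ang, min(max_ang, -k_ang * err_x))
    # Area error: tag smaller than target -> too far away -> drive forward.
    err_a = (target_area - tag_area) / target_area
    linear = max(-max_lin, min(max_lin, k_lin * err_a))
    return linear, angular
```

When the tag is centered and at the target distance, both commands go to zero and the robot holds position.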
Schedule
Behind on navigation. Need to use slack time to finish.
Deliverables
Fine-tune navigation
Testing Plans
Not ready for navigation testing; a lot of fine-tuning remains. Some simple tests can still be run: a battery test driving the Roomba in a back-and-forth pattern at full throttle, and intermittently running the servos used for bin docking. We will also continue testing localization to support navigation fine-tuning.
We will test ArUco tag orientation to simulate trash-bin pickup, using a tag attached to a wall.
Continued working on navigation bringup with George.
Partially working: we can give a goal pose and the robot follows the planned path.
Still finicky: the robot needs to start at a specific point for path planning to work. (Localization and path planning are out of sync for some reason.)
Doesn't handle all types of obstacles well; needs more testing and optimization.
Worked on LaserScanMatcher to derive odometry from laser scans alone (we are not going to use LaserScanMatcher for now because we get odometry data from the Roomba).
Building the Roomba library from source (a newer version than the one available in the Ubuntu repositories) and changing the baud rate enabled the odom topic in ROS.
It worked well enough to get simple path finding running.
Need to do more testing to confirm that it is fully working and accurate.
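With the odom topic enabled, the Roomba reports the distance and angle traveled since the last sensor read, and the driver integrates those deltas into a pose. A dead-reckoning sketch of that integration (simplified; the real driver handles encoder quirks we don't model here):

```python
import math

def integrate_odom(pose, distance_m, angle_rad):
    """Dead-reckon one odometry step. pose is (x, y, theta); the Roomba's
    Open Interface reports distance and angle traveled since the last read,
    which we integrate by approximating the arc with a chord taken at the
    midpoint heading."""
    x, y, theta = pose
    mid = theta + angle_rad / 2.0
    return (x + distance_m * math.cos(mid),
            y + distance_m * math.sin(mid),
            theta + angle_rad)
```

Errors in these per-step deltas accumulate, which is why unreliable Roomba odometry shows up as drift in everything downstream (navigation, mapping).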
Schedule
On schedule, fine-tuning navigation
Deliverables
Check in on the USB Wi-Fi adapter package shipping timeline (Quin was out)
Successfully created a global planned path from the robot to the target location.
Set up gmapping as an alternative SLAM mapping engine. It produced a poorer-quality map than Hector SLAM.
Debugged faulty navigation. When attempting to follow a planned path, the robot makes no progress; it just spins in circles. We have identified a possible error in the odometry data.
The odometry data provided by the Roomba seems very unreliable, and many people have reported the same issue online. This may be causing the navigation/gmapping problems, as both rely on odometry.
We have identified an alternative method of gathering odometry data: using laser scans to emulate it.
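The core idea of scan-based odometry is to find the transform that best aligns consecutive scans. Real scan matchers do full 2D ICP-style alignment of translation and rotation; as a toy illustration of the principle, rotation alone can be estimated by brute-force circular-shift matching (not production code):

```python
def estimate_rotation(prev_scan, curr_scan):
    """Estimate pure rotation between two laser scans sampled at equal
    angular increments, by finding the circular shift that minimizes
    squared range error. Returns the signed shift in beams
    (degrees = shift * 360 / len(scan))."""
    n = len(prev_scan)
    best_shift, best_err = 0, float("inf")
    for s in range(n):
        err = sum((prev_scan[(i + s) % n] - curr_scan[i]) ** 2
                  for i in range(n))
        if err < best_err:
            best_shift, best_err = s, err
    # Map the shift into a signed rotation around zero.
    return best_shift if best_shift <= n // 2 else best_shift - n
```

The upside of this family of methods is that they depend only on the LiDAR, sidestepping the Roomba's wheel odometry entirely.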
Schedule
On schedule, debugging navigation
Deliverables
Check in on the USB Wi-Fi adapter package shipping timeline.
Ordered a Wi-Fi module for the Jetson Xavier AGX. This will allow remote control and monitoring.
While we wait for the Wi-Fi module to arrive, I set up the backup Jetson Xavier NX. The NX comes with a Wi-Fi module installed, so I was able to SSH into it over CMU-SECURE right away.
Installed ROS on the Xavier NX
Used the Python modules pycreate2 and sshkeyboard to control the Roomba over SSH.
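The teleop boils down to a key-to-wheel-speed mapping whose output is fed to pycreate2's drive-direct command (velocities in mm/s; check the library for the exact argument order). A sketch of the mapping (the key bindings and speed value here are illustrative, not necessarily what we shipped):

```python
def key_to_wheel_speeds(key, speed=200):
    """Map a pressed key (as reported by sshkeyboard) to (left, right)
    Roomba wheel velocities in mm/s, for pycreate2's drive-direct command.
    Unrecognized keys stop the robot, which is a safe default over a
    laggy SSH link."""
    bindings = {
        "w": (speed, speed),    # forward
        "s": (-speed, -speed),  # backward
        "a": (-speed, speed),   # spin left
        "d": (speed, -speed),   # spin right
    }
    return bindings.get(key, (0, 0))
```

In the real script, sshkeyboard's listen_keyboard callback calls this on every key press and forwards the result to the Create 2 connection.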
Installed Hector SLAM on the Xavier NX
Used Hector SLAM to map HH1307 with the team. When mapping near the windows of the room, the map became very inaccurate because the LiDAR could not identify the window as a wall. We were using tethered driving to map, so we had to follow the robot closely, which may have impeded its mapping accuracy. Now that I have remote control working, we will attempt mapping again to see if the issue persists. I also want to test our alternative LiDAR sensor (Intel L515) to see if it has the same issue.
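The window problem makes sense given how occupancy mapping treats beams: a beam that passes straight through glass returns nothing (or max range), so the cells along it get cleared but no wall is ever marked. A toy grid-update sketch illustrating this (Hector SLAM's actual update is far more sophisticated; the vote weights and resolution here are arbitrary):

```python
import math

def update_grid(grid, pose, angle, rng, max_range=8.0, res=0.1):
    """Update a dict-based occupancy grid from a single laser beam.
    Cells along the beam get a negative (free) vote; the endpoint gets a
    positive (occupied) vote -- unless the beam returned nothing
    (rng >= max_range), e.g. because it passed through glass, in which
    case free space is cleared but no wall is ever marked."""
    x0, y0 = pose
    for k in range(int(rng / res)):
        d = k * res
        cell = (round((x0 + d * math.cos(angle)) / res),
                round((y0 + d * math.sin(angle)) / res))
        grid[cell] = grid.get(cell, 0) - 1
    if rng < max_range:
        end = (round((x0 + rng * math.cos(angle)) / res),
               round((y0 + rng * math.sin(angle)) / res))
        grid[end] = grid.get(end, 0) + 2
    return grid
```

So a glass wall ends up looking like open space in the map, which matches what we observed near the windows.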
Completed ethics assignment.
Schedule
Did not start navigation, as I focused on getting the NX set up and getting remote movement working. I'm going to work on navigation this Sunday with the team to make up the time.
Deliverables
Set up the Wi-Fi module on the Xavier AGX once it is delivered
Initially got SSH working on Windows, but after a system reset it failed to connect
Using Ubuntu 20.04 and refreshing the Jetson (without updating any packages) seems to have alleviated the issues, but just in case we have requested the Xavier NX as a backup
MATLAB
Connected MATLAB to the Jetson
Compiled a test MATLAB script into a C++ executable and deployed it remotely to the Jetson
Helped set up Roomba driving
Schedule
Again behind on LiDAR because of continuing issues with the Jetson
Will have to use Spring Break to make up one week of work
Deliverables
Create a MATLAB-to-C++ executable that takes LiDAR scan data (angle and distance vectors) and creates a map object
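The scan-to-map step starts by converting the polar scan vectors into Cartesian points. Sketched here in Python for illustration (the deliverable itself targets MATLAB-generated C++, and a real map object would accumulate these points over many scans):

```python
import math

def scan_to_points(angles, distances):
    """Convert LiDAR scan vectors (angle in radians, distance in meters)
    into Cartesian (x, y) points in the sensor frame. Non-positive
    distances are treated as invalid returns and dropped."""
    return [(d * math.cos(a), d * math.sin(a))
            for a, d in zip(angles, distances) if d > 0]
```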
Got the LiDAR driver set up on the development machine. Ran a test application to make sure the LiDAR was working properly.
Finalized and practiced for the design presentation.
Got ArUco tag identification working with George.
Am I on schedule?: Because I focused on the presentation and on setting up the Jetson, which took longer than expected (I'm still having trouble getting it to connect over Ethernet), I am behind on the LiDAR implementation. If I can't troubleshoot it, I will just buy a USB network adapter instead of using Ethernet.
What deliverables do I want to complete next week?:
Compile the MATLAB SLAM & Navigation Toolbox code for the Jetson.
Get ArUco pose estimation working
Please list the particular ECE courses (if any) that covered the engineering science and mathematics principles your team used to develop your design?:
18-100: We used knowledge from 18-100 when calculating the required battery size our robot would need to meet our use-case requirements.
The knowledge required for ArUco tag identification came from learning outside of school.
This week I began researching SLAM and related algorithms; specifically, I am interested in combining LiDAR depth data with visual landmarks (AprilTags) to improve mapping.
Additionally, my groupmates and I discussed the benefits and challenges of using a 360° LiDAR system vs. a fixed LiDAR system. (It may also be possible to filter the 360° scan down to a fixed sector.)
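Filtering the 360° scan down to a fixed sector is straightforward as long as beam indices stay aligned. A sketch assuming one beam per degree with beam 0 facing forward (the field of view is an arbitrary example):

```python
def filter_sector(scan, fov_deg=90):
    """Keep only a forward sector of a 360-beam scan (one range reading
    per degree, beam 0 facing forward), emulating a fixed LiDAR with the
    given field of view. Out-of-sector beams become None so indices
    stay aligned with the original scan."""
    half = fov_deg // 2
    return [r if (i < half or i >= 360 - half) else None
            for i, r in enumerate(scan)]
```

This would let us prototype with the 360° unit while still evaluating how a fixed forward-facing sensor would behave.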
I helped fix our website to properly display weekly status report posts by category/team member.
Am I on schedule?: Yes, no direct tasks were assigned this week.
What deliverables do I want to complete next week?:
If I get access to the Roomba next week, I would like to complete basic movement control while tethered to a PC.
If I get access to the LiDAR camera early next week, I would like to attempt implementing SLAM using a premade library (maybe MATLAB or slam_toolbox).
If I get access to a camera, my goal is to implement local bin identification.