Status Report (5/04)

Group Update

Amukta’s Update

  • Worked on final report
  • Set up robot wireless visualization (integration with web app)
  • Worked on wireless tele-op commands on the RPi

Kanupriyaa’s Update

I focused on determining which grid resolution gives the best mapping for the ICP. I also looked into pose graph optimization, as mentioned in the design report, to see if it could make the ICP even better, but the implementation time is too large to finish before the final demo, so I have decided not to pursue it. Amukta and I also met to test the robot wirelessly by SSHing into the Pi and simulating the final demonstration.

Tiffany’s Update

  • Recalculated the Twist message to motor command conversions using the updated wheel and robot dimensions, and fixed more bugs in the robot's turning / angular velocity conversions.
  • Redid the setup for SSHing into the Raspberry Pi with X11 forwarding (I had to reimage my laptop recently due to software issues).
  • Integrated SLAM and motor control.
  • Tested tele-operation functions and tuned the allowed range for min/max linear and angular velocity on the UI for easier control.
  • Miscellaneous: worked on the poster and final report, and cleaned up the CAD file to add screenshots to the final report.
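The Twist-to-motor-command conversion mentioned above can be sketched with standard differential-drive kinematics. The wheel radius and track width below are placeholder values, not our robot's actual dimensions:

```python
# Differential-drive kinematics sketch: convert a ROS Twist (linear.x,
# angular.z) into left/right wheel angular velocities.
# WHEEL_RADIUS and TRACK_WIDTH are hypothetical, not measured values.

WHEEL_RADIUS = 0.03   # wheel radius in meters (hypothetical)
TRACK_WIDTH = 0.15    # distance between wheel centers in meters (hypothetical)

def twist_to_wheel_speeds(linear_x, angular_z):
    """Return (left, right) wheel angular velocities in rad/s."""
    # Each wheel's linear velocity along the ground:
    v_left = linear_x - angular_z * TRACK_WIDTH / 2.0
    v_right = linear_x + angular_z * TRACK_WIDTH / 2.0
    # Convert linear wheel speed to angular wheel speed.
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS
```

A pure rotation (linear_x = 0) makes the wheels spin in opposite directions, which is where a sign error in the angular-velocity conversion shows up immediately.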

Status Report (4/27)

Group Update

  • Overall, we are working on reducing the error in the two major components (encoder and SLAM), so that move_base can successfully move the robot.
  • We will be spending the first half of the upcoming week on the final presentation and poster.
  • Our next goal is to test with tele-op, and then integrate move_base so that the robot is autonomous.

Amukta’s Update

  • I converted the visualization web app to a grid (from the scatterplot), and changed its endpoints to work with the map the ICP algorithm produced.
  • ICP now produces a higher resolution map and has empty hallway spaces. The focus for next week will be to reduce the “drift” or accumulated error in the scans.
  • For next week, I want to connect the physical robot to the visualization app, so that we can test with tele-op (rather than wired).
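The scatterplot-to-grid conversion above boils down to quantizing world-frame points into cell indices. A minimal sketch, with the resolution and origin as illustrative values rather than the app's real configuration:

```python
# Sketch: mark occupancy-grid cells that contain at least one scan point,
# as when moving from a scatterplot to a grid visualization.
# RESOLUTION and ORIGIN are illustrative placeholders.

RESOLUTION = 0.05          # meters per cell (hypothetical)
ORIGIN = (-2.0, -2.0)      # world coordinates of cell (0, 0) (hypothetical)

def point_to_cell(x, y):
    """Map a world-frame point to (row, col) grid indices."""
    col = int((x - ORIGIN[0]) / RESOLUTION)
    row = int((y - ORIGIN[1]) / RESOLUTION)
    return row, col

def build_grid(points, size=80):
    """Return a size x size grid with occupied cells set to 1."""
    grid = [[0] * size for _ in range(size)]
    for x, y in points:
        r, c = point_to_cell(x, y)
        if 0 <= r < size and 0 <= c < size:
            grid[r][c] = 1
    return grid
```

Shrinking RESOLUTION gives a higher-resolution map at the cost of more cells to store and render.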

Kanupriyaa’s Update

Since the ICP was completed last week, I focused on improving the response time and precision of the grid mapping. The software now updates quickly in real time, and the map can be adjusted to the size of the square area being mapped. Next week I will improve the ICP by adding empty corridors to the map.

Tiffany’s Update

Hardware

  • Made changes to the robot CAD to achieve a more stable robot base
  • Lasercut components out of final material instead of the cardboard used for prototyping, and assembled the final robot
  • Soldered and rewired robot so that it’s now fully wireless (except for when we need keyboard, mouse, and monitor to view Rviz)

Software

  • Fixed bug with ROS parameters not being set properly, which had caused the odometry to be incorrect.
  • Attempted to tune the PID parameters again by plotting expected velocity against odometry-detected velocity, but discovered that the motor speed PID control was not working as intended because the encoder-reading ROS node was missing some encoder ticks. The ticks were being counted serially with publishing the encoder values to ROS, which introduced a delay and capped the odometry-detected velocity at the publishing rate. As a result, tuning the I or D terms of the PID control caused the motors to gradually ramp up toward 100% duty cycle.
  • Attempted to fix this by adding multi-threading to my original Python script, but this didn't help much since Python's GIL prevents true parallel threads.
  • Fixed the problem by rewriting the encoder-reading script in C++.
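For reference, the tick counting itself is a small state machine over the A/B channel states. The sketch below shows standard quadrature decoding in pure, hardware-independent Python (`count_ticks` is a hypothetical helper for illustration); the fix described above keeps this hot loop fast and separate so that ROS publishing no longer delays counting:

```python
# Standard quadrature decoding sketch: each (prev, curr) pair of the A/B
# channel states maps to a tick of +1, -1, or 0. Invalid two-bit jumps
# (missed edges) map to 0. Table index = prev * 4 + curr, state = (A<<1)|B.

TRANSITIONS = [ 0, +1, -1,  0,
               -1,  0,  0, +1,
               +1,  0,  0, -1,
                0, -1, +1,  0]

def count_ticks(states):
    """Accumulate signed ticks over a sequence of (A, B) channel samples."""
    count = 0
    prev = states[0][0] << 1 | states[0][1]
    for a, b in states[1:]:
        curr = a << 1 | b
        count += TRANSITIONS[prev * 4 + curr]
        prev = curr
    return count
```

One full forward gray-code cycle (00 → 01 → 11 → 10 → 00) yields +4 ticks; the reverse cycle yields -4.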

Tiffany’s Summary:

SLAM and robot base can now be operated remotely using one ssh window from my laptop.

Status Report (4/20)

Group Update

The SLAM software, encoder code, and local planner are fully implemented and now need to be tested together in time for the demo. The maze parts, along with an alpha version of the robot with all parts assembled, were constructed.

Amukta’s Update

I helped Kanupriyaa finish implementing ICP and mapping. I also constructed the maze parts needed for testing and the demo.

Kanupriyaa’s Update

The ICP was completed and tested with different maze features. It works but has large error accumulation, which prevents an accurate map from being displayed. I will work on optimizing the ICP this week.

Tiffany’s Update

  • The ordered parts arrived, including the aluminum channels and cardboard pieces for the maze, the 5V-to-12V step-up adapter, DC barrel jack adapter, portable phone charger (our power source), and caster wheels.
  • I updated measurements in the CAD file and lasercut cardboard prototypes of the RPLidar mount, Raspberry Pi mount, and caster wheel mounts. The latest prototype has balance issues, so I will need to make the robot chassis longer. The cardboard caster wheel mounts also bend under stress, but I don't anticipate this being an issue once I cut the final version out of compressed wood. I also laid out all of the components on the robot to figure out a compact way to mount everything in the final wireless version.
  • I also bandsawed the 8 ft long pieces of aluminum channels down into 6 inch pieces, prototyped a few maze pieces and then assembled several more maze pieces with Amukta.
  • On the electrical side, I tested the 5V-to-12V adapter to make sure it outputs 12V. I ran into two problems adding the adapter to the system. First, the adapter's current limit is barely enough to power both motors at 60% speed, so I had to limit the maximum motor power. This does not impact overall performance, since 100% motor speed (350 rpm) was much faster than our project requirements called for. Second, I ordered a male DC barrel jack adapter instead of a female one, so I had to search campus for a female adapter. In the meantime, the robot is still powered by the 12V wall supply.
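The motor power limit above follows from simple budget arithmetic: cap the duty cycle so that the combined draw of both motors stays under the adapter's output limit. A sketch with illustrative numbers (not our measured adapter limit or motor draw):

```python
# Duty-cycle cap sketch: find the largest duty cycle at which two motors
# stay within the step-up adapter's current budget, assuming motor current
# scales roughly linearly with duty cycle. All numbers are illustrative
# placeholders, not measured values from our robot.

ADAPTER_LIMIT_A = 1.0      # adapter's 12 V output current limit (hypothetical)
MOTOR_FULL_SPEED_A = 0.8   # one motor's draw at 100% duty cycle (hypothetical)
NUM_MOTORS = 2

def max_duty_cycle():
    """Largest duty cycle (0-1) keeping total draw under the adapter limit."""
    return min(1.0, ADAPTER_LIMIT_A / (NUM_MOTORS * MOTOR_FULL_SPEED_A))
```

With these placeholder numbers the cap lands around 62%, the same ballpark as the ~60% limit described above.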

Status Report (4/13)

Group Update

We’re behind on ICP implementation and integrating that with the encoder data. Our focus in the next week is constructing the entire robot and doing the above.

Amukta’s Update

This week I worked on creating an RPLidar node that detected hallways (to be used in local planner). I also met with Kanupriyaa to work on ICP.

Kanupriyaa’s Update

I completed the ICP code this week and met with Amukta to connect with the LIDAR and debug the code.

Tiffany’s Update

This week I researched and ordered the parts necessary to build the maze, plus some additional parts for the robot (caster wheels, power supply, etc.). I also updated the CAD of the robot body's upper layer, which the RPLidar mounts to, so that the full robot, including the new caster wheels, can be assembled on Monday and used to measure the encoder's error rate. I am in the process of debugging the transform frames output from the odometry data, since these tfs are currently incorrect.

Status Report (4/6)

Group Update

This week we had midpoint demos, so the focus was mainly on integrating Python scripts into ROS and testing ROS topics. On the SLAM side, we were able to display static maps in Rviz. There is still some localization error in the SLAM code, so moving the RPLidar causes the entire map to shift across the screen instead of new parts of the map being added in the correct positions relative to the known areas. On the motor control side, the onscreen joystick control, odometry, and motor control were integrated, so we will be able to mount the RPLidar on the base and tele-op the robot around for testing. For our midpoint demo, we showed these two parts of the system separately, since we have not yet integrated them.

Amukta’s Update

I continued to work on ICP with Kanupriyaa.

Kanupriyaa’s Update

I was unwell this week and fell somewhat behind schedule. I was supposed to complete the testing for ICP and finish the mapping portion of the ICP. I am still working on both of these and believe I will need all of next week to do so. I am hence one week behind schedule.

Tiffany’s Update

I wrapped up my motor control pipeline and migration to ROS, and tested the motor control ROS nodes (encoder input node, motor controller output node, motor speed PID / differential drive control node) on the robot and using Rviz. I also added an onscreen joystick control node from an existing ROS package so that I could demo and test the motor control to simulate motor commands that will come from the SLAM and path planning portions of the ROS network.

Status Report (3/30)

Group Update

  • This week, we’re preparing for the midpoint demo.
  • We’re slightly behind on integration, so we will be demoing the motor controllers and the ICP portions separately.

Amukta’s Update

  • Worked with Kanupriyaa on the ICP code.
  • Next week, I will visualize and prepare the code for demo, and write the mapping code.

Kanupriyaa’s Update

  • Finished writing the ICP code and integrated it with Amukta’s code.
  • Next week I will build the mapping node and integrate it with the UI.

Tiffany’s Update

  • Finished testing motor control and migrated into a ROS package.
  • Added subscriber to motor control for demonstrating motor control next Wednesday in class.
  • Set up SSH and git on the Raspberry Pi (this took several hours, as I kept running into configuration issues since we weren't using the default Raspbian image).
  • Discussed with the team how to implement global planning and the data transmitted between SLAM and path planning. Figured out the global planner algorithm in detail; I will start implementing this weekend. I'm now two weeks behind on the global planner, but with the algorithm settled, implementation should not take too long.

Status Report (3/23)

Group Update

  • Overall, we are on track to have a moving robot and basic navigation by the first demo in about two weeks.

Amukta’s Update

  • I spent the week experimenting more with the RPLidar SDK and output data, and beginning to work on ICP. I am currently working on closest point approximation using point to plane scan matching.
  • Over the past two weeks, I finished up the visualization web client and server, which should clear up 5-10 days I set aside in April according to the Gantt chart. See the code here.
  • I am on-track, as I’ve cleared up some tasks ahead of time.

Kanupriyaa’s Update

  • I spent the week setting up the development environment for the ICP: the VMware software and ROS.
  • I worked out the algorithm for ICP and divided the development work between Amukta and me. The algorithm minimizes the ICP error function using point-to-plane matching.
  • Next week I will implement the minimization function and obtain the final ICP translation.
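Point-to-plane matching minimizes the projection of each residual onto the destination point's normal. A minimal 2D sketch of one linearized step, assuming small rotation angles and already-computed correspondences (the function name and setup are illustrative, not our actual implementation):

```python
import numpy as np

def point_to_plane_step(src, dst, normals):
    """One linearized 2D point-to-plane ICP step.

    src, dst: (N, 2) arrays of matched source/destination points.
    normals:  (N, 2) unit normals at the destination points.
    Returns (theta, tx, ty) minimizing sum(((R(theta) p + t - q) . n)^2)
    under the small-angle approximation R(theta) p ~ p + theta * [-py, px].
    """
    A = np.empty((len(src), 3))
    # Partial derivative of the residual w.r.t. theta, tx, ty:
    A[:, 0] = -src[:, 1] * normals[:, 0] + src[:, 0] * normals[:, 1]
    A[:, 1] = normals[:, 0]
    A[:, 2] = normals[:, 1]
    b = -np.sum((src - dst) * normals, axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x  # theta, tx, ty
```

In a full ICP loop this step is repeated: re-match correspondences, solve, apply the transform, until the error stops decreasing.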

Tiffany’s Update

  • I spent a few days this week finding components, designing a CAD file of the robot using SolidWorks, and assembling a prototype that can be used for testing. The final robot will have an additional layer with the same shape for mounting the RPLidar, but I did not lasercut this piece yet since Amukta was working with the RPLidar this week.
  • Additionally, I wrote a ROS package to send motor commands to the motors. I also added distance/angle calculations for getting the robot from point A to point B, and P(ID) control for the base and differential motor commands (left motor = base + differential; right motor = base − differential).
  • I am one week behind on writing the global path planner, due to time spent in previous weeks researching path planning and writing the detailed design report.
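The base/differential mixing described above can be sketched as follows, with only a proportional term on heading error and illustrative gain and limit values (not our tuned parameters):

```python
# Sketch of base +/- differential motor mixing with P control on heading
# error. KP_HEADING and MAX_CMD are illustrative, not tuned values.

KP_HEADING = 2.0   # proportional gain on heading error (hypothetical)
MAX_CMD = 100.0    # duty-cycle percent limit

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def mix_motor_commands(base, heading_error):
    """left = base + differential; right = base - differential."""
    differential = KP_HEADING * heading_error
    left = clamp(base + differential, -MAX_CMD, MAX_CMD)
    right = clamp(base - differential, -MAX_CMD, MAX_CMD)
    return left, right
```

A positive heading error speeds up the left motor and slows the right, steering the robot back toward its target heading.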

 

Status Report (3/09)

Group Update

  • This week is short due to mid semester break and spring break (the Gantt chart accounts for this).

Amukta’s Update

  • I spent the week experimenting more with the RPLidar SDK and output data. This week was shorter than usual due to mid-semester break and spring break.
  • I will be working on the visualization UI over break, since I won’t need access to hardware to do so.
  • I am still on-track, since we did not schedule any work to be done over spring break.

Kanupriyaa’s Update

As I left on Wednesday evening (the middle of the week) for spring break, I accounted for the time I wouldn't have this week and distributed it across other weeks.

Tiffany’s Update

  • This week I tested the motor’s stall current to make sure it’s within the Raspberry Pi motor hat limits (the stall current was not specified by the vendor), and tested the Adafruit motor control library. The motor is rated for 12V but is quite fast even at 5V, so I decided to just power the motor with the same 5V power source as the Raspberry Pi. Since this motor seems to work for our project, I ordered a second one.
  • Since I lost the micro SD card we were originally using for the Raspberry Pi last week, the motor control code could not be tested last week. We ordered a new SD card, which arrived this week, and I set up the Ubuntu + ROS environment and tested the motor controls.

Status Report (3/02)

Group Update

Slightly behind schedule on motor/encoder IO development on the Raspberry Pi due to a missing micro SD card, and slightly behind on the implementation of SLAM due to work on the design presentation and report.

 

Amukta’s Update

  • This week, I mostly worked on the design presentation and design report.
  • I set up the RPLidar and tested its input. I also set up an Ubuntu VM and ROS repo for the entire team to use: https://github.com/amnayak/slambot-ros
  • I continued to work on the visualization web app.
  • Next week, I will be working on implementing loop closure.

Kanupriyaa’s Update

ICP Algorithm Building

After we changed the entire system architecture and decided to use odometry to build the data, I stopped work on the data probability for localization and continued work on building the new SLAM algorithm. I researched and read 26 papers on SLAM, ranging from ICP and graph-based SLAM to the different components that go into SLAM, in order to stitch together a design of our own. We have decided to use ICP within a graph-based SLAM and later optimize the graph for better accuracy.

Tiffany’s Update

Maze Design & General Tasks

I brainstormed ways to design the maze to make it modular and decided on building the maze in 1x1x1 ft sections that can be reconfigured to form different mazes.

I also updated the Gantt chart to reflect changes to our implementation and added milestones.

Motor/Encoder IO ROS Nodes

The motor and encoder IO ROS nodes were supposed to have been completed this week. I have written ROS nodes for reading from the encoders and writing to the motors, but cannot test them since I lost the micro SD card for the Raspberry Pi. So I am slightly behind on this task, but the nodes can be tested once the new SD card arrives on Monday and is set up.

Next Week

  • Test motor and encoder IO nodes
  • Start work on path planning (coordinate with Kanupriyaa on ROS topics)
  • Build maze segments

Status Report (2/23)

Group Update

Overall, the team is on schedule.

Amukta’s Update

  • I didn't get the RPLidar until the end of the week, so I started setting up the visualization web app (Python/Django) in the meantime. I also looked into the RPLidar SDK and the RPi, and set them both up. RPLidar/ROS/Raspberry Pi has been done together many times, so I've been using those tutorials.
  • We decided to use ROS, so I looked into how ROS nodes work.
  • Next week, I’m going to begin testing the RPLidar with the RPi, and working towards processing the lidar’s points and creating data structures that are usable by the other ROS nodes/modules. I will also begin building the maze.
  • I’m still figuring out how the RPLidar points are going to be used by the other modules, but overall, I’m on schedule.

Kanupriyaa’s Update

ROS Learning:

I completed a ROS course to learn about the basic structure we will need to set up, and spent time building a software architecture diagram that shows the interactions between all the different interfaces for the design review.

Localization:

The localization EKF algorithm couldn't be completed due to the addition of ROS. I have also decided to switch to a nearest-neighbour ICP-EKF implementation, which will be a bit harder to implement and will require all of next week.

Next Week: 

Next week I will complete the implementation of ICP-EKF inside a ROS node and test to make sure the node works.

Tiffany’s Update

ROS

This week, I focused on transitioning our design to the ROS environment, after realizing that ROS provides easier interfacing with the RPLidar as well as existing autonomous navigation architectures we can use as guidelines. I researched ROS implementations of SLAM and drew a schematic of existing ROS nodes that can be used for a full SLAM navigation pipeline. Kanupriyaa and I then finalized the architecture for our customized implementation. I also set up a GitHub repo and ROS workspace so the team can work on code concurrently and port it to the Raspberry Pi easily.

Electrical

I researched methods for powering the 12V motors and the 5V components (everything else). One option is to use two separate power sources. I decided on a portable phone charger, which outputs 5V, is highly compatible with the Raspberry Pi, and can easily be replaced or recharged. To power the two 12V motors, we can use a 5V-to-12V step-up converter that minimizes power loss. Additionally, I finalized the electrical schematic (without the camera).

Next Week

Due to a delay in ordering parts last week, I was unable to start assembling the robot this week. I first need to test the motors and encoders with an Arduino script (since Amukta is working with the Raspberry Pi), design the chassis CAD, and assemble the robot base. We will only have one motor and wheel for now, since we decided to test them before buying more. I will also write a preliminary ROS motor driver node and a reader for encoder data.