Sean’s Status Update for 04/25 (Week 11)

Progress

This week, I focused mainly on producing presentation/demo material. I recorded a couple of videos of the robot performing path-finding and driving to the desired location. The project is more or less complete, and I am cleaning up the code I wrote so that it is more readable to anyone seeing it for the first time.

Deliverables next week

Next week, I plan to record more video for the final presentation. In addition, I will work on the final report.

Schedule

On schedule.

Sean’s Status Update for 04/18 (Week 10)

Progress

Path-finding

The path-finding algorithm, along with the driving algorithm, is essentially complete. The robot can drive to the center of the cell containing the goal (x, y) position. More testing will be necessary to catch edge cases, but the robot can now drive to the user, or to any point, given its xy-coordinates.
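Driving to "the center of the cell the goal is in" can be sketched as a small coordinate conversion. This is an illustrative sketch, not the project's actual code; the 5 cm cell size is carried over from the Week 8 mapping update, and the function name is an assumption.

```python
# Sketch: map a goal (x, y) in metres to the centre of its grid cell.
# CELL = 0.05 (5 cm) is an assumption taken from the Week 8 update.
CELL = 0.05  # cell size in metres

def cell_center(x, y, cell=CELL):
    """Return the centre of the cell containing world point (x, y)."""
    col = int(x // cell)
    row = int(y // cell)
    return ((col + 0.5) * cell, (row + 0.5) * cell)
```

With 5 cm cells, any goal inside a cell maps to the same drive target, which keeps the driving logic independent of small errors in the requested coordinate.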

Deliverables next week

Next week, I plan to finish integrating the object/gesture detection work done by Jerry and Rama and test the combined functionality. Hopefully, we will also have time to work on the video presentation and the report.

Schedule

On schedule.

Sean’s Status Update for 04/11 (Week 9)

Progress

Midpoint-Demo

This week, I presented the work I’ve done with the robot so far. The presentation included a video of the Roomba mapping the room and snapshots of the generated maps. I spent some time recording the video and data so that it is clear how the robot maps the room. Overall, I think the presentation went well.

Path-finding

I also spent some time developing the path-finding algorithm for the robot. It helps that the map is already a discretized grid representation of the space; that way, we can fairly easily model the Roomba’s motion with 8-point connectivity. I plan to implement the A* (probably weighted A*) algorithm to traverse the grid. However, this introduces another concern: the odometry is particularly susceptible to error when the Roomba turns. If we use 8-point connectivity, we need to measure the turning angles carefully. Hopefully the error doesn’t accumulate to a significant value, but we will have to see.
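The plan above, weighted A* over an 8-connected grid, can be sketched as follows. This is a generic sketch under assumptions, not the project's implementation: `grid[r][c]` is `True` when a cell is known-safe, and `EPS > 1` is the heuristic inflation factor for weighted A*.

```python
import heapq
import math
from itertools import count

EPS = 1.5  # heuristic weight for weighted A* (assumed value)
NEIGHBORS = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
             if (dr, dc) != (0, 0)]  # 8-point connectivity

def astar(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None."""
    def h(cell):
        return math.hypot(cell[0] - goal[0], cell[1] - goal[1])

    tie = count()  # tie-breaker so the heap never compares cells/parents
    open_set = [(EPS * h(start), next(tie), 0.0, start, None)]
    came_from = {}
    best_g = {start: 0.0}
    while open_set:
        _, _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:        # already expanded with a better cost
            continue
        came_from[cur] = parent
        if cur == goal:             # walk parents back to recover the path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in NEIGHBORS:
            r, c = cur[0] + dr, cur[1] + dc
            nxt = (r, c)
            if not (0 <= r < len(grid) and 0 <= c < len(grid[0])):
                continue
            if not grid[r][c] or nxt in came_from:
                continue
            step = math.hypot(dr, dc)  # 1 for straight, sqrt(2) for diagonal
            if g + step < best_g.get(nxt, float("inf")):
                best_g[nxt] = g + step
                heapq.heappush(open_set,
                               (g + step + EPS * h(nxt), next(tie),
                                g + step, nxt, cur))
    return None
```

The diagonal step cost of sqrt(2) is what makes the turning-angle concern concrete: diagonal moves require accurate 45-degree rotations, so odometry error on turns directly distorts the executed path.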

Deliverables next week

Next week, I plan to complete the implementation of path-finding and the subsequent driving.

Schedule

On schedule.

Sean’s Status Update for 04/04 (Week 8)

Progress

2D Mapping

This week, I finished developing the 2D mapping algorithm. It is a combination of edge-following and lawn-mowing, which together cover every position the robot can reach in the room. The robot records its XY-coordinates throughout the phase, and at the end of the exploration a map is generated from that record. The algorithm itself isn’t overly complicated, but many real-world issues (unstable serial connection, inaccurate encoder values, varying light/floor conditions, etc.) made it difficult to make the algorithm robust. As I mentioned before, doing edge-following in both directions is necessary to map every corner of the room. Similarly, it turns out that running the lawn-mowing pattern in both the horizontal and vertical directions was also necessary. The full mapping run takes about 5 minutes for a small room.

It was also necessary to carefully iterate through the resulting list of coordinates to generate a useful map. The map is essentially a discretized grid representation of the room. Currently, the cell size of the map is set to 5 cm, i.e., one cell in the map represents a 5 cm x 5 cm square area of the room. If the robot traveled through a cell multiple times, it is safe to conclude the cell is traversable, especially with a small enough cell size. In addition, if an unexplored cell is within range (the radius of the robot) of two or more already-explored safe cells, it is reasonable to conclude that the unexplored cell is safe as well.
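The "traveled through multiple times" rule can be sketched as a visit count per cell. This is an illustrative sketch with assumed names, not the project's code; in particular, `MIN_VISITS = 2` is my assumption for what "multiple times" means.

```python
from collections import Counter

# Sketch: turn the recorded (x, y) trace into a grid where cells visited
# at least MIN_VISITS times are marked safe.  CELL and MIN_VISITS are
# assumptions, not values from the project.
CELL = 0.05       # 5 cm cells, as described above
MIN_VISITS = 2    # assumed threshold for "multiple times"

def build_grid(trace, cell=CELL, min_visits=MIN_VISITS):
    """Map each visited (col, row) cell to True once it has enough visits."""
    visits = Counter((int(x // cell), int(y // cell)) for x, y in trace)
    return {c: n >= min_visits for c, n in visits.items()}
```

A second pass over this dictionary could then promote unexplored cells that neighbor two or more safe cells, per the rule described above.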

Driving to Home

I also implemented a simple version of the drive-to-home function. The robot records the “home” location at the beginning of the mapping. This location is different from the actual charging station: to use the Roomba’s “dock” functionality, the robot has to be a certain distance straight out from the charging station; otherwise, the IR sensor will fail to locate the station and the robot will start traveling in the wrong direction. Thus, the robot initially backs straight up for a fixed period of time and records that position as “home,” so it can attempt to dock from there afterwards. Currently, the robot simply turns toward the “home” location and drives straight until it believes it is within a tolerable error range of the goal. Obviously, this doesn’t take into account the map of the room or any obstacles on the path, so I will improve this functionality in the coming week.
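The "turn toward home, then drive straight" step amounts to computing a signed heading error from the robot's pose. This is a sketch under assumptions (radian convention, pose representation, function name), not the project's code.

```python
import math

# Sketch: how far the robot must rotate to face home, given its pose
# (x, y, theta).  Names and the radian convention are assumptions.
def heading_error(x, y, theta, home):
    """Signed angle in (-pi, pi] that the robot should turn to face home."""
    desired = math.atan2(home[1] - y, home[0] - x)
    err = desired - theta
    while err <= -math.pi:   # wrap into (-pi, pi] so the robot takes
        err += 2 * math.pi   # the shorter of the two possible turns
    while err > math.pi:
        err -= 2 * math.pi
    return err
```

The sign tells the robot which way to rotate in place; once the error is near zero, it can drive straight until within the tolerance of the goal.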

Deliverables next week

Next week, I will improve the driving-to-home functionality and implement a path-finding algorithm that can be used in the driving-to-user functionality.

Schedule

On schedule.

Sean’s Status Update for 03/28 (Week 7)

Progress

I finally got the Roomba as well as the webcam and connecting cables delivered to me.

Env-Setup & Data collection

I set up the environment in my basement, with the webcam covering a corner of the room. I tested basic maneuvers (driving, rotating, etc.) and realized the odometry is a bit off. I suspect the carpet floor is the cause. It is still within the requirement (1 ft of error after a 10 m run), but I might need to do some fine-tuning to improve the accuracy. I was also able to collect some video data for my teammates to use.

Mapping

The edge-following part of the mapping is working fine (after some fine-tuning tweaks for the new environment). Initially, I planned to use a multi-goal A* algorithm for mapping. However, it turned out to be unnecessarily complicated for a limited indoor environment, so I am pivoting to a lawn-mowing pattern, which works well in a confined space with few obstacles. The mapping should be done early next week.

Deliverables next week

I will complete the 2-D mapping with some limited implementation of path-finding.

Schedule

On schedule.

Sean’s Status Update for 03/21 (Week 5, 6)

Progress

This week, well, it was fairly unusual. I’m in Chicago with my parents, still trying to adjust to the new norm. That includes working on the robot without the actual robot: the Roomba will arrive here in approximately 1–2 weeks, so in the meantime I am working in the Webots simulator to develop basic versions of the algorithms. This is purely for development purposes; the simulation will not appear in the demo or the final presentation.

Webots

Luckily, Webots, an open-source simulator, comes with a nice model of the iRobot Create. It lacks some sensors, such as the light and IR sensors, but it is good enough for algorithm development. I was able to port part of my previous 2D-mapping code into the simulator, and I plan to keep working on it for a while.

Deliverables next week

Next week, I will continue working in Webots to implement the 2D mapping algorithm and possibly start on path-planning.

Schedule

A bit behind; schedule updated.

Sean’s Status Update for 03/07 (Week 4)

Progress

This week, I worked on implementing the mapping algorithm. Initially, I thought it would be an easy task, given that 2D mapping simply means scanning the room autonomously. However, it turns out to require much more fine-tuning than that. There are two main phases to my mapping algorithm: edge-following and filling in the rest.

Edge-Following

First, the robot performs “edge-following” to get the dimensions of the room. The robot moves forward until it detects a wall. Then it rotates right (or left) in place until the wall is no longer visible. Once this is done, the robot moves forward in an arc, steering to the left (or right). The reason for moving in an arc is to lead the robot back toward the wall; if it moved straight instead, it could drift away from the wall. Edge-following is “done” when the robot returns to the position where it detected the first wall. This task is done twice, once steering left and once steering right; performing edge-following in both directions is necessary to scan areas potentially missed in one of the runs. Meanwhile, the robot’s XY-coordinates are recorded to generate the map.
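One iteration of the loop described above can be sketched as a two-branch controller. This is purely illustrative: `wall_detected`, `rotate_in_place`, and `drive` are hypothetical names standing in for the project's actual Roomba wrappers, and the velocity/radius values are made up.

```python
# Sketch of one edge-following iteration.  The robot interface and the
# numeric values are assumptions, not the project's real code.
def edge_follow_step(robot, direction=+1):
    """One control step: +1 keeps the wall on one side, -1 on the other."""
    if robot.wall_detected():
        # rotate in place away from the wall until it is no longer seen
        robot.rotate_in_place(direction)
    else:
        # arc gently back toward the wall so the robot never drifts away
        robot.drive(velocity=100, radius=direction * 500)
```

Running the same loop with `direction=+1` and then `direction=-1` gives the two mirrored passes described above.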

Filling in the Rest

Once the boundary of the room is set, the robot must travel the rest of the room to find any potential obstacles, such as furniture. This is trickier. First, it is hard to define an end condition that guarantees enough samples to generate a map. In addition, it is difficult to define behavior that works universally regardless of the shape of the room.

It turns out that iRobot actually programmed the Roomba to move “randomly” (move straight ahead, rotate a random amount when it bumps into something, and repeat) to maximize the area covered while cleaning. This works well when the main function of the robot is cleaning: there is essentially no time limit, and it doesn’t matter much if some areas are missed. However, for generating a 2D map of the room, this causes problems. First, moving for a long time, especially with many rotations, introduces more and more odometry error, and to use the map for path-planning we need it to be as accurate and detailed as possible. Second, the random algorithm doesn’t guarantee completeness within finite time; the robot might not cover enough area within a reasonable period.

Thus, I decided to implement a more formalized behavior: the robot first moves back and forth parallel to the first wall it detects, which lets it cover the room efficiently. Then it does the same thing perpendicular to the first wall, to avoid being trapped in one area of the room. More testing will be necessary to validate this algorithm.
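The back-and-forth sweep can be sketched as a waypoint generator over a rectangle. This is a simplified sketch, not the project's code: it assumes an axis-aligned rectangular room, and the stripe spacing (in practice, roughly the robot's diameter) is a parameter rather than a project value. The perpendicular pass would reuse the same function with the axes swapped.

```python
# Sketch: lawn-mowing waypoints covering a width x height rectangle in
# stripes parallel to the x-axis.  Room shape and spacing are assumptions.
def lawnmower(width, height, spacing):
    """Return (x, y) waypoints sweeping alternating left/right stripes."""
    pts, y, left_to_right = [], 0.0, True
    while y <= height:
        row = [(0.0, y), (width, y)]
        pts.extend(row if left_to_right else row[::-1])
        left_to_right = not left_to_right  # alternate sweep direction
        y += spacing
    return pts
```

Because each stripe ends where the next begins, the robot only makes 90-degree turns at the edges, which limits how much rotation-induced odometry error the sweep accumulates compared to random motion.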

Deliverables next week

When I get back from the break, I plan to complete the 2D mapping algorithm.

Schedule

On Schedule.

Sean’s Status Update for 02/29 (Week 3)

Progress

This week, I worked on building a mapping algorithm and improving the Roomba control protocols. For the mapping algorithm, I initially intended to use the Roomba’s internal “clean” function to scan the room. However, it turns out that I cannot directly control the robot or get sensor readings once it is in clean mode, so it was necessary to build our own mapping algorithm. The first method I plan to test is to make the robot 1) follow the wall, and 2) move back and forth in the area in between. This seems to be the method iRobot uses as well, and it would run autonomously. During this process, it became clear that some kind of threading would be necessary: since the data packets are streamed constantly, a thread dedicated to processing the odometry was needed. I was able to control the Roomba using threading from my local computer, but I haven’t tested it on the RPi yet.

Deliverables next week

Next week, I will focus on 1) completing the mapping algorithm and 2) making sure the control protocol is robust.

Schedule

On Schedule.

Sean’s Status Update for 02/22 (Week 2)

Progress

I had a chance to actually work and play around with the Roomba this week. We purchased a Roomba 671, which can be controlled over serial communication with the help of iRobot’s Open Interface.

Understanding Roomba Open Interface:

After inspecting the Open Interface documentation, I figured out the basic serial communication protocol the Roomba expects. There are three modes the Roomba can be in: passive, safe, and full. Each mode places different limits on how much control the user has. For instance, the robot is in passive mode by default; in this mode, we cannot interrupt the movement of the robot (i.e., take control of the motor system). Once we switch to safe or full mode, we can drive the robot directly.
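The mode switches above are single-byte commands in the Open Interface: Start (opcode 128) enters passive mode, and Safe (131) or Full (132) unlock motor control. A minimal sketch, assuming `port` is an already-opened serial connection (e.g., a pyserial `Serial` object):

```python
# Sketch: Open Interface mode-switch opcodes.  `port` stands in for an
# opened serial connection; only the write() call is assumed here.
START, SAFE, FULL = 128, 131, 132   # OI opcodes (Start, Safe, Full)

def set_mode(port, opcode):
    """Send a one-byte mode command (Start -> passive, then Safe/Full)."""
    port.write(bytes([opcode]))
```

A typical session sends Start first (the OI requires passing through passive mode) and then Safe, after which drive commands are accepted.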

Motor control:

To drive the motors, we send the Drive opcode followed by four data bytes: two bytes for the velocity and two bytes for the turn radius. It is interesting that iRobot designed their drive method this way instead of the more conventional left/right wheel-velocity control.
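Per the Open Interface spec, the Drive command is opcode 137 followed by velocity (mm/s) and radius (mm), each encoded as a 16-bit big-endian value; the special radius 0x8000 means drive straight. A sketch of the packing (the function name is mine, not the project's):

```python
import struct

# Sketch: encode a Drive command as 1 opcode byte + 4 data bytes.
# Velocity and radius are 16-bit big-endian; masking with 0xFFFF handles
# negative (two's-complement) values and the 0x8000 "straight" radius.
DRIVE = 137  # OI Drive opcode

def drive_packet(velocity, radius):
    """Pack a Drive command for the Roomba Open Interface."""
    return bytes([DRIVE]) + struct.pack(">HH",
                                        velocity & 0xFFFF,
                                        radius & 0xFFFF)
```

For example, 200 mm/s on a 500 mm left arc is one 5-byte packet, which is the compactness the velocity-plus-radius design buys over separate wheel speeds.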

Issue:

Currently, the sensor readings are unstable. The first issue I encountered was that the readings become unreliable when the mode changes. For instance, the robot returns reasonable encoder values in safe mode; however, once it switches to another mode (e.g., the “Dock” command automatically switches the mode to passive), the encoder value jumps to an unexpected value, making the data unusable. I suspected there is some internal mechanism that resets the encoder/sensor readings when the mode changes, so I worked around it by requesting a constant stream of data packets from the Roomba. Luckily, this got rid of the first issue. However, there seems to be some data corruption during streaming: sometimes the Roomba returns an unreadable packet with an incorrect header and checksum byte. I first tried simply ignoring the corrupted data, but corrupted packets make up a considerable portion of the returned data. I will look into this problem further in the following week.
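Detecting those corrupted packets follows from the stream format in the Open Interface spec: each stream packet starts with header byte 19 and a length byte, and the sum of every byte from the header through the trailing checksum must be 0 mod 256. A validation sketch (function name assumed):

```python
# Sketch: validate an Open Interface sensor-stream packet.  A valid
# packet is [19, n, <n data bytes>, checksum], where the sum of all
# bytes including the checksum is 0 mod 256.
STREAM_HEADER = 19

def packet_ok(packet):
    """True if packet has the stream header, length, and zero checksum."""
    if len(packet) < 3 or packet[0] != STREAM_HEADER:
        return False
    if packet[1] != len(packet) - 3:   # length excludes header/len/checksum
        return False
    return sum(packet) % 256 == 0
```

Dropping packets that fail this check is the usual first line of defense; if too many fail, the underlying serial timing or wiring is the next thing to inspect.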

Deliverables next week

We were able to accomplish some of the robot’s basic functionality. Now, we must integrate the whole system and test it. My goals for next week are to:

1. Finish tele-op control
2. RPi headless control of Roomba
3. Integrate Gesture recognition with Roomba control

Schedule

On Schedule.