Weekly Status Report – Week of 11/10

Arushi – This week I didn’t get a chance to work on Cart-i B as much, which is why I put in more time last week. I worked with Shreyas to change the PID to be based on the radius of the target rather than its area, to keep the robot from jerking as much when we step close to it and it has to move backwards. This really smoothed out its forward and backward motion. Additionally, we worked on further refining the aisle-turn code. This weekend, I plan on going in with Shreyas to further test and refine the obstacle detection and path planning.
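To give a feel for the change, here is a minimal sketch of radius-based following, assuming Python on the Pi and that the radius comes from our circle tracking; the target radius, gains, and speed clamp below are placeholders rather than our tuned values. The radius grows roughly linearly as the person gets closer, while the area grows quadratically, which is why the radius-based error behaves more gently up close.

```python
TARGET_RADIUS_PX = 60      # placeholder: radius we want the tracked circle to appear at
KP, KD = 2.0, 0.5          # placeholder gains, not our tuned values

prev_error = 0.0

def forward_speed(measured_radius_px):
    """Signed drive speed: positive = move forward, negative = back up."""
    global prev_error
    error = TARGET_RADIUS_PX - measured_radius_px   # too far away -> positive error -> drive forward
    speed = KP * error + KD * (error - prev_error)  # simplified P/D form of the controller
    prev_error = error
    return max(-200.0, min(200.0, speed))           # clamp to a safe wheel-speed range
```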

Pallavi – This week I mainly worked with Shreyas to fine-tune the obstacle detection now that it is integrated with the entire Carti-B structure. One major thing we changed was the distance threshold we use to fire an interrupt and indicate an obstacle. We realized our threshold was too large, so the robot was actually detecting the human it was tracking as an obstacle and not moving. Another change to our design was what information we actually send from the Arduino to the Raspberry Pi. Before, we were sending the actual distances for all four of our sensors. We realized, however, that our path planning algorithm does not need the actual distances, just which sensors have an obstacle violating the threshold. We achieve this by creating a 4-bit value initialized to 0000. When any sensor violates the obstacle threshold, we check each sensor, and if it has an obstacle in the violation range, we set its bit to 1. So, if we had a violation in sensors 1 and 3, we would send 1010. This removes extraneous information that is ultimately not used in our final code.
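To make the format concrete, here is a small Python sketch of the encoding (the real version lives in our Arduino code; the 30 cm threshold and the left-to-right sensor ordering are just placeholders):

```python
THRESHOLD_CM = 30            # placeholder obstacle threshold, not our actual value

def encode_mask(distances_cm):
    """Pack four sensor readings into a 4-bit value: a sensor's bit is 1 if it sees an obstacle."""
    mask = 0
    for i, d in enumerate(distances_cm):   # sensor 1 maps to the leftmost bit in this sketch
        if d < THRESHOLD_CM:
            mask |= 1 << (3 - i)
    return mask

# Sensors 1 and 3 violating the threshold -> "1010"
print(format(encode_mask([12, 80, 25, 90]), '04b'))
```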

Next week, I will be out of town for Thanksgiving, so I will not be able to work on Carti-B. Happy Thanksgiving!!

Shreyas – This last week I worked with Arushi on re-configuring the PID to use the radius instead of the area, which made moving backwards a lot smoother. I also worked with Pallavi to change the format in which data is sent from the Arduino to the Pi. I got the interrupt handler working so that it properly sets a global flag indicating whether or not an obstacle exists. Based on that flag, the loop logic has the robot either move normally, stop, or execute a re-route path to get around the obstacle. After my basic testing on Thursday it works pretty well, but more testing needs to be done to tune how much the robot turns to avoid the obstacle and how far it moves ahead before looking for the human again. We also decided how we’re going to secure the basket to the robot and what kind of basket to get, and we decided to buy a cheap medium-sized black jacket and just tape the green and red circles onto it. This weekend Arushi and I will work on finalizing the obstacle detection protocol.
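Roughly, the loop logic driven by that flag looks like the sketch below (Python, with hypothetical stand-ins for the real robot commands; the decision between stopping and re-routing is exactly what still needs tuning):

```python
# Hypothetical stand-ins for the real robot commands, just so the sketch runs.
def follow_human():            print("following")
def stop_robot():              print("stopping")
def reroute_around_obstacle(): print("re-routing")
def should_stop():             return False          # placeholder decision, still being tuned

obstacle_flag = False   # set True by the Pi's interrupt handler when the Arduino signals an obstacle

def control_step():
    """One pass of the main loop: follow normally, stop, or re-route around the obstacle."""
    global obstacle_flag
    if not obstacle_flag:
        follow_human()                 # normal circle-following behaviour
    elif should_stop():                # e.g. obstacle dead ahead and very close
        stop_robot()
    else:
        reroute_around_obstacle()      # turn away from the blocked side, move ahead, reacquire the human
        obstacle_flag = False          # clear the flag once the maneuver finishes
```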

Weekly Status Report – Week of 11/3

Arushi – This week I worked with Shreyas and Pallavi to solidify our plans for moving forward, which was important for us to do, especially for the midpoint demo. I made a doc that covers what we have done so far and documents our plans, both logistically and in terms of what our algorithms will do in the upcoming weeks (the doc is included below). We finalized how we are accomplishing turns: using a red target on the arms and acting accordingly (explained in more detail in the doc). So I spent this week modifying the algorithm. Shreyas and I incorporated the robot controls so that as soon as the robot sees the red circle it stops, to handle a human pausing to pick something up from an aisle. I’ve also begun writing the code for when the human turns out of the aisle. I thought that might take longer to test on the robot because the plan is to finish obstacle detection first, but I knew I could at least test the image processing code on my laptop and then only have to modify the robot commands. We also began implementing and testing the robot following the human’s turn out of the aisle; for now, we found the relationship between the radius of the circle and the distance the robot has to move forward. I also went in this week to work with Shreyas to build a platform for the sensors, so that Pallavi can start testing the sensors and the path planning code. In general, I had more time this week, so I was able to go in, work on the turns, and help build the platform, but I know next week will be busier for me, so I’ll focus on refining the performance and calibration.
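A rough sketch of the two pieces, assuming Python with OpenCV on the Pi; the HSV bounds and the linear radius-to-distance fit below are placeholders, not our calibrated numbers:

```python
import cv2
import numpy as np

# Placeholder HSV range for the red target; our calibrated bounds differ (and red
# actually wraps around the hue axis, which the real code has to handle).
RED_LO, RED_HI = np.array([0, 120, 80]), np.array([10, 255, 255])

def red_circle_radius(frame_bgr):
    """Return the pixel radius of the largest red blob, or None if no red target is visible."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, RED_LO, RED_HI)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
    if not contours:
        return None          # no red target: keep following the green circle
    (_, _), radius = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    return radius

def forward_distance_for_turn(radius_px):
    """Illustrative linear fit from circle radius to metres to drive forward before turning."""
    return max(0.0, 1.2 - 0.01 * radius_px)
```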

Shreyas – This week Carti-B got a major upgrade in performance and looks. Our first goal was getting ready for the midpoint demo by making the doc below and having enough code working. After the midpoint demo, Arushi and I used the makerspace to build the platform that the sensors will sit on. Then I made Carti-B more aesthetically pleasing and portable by neatly arranging and securing all the individual systems, which keeps wires out of the way of the wheels and stops the Raspberry Pi from shifting on the base. To secure the USB cable on the iRoomba, I followed Sullivan’s advice and found spring grippers that apply tension on the cable to keep it connected. This week I also got the interrupt handling working between the Arduino and the Pi: the Arduino sends a signal to a Raspberry Pi pin when an obstacle is within the threshold, and the Pi uses an interrupt service routine to quickly set a flag in the Robot Control Module indicating that an obstacle was detected. This is far more efficient than continuously polling the Arduino, because the Pi does not waste CPU cycles reading serial data until the ISR is called. I also set up the serial interface so that the Arduino sends the four distance values and the Pi parses the line and extracts the values into an array for later use. The next piece of work is handling the different cases (which sensor was activated and what path the robot should take); my goal is to finish this by next Wednesday. Lastly, Arushi and I discovered that because of how I tried to organize the camera cable, the camera feed is sometimes disturbed. To fix this I put in an order request for a shorter cable.
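A condensed sketch of both pieces on the Pi side, assuming RPi.GPIO and pyserial; the pin number, port, baud rate, and comma-separated line format are placeholders:

```python
import serial                # pyserial
import RPi.GPIO as GPIO

OBSTACLE_PIN = 17            # placeholder BCM pin wired to the Arduino's interrupt line
obstacle_flag = False

def obstacle_isr(channel):
    """Called when the Arduino pulls the pin high; just flag it and let the control loop react."""
    global obstacle_flag
    obstacle_flag = True

GPIO.setmode(GPIO.BCM)
GPIO.setup(OBSTACLE_PIN, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
GPIO.add_event_detect(OBSTACLE_PIN, GPIO.RISING, callback=obstacle_isr)

# Serial link to the Arduino, which prints one line per reading, e.g. "34,120,57,88".
ser = serial.Serial('/dev/ttyACM0', 9600, timeout=1)

def read_distances():
    """Parse one line of four comma-separated distances into a list of ints (None on timeout)."""
    line = ser.readline().decode(errors='ignore').strip()
    return [int(x) for x in line.split(',')] if line else None
```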

Pallavi – This week I mainly focused on setting up the ultrasonic sensors on the Carti-B setup we already have. At the beginning of the week we noticed inconsistency in the readings from the sensors: they would only report a distance when the obstacle was moving. I found two reasons for this. First, we were using ourselves to test the readings, and human beings are apparently inconsistent obstacles because their clothing often absorbs the ping coming out of the sensor. Second, the code I was using with the NewPing library was taken from example code on their website; I read on some forums that this example code gave inconsistent results and that there was different starter code to use instead. Once I switched to that code, we were able to get consistent results.

After I was able to get consistent results, I set up the sensors on Carti-B thanks to the platform Arushi and Shreyas created. The new setup is shown below:

The sensors are held up with tape, so we are going to purchase mounting kits. We also want to purchase a case for the Arduino Uno. Next week, we want to test the integration of the sensor data with our path planning and hopefully have that finished by Wednesday.

 

Midpoint Demo – Team 5 (Cart-i B)

Weekly Status Report – Week of 10/27

Arushi – I spent this week focusing on setting up the Raspberry Pi with its battery so that it can run without being plugged into the computer, and on trying to smooth out the PID. From trying many different things, we identified that if the robot is only programmed to follow forwards and backwards, the control is very smooth. However, once we add the ability to turn, the robot becomes more jerky, especially when I move towards the robot and it should back up. I think this is because we always prioritize pivoting over moving straight: if the difference between the center of the frame and the center of the circle is larger than a constant threshold, the robot focuses on turning. But when I take a step towards the robot, I am so close that even a small shake can make the robot think I am leaving the center of the frame. Thus, we came up with the solution that the turn threshold should depend on the target’s distance from the robot, which we estimate from the circle’s size. We tried implementing this but still ran into similar issues, so we will gather more data points to identify what this threshold should be and hope to make progress on it by Monday’s checkpoint.

Additionally, we realized that we always prioritize turning over moving forward, which is not good in certain cases. So we changed our code such that, if both are needed, it prioritizes whichever correction has the greater value (for example, turning) and then performs the other one (moving forward). One consideration was to have the robot do both at once in a “sweeping” motion, but that is not possible on a Roomba. Additionally, Shreyas and I started implementing a feature such that if a human moves too quickly and leaves the frame on the right side, the robot will spin up to 360 degrees to the right to find the human, and vice versa.
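A Python sketch of both ideas from the two paragraphs above: scaling the turn threshold with how close the target appears, and doing whichever needed correction is larger first. The constants, the normalization by each threshold, and the pivot/drive/spin helpers are our own illustrative choices, not the final implementation:

```python
# Hypothetical stand-ins for the real Roomba commands, just so the sketch runs.
def pivot(err): print("pivot", err)
def drive(err): print("drive", err)
def spin(direction, max_angle_deg): print("spin", direction, max_angle_deg)

BASE_TURN_THRESHOLD_PX = 40   # placeholder deadband at the reference distance
REF_RADIUS_PX = 60            # placeholder circle radius at that reference distance

def turn_threshold(radius_px):
    """Widen the pivot deadband as the target gets closer (bigger radius), so small shakes
    up close don't trigger turns. Linear scaling is a guess; we're still collecting data."""
    return BASE_TURN_THRESHOLD_PX * (radius_px / REF_RADIUS_PX)

def control_step(turn_error, dist_error, radius_px, dist_threshold=10):
    """Do whichever needed correction is (relatively) larger first, then the other."""
    t_thresh = turn_threshold(radius_px)
    need_turn = abs(turn_error) > t_thresh
    need_drive = abs(dist_error) > dist_threshold
    if need_turn and need_drive:
        if abs(turn_error) / t_thresh >= abs(dist_error) / dist_threshold:
            pivot(turn_error); drive(dist_error)
        else:
            drive(dist_error); pivot(turn_error)
    elif need_turn:
        pivot(turn_error)
    elif need_drive:
        drive(dist_error)

def search_for_human(exit_side):
    """If the circle left the frame, spin up to a full rotation toward the side it exited."""
    spin(direction=exit_side, max_angle_deg=360)
```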

Pallavi – I spent part of this week continuing to work with the ultrasonic sensors, trying to see whether there is any interference between the sensors given their orientation on the board. I spent the rest of the week helping Arushi and Shreyas remove the jerky movement in preparation for the demo next week. Arushi covered most of the changes we tried in order to reduce the jerkiness in her part of the post; one thing we both realized while working in the lab on Friday is that if we display the live stream video and run our robot code at the same time, the robot lags too much, so we need to disable the video display when we want to test the smoothness of the iRoomba. We are also working on pseudo-code for our path planning and on our final schedule for the remaining weeks of the project.
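For the record, the fix is just gating the preview behind a flag so the following code runs at full speed during driving tests. This is a sketch; the helper functions are hypothetical stand-ins for our existing camera and robot pieces:

```python
import cv2

SHOW_VIDEO = False    # leave off while testing smoothness; imshow/waitKey adds enough lag to jerk the robot

# Hypothetical stand-ins for our existing pieces, just so the sketch runs.
def grab_frame():        return None
def process_frame(f):    return "stop"
def send_command(cmd):   pass

def run():
    while True:
        frame = grab_frame()
        send_command(process_frame(frame))
        if SHOW_VIDEO:                       # only pay for the preview when we actually want to watch it
            cv2.imshow('Carti-B view', frame)
            cv2.waitKey(1)
```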

This coming week, after our mid-point demonstration, I want to focus on adding more to our iRoomba platform to stabilize the camera and also integrate the platform upon which we will place the sensors, as our next goal is to integrate object detection into our design.

Shreyas – In the earlier half of this week I set up SSH on the Raspberry Pi so that we can finally control it remotely. This also adds the luxury of having more than one of us use the Raspberry Pi at once for testing and development. For the latter half of the week, Arushi and I focused on making the movement smoother by changing our algorithms slightly and adding features (as detailed in her part above). During our testing, on one run the iRoomba went wild and disobeyed any commands we gave it. We realized a quick fix is to change the iRoomba to Safe mode; before, we had it in Full control mode, which prevented it from stabilizing itself and shutting off. Now, in Safe mode, it automatically stops if it loses stability or if we press the buttons on it. Next week we should also try to make the USB cable more secure on the iRoomba, because if it disconnects during a run we could again lose control. This week I also tried adding object detection using the iRoomba’s sensors. I have the code ready and it works independently of the movement code, but when I combine the two the robot’s motion becomes very unstable. This is something I will work on more next week, since we thought it would be better to focus on the general movement control for the demo.
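For context on the mode change: the Roomba Open Interface switches modes with single-byte opcodes after Start (128), where 131 is Safe and 132 is Full, and Safe keeps the cliff, wheel-drop, and charger safety checks active. A sketch of the startup sequence we now use (the port and baud rate are placeholders that depend on the model and cable):

```python
import serial
import time

roomba = serial.Serial('/dev/ttyUSB0', 115200)   # placeholder port/baud for the USB-serial cable
roomba.write(bytes([128]))   # Start: opens the Open Interface (Passive mode)
time.sleep(0.1)
roomba.write(bytes([131]))   # Safe mode: accepts drive commands but keeps the safety checks on
# bytes([132]) would be Full mode, which turns those safety reactions off.
```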

Here’s a video of our current progress with the Carti-B robot: