Weekly Status Report – Week of 11/3

Arushi – This week I worked with Shreyas and Pallavi to solidify our plans moving forward, which was especially important for the midpoint demo. I made a doc that summarizes what we have done so far and lays out our plans for the upcoming weeks, both logistically and in terms of what our algorithms will do (the doc is included below). We finalized how we are handling turns: placing a red target on the aisle arms and acting on it when it is detected (explained in more detail in the doc). I spent this week modifying the algorithm accordingly. I got the robot to stop once the red circle is seen, which covers the case where a person stops to pick something up from an aisle, and I began writing the code for when the human turns out of the aisle. I expected that testing this on the robot would take longer because the plan is to finish obstacle detection first, but I knew I could at least test the image processing code on my laptop and then only have to modify the robot commands. Shreyas and I incorporated the robot controls so that as soon as the robot sees a red circle it stops, and we also began implementing and testing the robot following the human's turn out of the aisle. For now, we found the relationship between the radius of the circle and the distance the robot has to move forward. I also went in this week to work with Shreyas on building a platform for the sensors, so that Pallavi can start testing the sensors and the path planning code. I had more time this week, which is why I was able to work on the turns and help build the platform; next week will be busier for me, so I'll focus on refining performance and calibration.
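The radius-to-distance relationship mentioned above can be sketched with a simple pinhole-camera model. This is only an illustrative sketch: the target size, focal length, and function name below are placeholder assumptions, not our calibrated values.

```python
# Sketch: map the detected red circle's pixel radius to an approximate
# forward distance using a pinhole-camera model (distance is inversely
# proportional to apparent size). Constants are illustrative placeholders.

CIRCLE_RADIUS_CM = 5.0    # assumed real-world radius of the red target
FOCAL_LENGTH_PX = 600.0   # assumed camera focal length, in pixels

def distance_from_radius(radius_px: float) -> float:
    """Estimate the distance (cm) to the target from its pixel radius."""
    if radius_px <= 0:
        raise ValueError("radius must be positive")
    return FOCAL_LENGTH_PX * CIRCLE_RADIUS_CM / radius_px

# The closer the target, the larger it appears in the frame:
near = distance_from_radius(100.0)  # 30.0 cm
far = distance_from_radius(25.0)    # 120.0 cm
```

In practice the constants would come from calibrating the camera against known distances, which is part of the calibration work planned for next week.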

Shreyas – This week Carti-B got a major upgrade in performance and looks. Our first goal was getting ready for the midpoint demo by making the doc below and having enough code working. After the demo, Arushi and I used the makerspace to build the platform that the sensors will sit on. Then I made Carti-B more aesthetically pleasing and portable by neatly arranging and securing all the individual systems, which keeps wires out of the way of the wheels and stops the Raspberry Pi from shifting on the base. To secure the USB cable on the Roomba, I followed Sullivan's advice and found spring grippers that apply tension on the cable to keep it connected. This week I also got the interrupt handling working between the Arduino and the Pi: the Arduino drives a Raspberry Pi pin when an obstacle is within the threshold, and the Pi uses an interrupt service routine (ISR) to quickly set a flag in the Robot Control Module indicating that an obstacle was detected. This is far more efficient than continuously polling the Arduino input, because the Pi no longer wastes CPU cycles reading the serial data until the ISR is called. I also set up the serial connection interface so that the Arduino sends the 4 distance values and the Pi parses the line and extracts the 4 values into an array for later use. The next step is handling the different cases (which sensor was triggered and what path the robot should take); my goal is to finish this by next Wednesday. Lastly, Arushi and I discovered that the way I organized the camera cable sometimes disturbs the camera feed, so I put in an order request for a shorter cable.
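The serial-parsing step above can be sketched in a few lines of Python. The comma-separated line format and the function name are assumptions for illustration; the real Arduino output format may differ.

```python
# Sketch: parse one serial line from the Arduino into the four
# distance readings the Pi stores for later use. The comma-separated
# line format is an assumption, not the exact protocol we use.

def parse_distances(line: str, expected: int = 4) -> list[int]:
    """Parse a line like '34,120,87,56' into a list of distances (cm)."""
    parts = line.strip().split(",")
    if len(parts) != expected:
        raise ValueError(f"expected {expected} values, got {len(parts)}")
    return [int(p) for p in parts]

readings = parse_distances("34,120,87,56\n")  # [34, 120, 87, 56]
```

Rejecting malformed lines here matters because a partial line can arrive if the Pi starts reading mid-transmission; dropping it and waiting for the next full line is simpler than trying to resynchronize.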

Pallavi – This week I mainly focused on setting up the ultrasonic sensors on the Carti-B setup we already have. Early in the week we noticed the readings from the sensors were inconsistent: the sensors would only report a distance when the obstacle was moving. I traced the issue to two causes. First, we were using ourselves as test obstacles, and human beings are apparently inconsistent obstacles because their clothing often absorbs the ping coming out of the sensor. Second, the code I was using with the NewPing library was taken from example code on their website; I read on some forums that this example code gave inconsistent results, and they pointed to different starter code. Once I switched to that example, we were able to get consistent results.
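One common way to stabilize noisy ultrasonic readings (similar in spirit to NewPing's `ping_median()` helper) is to take a burst of pings and keep the median, discarding echoes that never returned. This is a hedged sketch in Python for illustration; our actual filtering runs on the Arduino, and the sample values below are made up.

```python
# Sketch: median-filter a burst of ultrasonic pings to get one stable
# reading. NewPing reports 0 when no echo is received, so zeros are
# treated as dropouts and ignored. Sample values are illustrative.
import statistics

def filtered_distance(samples: list[int]) -> float:
    """Return the median of a burst of ping readings (cm), ignoring dropouts."""
    valid = [s for s in samples if s > 0]
    if not valid:
        return 0.0  # no echo at all in this burst
    return statistics.median(valid)

# One outlier (350) and one dropout (0) don't disturb the result:
reading = filtered_distance([98, 0, 102, 100, 97, 350])  # 100
```

The median is preferable to the mean here because a single spurious echo off a distant wall would drag an average far off, while the median ignores it entirely.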

After I was able to get consistent results, I set up the sensors on Carti-B, thanks to the platform Arushi and Shreyas created. The new setup is shown below:

The sensors are held up with tape, so we are going to purchase mounting kits. We also want to purchase a case for the Arduino Uno. Next week, we want to test the integration of the sensor data with our path planning and hopefully have that finished by Wednesday.

 

Midpoint Demo – Team 5 (Cart-i B)
