Weekly Status Report – Week of 10/27

Arushi – I spent this week setting up the Raspberry Pi with its battery so that it can run without being plugged into the computer, and trying to smooth out the PID control. After trying many different things, we identified that if the robot is only programmed to follow forwards and backwards, the control is very smooth. However, once we add the ability to turn, the robot becomes much jerkier, especially when I move towards the robot and it should back up. I think this is because we always prioritize pivoting over moving straight: if the difference between the center of the frame and the center of the detected circle is larger than a constant threshold, the robot focuses on turning. When I take a step towards the robot, though, I am so close that even a small shake can make the robot think I am actually leaving the center of the frame. Our solution was to make the turn threshold depend on the target's distance from the camera. We tried implementing this but still ran into similar issues, so we will gather more data points to identify what this threshold should be and hope to make progress by Monday's checkpoint.
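A minimal sketch of the distance-dependent dead zone described above, using the detected circle's area as a stand-in for closeness (the function names, the linear form, and the constants are our illustrative assumptions, not the exact values in our code):

```python
def turn_threshold(target_area, base=20.0, gain=0.002):
    """Pixel-offset threshold that widens as the target gets closer.

    target_area: area (in pixels^2) of the circle detected around the
    person. A closer person yields a larger area, and a small real-world
    shake then produces a big pixel offset, so the dead zone must widen.
    base and gain are hypothetical tuning constants.
    """
    return base + gain * target_area

def should_pivot(frame_center_x, target_center_x, target_area):
    # Pivot only when the horizontal offset exceeds the adaptive threshold.
    offset = abs(frame_center_x - target_center_x)
    return offset > turn_threshold(target_area)
```

With a fixed threshold, a close-up target (large area) trips the pivot on every small shake; here the same 20-pixel wobble is ignored when the area is large but still triggers a turn when the person is far away.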

Additionally, we realized that we always prioritize turning over moving forward, which is not ideal in certain cases. We changed our code so that, if both corrections are needed, it performs whichever has the greater value first (such as turning) and then works on the other (moving forward). One consideration was to have the robot do both at once in a "sweeping" motion, but that is not possible on a Roomba. Additionally, Shreyas and I started implementing a feature so that if a human moves too quickly and leaves the frame on the right side, the robot will spin 360° to the right to find them again, and vice versa for the left side.
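The magnitude-based arbitration can be sketched as a small planner that orders the two corrections largest-first instead of always pivoting (function and action names here are hypothetical, not our actual code):

```python
def plan_step(turn_error, forward_error):
    """Order the pivot and drive corrections by magnitude.

    turn_error: signed pixel offset of the target from frame center.
    forward_error: signed distance error along the camera axis.
    Returns a list of (action, error) pairs, largest correction first,
    executed sequentially since the robot cannot sweep-turn while driving.
    """
    actions = []
    if turn_error:
        actions.append(("pivot", turn_error))
    if forward_error:
        actions.append(("drive", forward_error))
    # Biggest error gets handled first, replacing the old fixed
    # "always pivot before driving" rule.
    actions.sort(key=lambda a: abs(a[1]), reverse=True)
    return actions
```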

Pallavi – I spent a bit of this week continuing to work with the ultrasonic sensors, trying to see if there was any interference between them given their orientation on the board. I spent the rest of the week helping Arushi and Shreyas remove the jerky movement in preparation for the demo next week. Arushi described most of the changes we tried in her part of the post; one thing we both realized while working in the lab on Friday is that if we display the live-stream video and run the robot code at the same time, the robot lags too much, so we need to disable the display when we want to test the smoothness of our iRoomba. We are also working on pseudo-code for our path planning and on our final schedule for the remaining weeks of the project.

This coming week, after our mid-point demonstration, I want to focus on adding more to our iRoomba platform to stabilize the camera and also integrate the platform upon which we will place the sensors, as our next goal is to integrate object detection into our design.

Shreyas – In the earlier half of this week I set up SSH on the Raspberry Pi so that we could finally control it remotely. This also gives us the luxury of having more than one of us use the Pi at once for testing and development. In the latter half of the week, Arushi and I focused on making the movement smoother by tweaking our algorithms and adding features (as detailed in her part above). During one test run, the iRoomba went wild and ignored every command we gave it. A quick fix was to switch the iRoomba to safe mode; before, we had it in full mode, which prevented it from stabilizing itself and shutting off. In safe mode, it auto-stops if it loses stability or if we press its buttons. Next week we should also make the USB cable more secure on the iRoomba, because if it disconnects during a run we could again lose control. This week I also tried adding object detection using the iRoomba's sensors. The code works independently of the movement code, but when I combine the two, the robot's motion becomes very unstable. I will work on this more next week, since we decided it was better to focus on general movement control for the demo.
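The safe/full distinction comes from the iRobot Open Interface, where each mode is entered by a one-byte opcode (128 Start, 131 Safe, 132 Full, per the OI spec). A minimal sketch of building and sending that command follows; the serial port path is a hypothetical placeholder:

```python
# iRobot Open Interface opcodes (from the Roomba/Create OI spec).
START, SAFE, FULL = 128, 131, 132

def mode_command(safe=True):
    """Build the byte sequence that puts the robot in safe or full mode.

    Start must be sent first to enter the Open Interface. Safe mode keeps
    the built-in cliff, wheel-drop, and charger safeguards active, so the
    robot auto-stops when they trip; full mode disables all of them,
    which is how ours was able to "go wild" and ignore us.
    """
    return bytes([START, SAFE if safe else FULL])

# Sending it over the USB serial link (hypothetical port name,
# requires pyserial):
# import serial
# with serial.Serial("/dev/ttyUSB0", 115200) as port:
#     port.write(mode_command(safe=True))
```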

Here’s a video of our current progress with the Carti-B robot:

 
