This week, I spent the majority of my time working with Meghana on constructing the robot as a whole, especially prototyping and finalizing the build of the intake mechanism. We fully attached the intake mechanism to the front of the robot, along with the intake ramp beneath it. Once construction was finished, we tested the ramp and intake mechanism mounted on the full iRobot system to gauge the feasibility of the current arrangement. We found that with a motor spinning fast enough (we tested with a power drill), it was possible to intake three relatively pristine water bottles without caps. The bottles do have to be placed in front of the spinning intake at a certain orientation, roughly parallel to the intake mechanism, but this is still a major breakthrough for our robot.

 

On the software side, I cleaned up my object tracking code so that it determines whether a tracked object is to the right or the left of the frame's center point, and I retested the LiDAR camera's distance readings on the tracked object. I found that at distances near and slightly above 2 meters, the readings occasionally oscillate between 0.0 and a valid value of roughly 2 meters. I will update the code to tolerate these invalid 0.0 values, since we will never be exactly 0.0 meters away from the object.

Next, I worked with Mae to merge this object tracking code into the model inference pipeline: the inference step now passes bounding-box coordinates to the tracker so that it follows the detected bottle. Testing this integration, we got live readings indicating whether the tracked bottle was to the right or left of the center of the frame, how many pixels away it was in the camera view, and when it was centered in the frame.

We then built on this for the angle calculation code, using the left/right readings to decide how far and in which direction to rotate the iRobot through UART commands. In testing, the iRobot turned left or right based on the pixel offset until the angle calculation code indicated that the tracked bottle was in the center of the frame. This is a success.
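As a rough sketch of the planned tolerance logic (the names here are illustrative, not our actual code), the idea is simply to drop 0.0 readings and fall back to the most recent valid value:

```python
# Sketch of the planned tolerance against invalid 0.0 depth readings;
# function and variable names are illustrative, not our actual code.
last_valid_depth = None

def filter_depth(raw_depth_m):
    """Return a usable depth, ignoring the camera's invalid 0.0 readings."""
    global last_valid_depth
    if raw_depth_m > 0.0:        # 0.0 means "no reading," never a real distance
        last_valid_depth = raw_depth_m
        return raw_depth_m
    return last_valid_depth      # fall back to the most recent valid value
```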
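The left/right determination itself reduces to comparing the bounding-box center against the frame center. A minimal sketch, assuming the inference step hands over pixel coordinates as (x_min, y_min, x_max, y_max) (an assumption about the interface, not our exact code):

```python
def horizontal_offset(bbox, frame_width):
    """Pixel offset of the bounding-box center from the frame center.

    bbox is assumed to be (x_min, y_min, x_max, y_max) in pixels.
    Negative offset = bottle left of center, positive = right.
    """
    bbox_center_x = (bbox[0] + bbox[2]) / 2.0
    return bbox_center_x - frame_width / 2.0
```

With a small tolerance band around zero, the same offset value also tells us when the bottle counts as centered in the frame.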
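The rotation step could then look roughly like the following, assuming a Create 2-style Open Interface spoken over pyserial; the port, baud rate, speeds, and tolerance below are placeholders rather than our exact setup, and the opcode and radius conventions should be verified against the robot we are using:

```python
import struct
import serial

CENTER_TOLERANCE_PX = 30   # placeholder: how close to center counts as "centered"
TURN_SPEED_MM_S = 50       # placeholder: slow in-place rotation speed

# Port and baud rate are placeholders for our actual UART connection.
ser = serial.Serial("/dev/ttyUSB0", 115200)

def drive(velocity_mm_s, radius_mm):
    """Create Open Interface Drive command (opcode 137).

    Per the Create 2 OI spec, radius 1 turns in place counter-clockwise
    and radius -1 turns in place clockwise; verify against our robot.
    """
    ser.write(struct.pack(">Bhh", 137, velocity_mm_s, radius_mm))

def center_on_bottle(offset_px):
    """Turn toward the bottle until the pixel offset is within tolerance."""
    if abs(offset_px) <= CENTER_TOLERANCE_PX:
        drive(0, 1)                    # velocity 0 stops the robot
    elif offset_px < 0:
        drive(TURN_SPEED_MM_S, 1)      # bottle left of center: turn left (CCW)
    else:
        drive(TURN_SPEED_MM_S, -1)     # bottle right of center: turn right (CW)
```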

 

I believe we are making significant progress on the project, but we are still behind our initial timeline. Next week, I will work on integrating multiple-object tracking into the pipeline so that we can also keep track of both bottles and obstacles as we navigate. I also plan to work with Meghana to see whether we can cut down the robot's weight, which is still a problem point in the construction.

