Esther Jang’s Status Report for 12/4

Over the past two weeks, I worked with the team to fine-tune the robot's overall process and reduce errors as much as possible. Most of the errors we ran into were related to drifting, small changes in battery charge, connection issues, or communication issues. Some of them are reliably debuggable (e.g., battery charge or connection issues), but some are random and hard to pinpoint. Overall, the robot performs fairly reliably, but we will continue to improve the process as much as possible for the final demo.

Ludi Cao’s Status Report for 12/04

Over the past few weeks, I mainly worked with my team to fine-tune the robot. As mentioned in the team status report, we improved the computer vision algorithm by fine-tuning the parameters of the edge detection code. We also implemented the return to the basket. In addition, I worked on running tests for the final presentation. My team and I are currently on schedule to deliver the final project. I will continue working on fine-tuning the parameters, as well as on the final report and video.

Bhumika Kapur’s Status Report for 12/4

This week I continued to work with my team to improve the entire process of our robot, tweaking some values to make it more consistent and improving the navigation process. We primarily focused on the robot's navigation back to the basket after it retrieves the object, and on depositing the item in the basket. The team's progress is described in this week's team status report. I also worked on the final presentation, which Ludi presented earlier this week.

Ludi Cao’s Status Report for 11/20

This week our team worked together to integrate the subcomponents of the robot. First, we experimented with the MPU 6050 gyroscope but later realized that an IMU would give better orientation estimates. We also built the testing shelves for the robot. We then integrated moving to the basket, orienting with the shelf, driving up to the shelf, recognizing a laser-pointed object, and retrieving the object. There are still some issues with the computer vision laser-pointing technique, since noise is still being picked up, but overall the project is promising. Hopefully, by Thanksgiving break, the laser pointer recognition can be improved and a preliminary demo can be given.

Esther Jang’s Status Report for 11/20

This week, I worked with the team on testing as described in the team's weekly status report. We are incrementally programming the various steps of our robot's navigation process, which has required a lot of hands-on testing and problem-solving discussions. With Thanksgiving break approaching, we hope to finish our target implementation soon, which seems feasible given our current progress.

Team Status Report for 11/20

This week our team worked on the navigation and retrieval process of the robot. After speaking with Tamal and Tao, we reconsidered our robot's sequence of steps and worked on the code required for the robot's navigation and retrieval. This week, we accomplished the following:

  1. We wrote code for our robot to rotate 360 degrees in search of the April tag on the basket. Once the robot detects the tag on the basket, it rotates so that it is perpendicular to the basket and drives up to the basket, stopping about 1 foot in front of it.
  2. Next, we wrote code for our robot to rotate and look for the April tag on the shelf. Once the tag on the shelf is detected, the robot rotates so that it is facing perpendicular to the shelf.
  3. Then, the robot moves around the basket and drives up to the shelf, stopping about 1 foot from the shelf.
  4. Then the linear slides extend until they reach the maximum height, about three feet.
  5. Once the slides are extended, the robot searches for the laser pointer.

We accomplished the above steps this week, and a video can be seen below:

Currently, the robot is sometimes able to center itself on the laser-pointed item and drive towards it, but this step does not work all of the time. Thus, next week we plan on improving the item detection step and working on grabbing the object from the shelf.
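
For reference, below is a rough Python sketch of the sequence described above. The helper functions, tag IDs, and rotation increment are simplified placeholders rather than our actual code, and the real implementation also maneuvers around the basket before heading to the shelf.

```python
# Rough sketch of the retrieval sequence above. The motion and CV helpers
# are placeholders standing in for our real chassis, slide, and detection code.

BASKET_TAG_ID, SHELF_TAG_ID = 0, 1      # hypothetical tag IDs
STOP_DISTANCE_M = 0.3                   # roughly one foot

def detect_april_tag(tag_id):  return None                # placeholder CV call
def rotate_in_place(degrees):  print("rotate", degrees, "deg")
def drive_forward(meters):     print("drive", meters, "m")
def extend_slides_to_max():    print("extend slides to ~3 ft")
def find_laser_pointed_item(): print("scan shelf for laser"); return None

def find_tag_by_rotating(tag_id):
    """Rotate in increments (up to a full 360 degrees) until the tag is seen."""
    for _ in range(24):                 # 24 x 15 degrees = one full turn
        tag = detect_april_tag(tag_id)
        if tag is not None:
            return tag
        rotate_in_place(15)
    return None

def approach_tag(tag_id):
    """Square up to the tag and stop about one foot in front of it."""
    tag = find_tag_by_rotating(tag_id)
    if tag is None:
        return False
    rotate_in_place(-tag.yaw_deg)       # turn so the robot faces the tag
    drive_forward(tag.depth_m - STOP_DISTANCE_M)
    return True

def retrieval_sequence():
    approach_tag(BASKET_TAG_ID)         # step 1: find and approach the basket
    # (the real code also maneuvers around the basket before heading to the shelf)
    approach_tag(SHELF_TAG_ID)          # steps 2-3: find and approach the shelf
    extend_slides_to_max()              # step 4: raise the linear slides
    return find_laser_pointed_item()    # step 5: look for the laser-pointed item
```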


Bhumika Kapur’s Status Report for 11/20

This week I worked with my teammates on the navigation and retrieval process of the robot. We all worked together on those tasks, and our progress is detailed in the team status report. I also worked on improving the CV component of the project, as there are some errors that occasionally occur in different lighting conditions, but I am hoping those will be resolved with more testing soon. Next week I will continue to work with my team on the robot and our final presentation.

Esther Jang’s Status Report 11/13

Since my subsystems were completed and have been performing consistently since last week, I did not make any changes to the linear slide or claw subsystems. All of my work for the week was done with the rest of the team to help implement navigation and integration, as described in the team's weekly status report.

Team Status Report for 11/13

This past week, the team primarily worked collaboratively to start making the robot autonomous. We also had demos during class time where we were able to show the subsystems and receive feedback.

Originally, the code that runs the chassis, slide, and claw subsystems was separate. When we tried combining them, we realized (after experiencing a strange bug where some motors stopped moving) that the Arduino Servo library disables 2 PWM pins on the board. This meant that we could not run our entire system across 2 boards, since we needed all 12 PWM pins for our 6 motors. We concluded that we either needed a servo shield to connect the servo to the Xavier (the Xavier only has GPIO pins, so the servo cannot be connected directly) or a larger Arduino board. We were also running into slight communication delay issues for our chassis, with one motor being on a separate Arduino from the others. Hence, we ended up replacing our 2 Arduino Unos with one Arduino Mega 2560 board. Since the Mega board has 54 digital pins and 15 PWM pins, we were able to run all our subsystems on the single board and also eliminate the communication issues across the 2 boards.

For our navigation code, we first focused on getting the robot to navigate to an April tag autonomously. Currently, we rely on rotating and moving the robot by powering the chassis motors for a given amount of time. In our tests, we were able to consistently map 1 second of running the motors to 0.25 m of movement. However, the translational movement is prone to some drift and acceleration over larger distances. Hence, we plan to keep our movements mostly incremental and purchase an IMU to help correct the drift.

Video of 1 second movement to 0.25m translation

Similarly, we found that we could fairly consistently rotate the robot by a given angle by making the motor power-on time proportional to the desired rotation.
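
As a rough illustration of this time-based approach, the sketch below maps a requested distance or angle to a motor run time. The 0.25 m-per-second figure comes from our tests; the rotation constant and the motor helper functions are placeholders, not our actual values or code.

```python
import time

# Measured in our tests: running the drive motors for 1 second gives ~0.25 m.
METERS_PER_SECOND = 0.25
# Rotation was calibrated the same way; this constant is only a placeholder.
DEGREES_PER_SECOND = 90.0

def set_drive_motors(forward=True):
    """Placeholder for the command that starts the chassis motors."""
    print("motors on, forward =", forward)

def stop_motors():
    """Placeholder for the command that stops the chassis motors."""
    print("motors off")

def drive_forward(meters):
    """Translate roughly `meters` by running the motors for a proportional time."""
    set_drive_motors(forward=meters > 0)
    time.sleep(abs(meters) / METERS_PER_SECOND)
    stop_motors()

def rotate_in_place(degrees):
    """Rotate roughly `degrees` by running the motors for a proportional time."""
    set_drive_motors(forward=degrees > 0)   # real code spins the wheel pairs in opposite directions
    time.sleep(abs(degrees) / DEGREES_PER_SECOND)
    stop_motors()

drive_forward(0.25)   # should take about 1 second
rotate_in_place(90)   # about 1 second with the placeholder rotation constant
```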

We then took these concepts and were able to get our robot to navigate to an April tag that is in its field of view. The April tag provides the horizontal and depth distance from the camera center as well as the yaw angle of rotation. Using this information, we wrote an algorithm for our robot to first detect the April tag, rotate itself so that it squarely faces the tag, translate horizontally so that it sits in front of the tag, and translate depth-wise up to the tag. We still ran into a few drifting issues that we hope to resolve with an IMU, but the results generally performed well.
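
The sketch below is a simplified version of that approach logic. The TagPose fields (horizontal offset, depth, yaw) and the motion primitives are illustrative stand-ins for what our detector and chassis code actually provide.

```python
from collections import namedtuple

# Hypothetical detection result: horizontal offset, depth, and yaw of the tag
# relative to the camera, roughly what our April tag code reports.
TagPose = namedtuple("TagPose", ["x_m", "z_m", "yaw_deg"])

STOP_DISTANCE_M = 0.3   # stop roughly one foot short of the tag

# Placeholder motion primitives (timed versions are sketched above).
def rotate_in_place(degrees): print("rotate", degrees, "deg")
def strafe(meters):           print("strafe", meters, "m")
def drive_forward(meters):    print("drive", meters, "m")

def navigate_to_tag(tag):
    # 1. Rotate so the robot faces the tag's plane head-on.
    rotate_in_place(-tag.yaw_deg)
    # 2. Translate horizontally so the robot ends up directly in front of the tag.
    strafe(tag.x_m)
    # 3. Drive depth-wise up to the tag, stopping a fixed distance short.
    drive_forward(tag.z_m - STOP_DISTANCE_M)

navigate_to_tag(TagPose(x_m=0.1, z_m=1.2, yaw_deg=-15))
```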

Our plan is to have an April tag on both the shelf and the basket so that the robot can navigate both to and from the shelf this way.

We then focused on being able to scan a shelf for the laser-pointed object. To do this, the robot uses edge detection to get the bounding boxes of the objects in front of it, together with the laser point detection algorithm. It can then determine which object is being selected and center itself in front of it for grabbing.
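
A minimal sketch of this selection step is below. It assumes a red laser dot and uses a simple HSV threshold as a stand-in for our actual laser detector; the bounding boxes are assumed to come from the edge detection code, and the threshold values are illustrative.

```python
import cv2
import numpy as np

def find_laser_point(bgr):
    """Return the (x, y) of a bright red spot, or None if no laser is visible."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so threshold both ends (values are illustrative).
    mask = cv2.inRange(hsv, (0, 50, 220), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 50, 220), (180, 255, 255))
    if cv2.countNonZero(mask) == 0:
        return None
    ys, xs = np.nonzero(mask)
    return int(xs.mean()), int(ys.mean())

def select_laser_pointed_box(bgr, boxes):
    """Pick the bounding box (x, y, w, h) containing the laser point and return
    it along with its horizontal offset from the image center (for centering)."""
    point = find_laser_point(bgr)
    if point is None:
        return None
    px, py = point
    for (x, y, w, h) in boxes:
        if x <= px <= x + w and y <= py <= y + h:
            offset_px = (x + w / 2) - bgr.shape[1] / 2
            return (x, y, w, h), offset_px
    return None
```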

We tested this with a setup composed of 2 styrofoam boards found in the lab to replicate a shelf. We placed one board flat on 2 chairs and the other board vertically at a 90-degree angle in the back.

Video of centering to the laser-pointed box (difficult to see in the video, but the right-most item has a laser point on it):

Our next steps are to get the robot to actually grab the appropriate object and to combine our algorithms. We also plan on purchasing a few items that we believe will improve our current implementation, such as an IMU for drift-related issues and a battery connector adapter to account for the Xavier's unconventional battery jack port (we have been unable to run the Xavier on battery because of this issue). The camera is also currently just taped onto the claw since we are still writing our navigation implementation, but we will mount it in the most appropriate place once the implementation is complete. Finally, we plan to continue improving our implementation and to be fully ready for testing by the end of next week at the latest.

Bhumika Kapur’s Status Report 11/13

This week I worked on both the edge detection and April tag code.

First, I improved the April tag detection so that the algorithm is able to detect an April tag in the camera's stream and return the center and corner coordinates of the tag along with the pose matrix, which allows us to calculate the distance and angle to the tag. The results of this are shown below:
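
For reference, a minimal sketch of this kind of detection is below, written against the pupil_apriltags package (our code may differ). The camera intrinsics and tag size are placeholder values, and the yaw computation shown is one common approximation.

```python
import cv2
import numpy as np
from pupil_apriltags import Detector

# Placeholder camera intrinsics and tag size; real values come from calibration
# and the printed tag.
FX, FY, CX, CY = 600.0, 600.0, 320.0, 240.0
TAG_SIZE_M = 0.10

detector = Detector(families="tag36h11")

def detect_tag(bgr):
    """Return (center, corners, distance_m, yaw_deg) of the first tag found, or None."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    detections = detector.detect(
        gray,
        estimate_tag_pose=True,
        camera_params=(FX, FY, CX, CY),
        tag_size=TAG_SIZE_M,
    )
    if not detections:
        return None
    tag = detections[0]
    t = tag.pose_t.flatten()                      # tag translation in meters
    distance = float(np.linalg.norm(t))
    # One common approximation of the yaw (rotation about the camera's vertical axis).
    yaw = float(np.degrees(np.arctan2(tag.pose_R[0, 2], tag.pose_R[2, 2])))
    return tag.center, tag.corners, distance, yaw
```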

Second, I worked on improving the edge detection code to get a bounding box around the different boxes visible in the camera's stream. The bounding boxes also give us the exact location of each box, which we will later use to actually retrieve the object. The results of this are shown below:
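
A minimal sketch of this Canny-plus-contours approach is below; the blur kernel, Canny thresholds, and area cutoff are illustrative rather than our tuned parameters.

```python
import cv2

MIN_AREA_PX = 2000   # ignore tiny contours from noise; value is illustrative

def box_bounding_boxes(bgr):
    """Return bounding boxes (x, y, w, h) of box-like objects in the frame."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)
    # Close small gaps so each box produces one closed contour.
    edges = cv2.dilate(edges, None, iterations=2)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= MIN_AREA_PX]
```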

Finally, I worked with my team on the navigation of the robot. By combining our individual components, our robot can now travel to the exact location of the April tag that marks the shelf. The robot is also able to drive up to the exact location of the item with the laser point on it and center itself on the object. Over the next week, I plan to continue working with my team to finish the final steps of our implementation.