Esther Jang’s Status Report for 12/4

Over the past two weeks, I worked with the team to fine-tune the overall process of the robot and reduce errors as much as possible. Most of the errors we ran into were related to drift, slight changes in battery charge, or connection and communication issues. Some of them are reliably debuggable (e.g., battery-charge or connection issues), but others are random and hard to pinpoint. Overall, the robot performs fairly reliably, but we will continue to improve the process as much as possible before the final demo.

Ludi Cao’s Status Report for 12/04

Over the past few weeks I mainly worked with my team to fine-tune the robot. As mentioned in the team status report, we improved the computer vision algorithm by tuning the parameters of the edge detection code, and we also implemented the return to the basket. I additionally worked on running tests for the final presentation. Currently my team and I are on schedule to deliver the final project. I will continue fine-tuning parameters and working on the final report and video.

Team’s Status Report for 12/04

For the last two weeks our team worked together to integrate the various subsystems. Attached is a video demonstrating that our robot can autonomously complete the majority of the task, including navigating to the basket and shelf, identifying the object pointed at with the laser, and physically retrieving the item. Compared to the last status report, the robot behaves much more consistently, and most runs are nearly successful. One improvement was adding an IMU sensor so that the robot can re-orient itself after travelling some distance, to account for wheel drift. We also experimented with finer tuning of the edge detection code, restricting the camera to the specific region where the object is located, which greatly reduces background noise. We additionally implemented the last step of returning to the basket and dropping the item. Our team also ran some preliminary tests and worked on the slides for the final presentation this past week. During this process, the claw exhibited inconsistent closing behavior, but after tweaking the heights of the two sides, performance is much better. Since the claw inconsistencies mostly come from the claw wearing down over multiple runs, we also ordered a new claw in case the current one malfunctions before the final demo.
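The region-of-interest idea described above can be sketched in a few lines. This is a minimal illustration using NumPy slicing, with a crude gradient threshold standing in for the team's actual edge-detection pipeline; the function names, threshold, and ROI coordinates are all invented.

```python
import numpy as np

def crop_roi(frame, x, y, w, h):
    """Restrict processing to the region where the object sits."""
    return frame[y:y + h, x:x + w]

def edge_mask(gray, thresh=30):
    """Crude gradient-magnitude edge mask (stand-in for real edge detection)."""
    gx = np.abs(np.diff(gray.astype(int), axis=1, prepend=0))
    gy = np.abs(np.diff(gray.astype(int), axis=0, prepend=0))
    return (gx + gy) > thresh

# Clutter outside the ROI never reaches the edge detector.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[200:240, 300:340] = 255   # object inside the ROI
frame[0:10, 0:10] = 255         # background noise outside it
roi = crop_roi(frame, 280, 180, 100, 100)
```

Cropping before detecting is what suppresses background noise: anything outside the crop simply never enters the edge computation.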

 

We believe our team is on schedule, and the final delivery of the project is almost finished. Next week our group will mostly focus on the project report, the final video, and final parameter tweaks to achieve better consistency.

https://www.youtube.com/watch?v=hN603kKcYBs

Bhumika Kapur’s Status Report for 12/4

This week I continued to work with my team to improve the entire process of our robot, tweaking some values to make it more consistent and improving the navigation process. We primarily focused on the robot's navigation back to the basket after it retrieves the object, and on depositing the item in the basket. The team's progress is described in this week's team status report. I also worked on the final presentation, which Ludi presented earlier this week.

Ludi Cao’s Status Report for 11/20

This week our team worked together to integrate the subcomponents of the robot. First, we experimented with the MPU 6050 gyroscope, but later realized that an IMU would give a better orientation estimate. We also built the testing shelves for the robot. We then integrated the steps of moving to the basket, orienting with the shelf, driving up to the shelf, recognizing a laser-pointed object, and retrieving the object. There are still some issues with the computer vision laser-pointing technique, as noise is still being picked up, but overall the project is promising. Hopefully, by Thanksgiving break, the laser pointer recognition can be improved and a preliminary demo can be given.
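The orientation idea can be sketched as follows: integrate gyro z-axis rate samples into a yaw estimate, then compute the rotation needed to restore the intended heading. This is a hypothetical illustration rather than the actual sensor code; the sample rate and readings are invented.

```python
def integrate_yaw(rates_dps, dt):
    """Accumulate angular-rate samples (deg/s) into a heading change (deg)."""
    return sum(rate * dt for rate in rates_dps)

def correction(target_deg, yaw_deg):
    """Rotation the robot should command to restore its intended heading."""
    return target_deg - yaw_deg

# e.g. 100 samples at 10 deg/s, 0.01 s apart -> roughly 10 degrees of drift
drift = integrate_yaw([10.0] * 100, dt=0.01)
```

In practice raw gyro integration accumulates error over time, which is one reason a fused IMU estimate tends to predict orientation better than the gyroscope alone.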

Esther Jang’s Status Report for 11/20

This week, I worked with the team on testing as described in the team status report. We are incrementally programming the various steps of our robot's navigation process, which has required a lot of hands-on testing and problem-solving discussions. With Thanksgiving break approaching, we hope to finish our target implementation soon, which looks feasible given our current progress.

Team Status Report for 11/20

This week our team worked on the navigation and retrieval process of the robot. After speaking with Tamal and Tao, we reconsidered our robot's sequence of steps and worked on the code required for the robot's navigation and retrieval. This week, we accomplished the following:

  1. We wrote code for our robot to rotate 360 degrees in search of the April Tag on the basket. Once the robot detects the tag on the basket, it rotates to be perpendicular to the basket and drives up to it, stopping about 1 foot in front of the basket.
  2. Next, we wrote code for our robot to rotate and look for the April Tag on the shelf. Once the tag on the shelf is detected, the robot rotates so that it faces the shelf perpendicularly.
  3. Then, the robot moves around the basket and drives up to the shelf, stopping about 1 foot from the shelf.
  4. Then the linear slides extend until they reach the maximum height, about three feet.
  5. Once the slides are extended, the robot searches for the laser pointer.
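The search-and-approach behavior in steps 1–3 can be sketched as a simple loop. The `detect_tag`, `rotate`, and `drive_forward` callables below are stand-ins for the team's actual AprilTag detection and motor interfaces; the step sizes and stop distance are illustrative only.

```python
STOP_DISTANCE_FT = 1.0   # stop about one foot from the tag
STEP_DEG = 15            # rotate in small increments while searching

def search_and_approach(detect_tag, rotate, drive_forward):
    """Rotate up to 360 degrees looking for a tag, square up to it,
    then drive forward until roughly a foot away."""
    tag = None
    for _ in range(360 // STEP_DEG):
        tag = detect_tag()
        if tag is not None:
            break
        rotate(STEP_DEG)
    if tag is None:
        return False  # no tag found after a full turn

    rotate(-tag["yaw_deg"])  # turn to face the tag squarely
    while tag is not None and tag["distance_ft"] > STOP_DISTANCE_FT:
        drive_forward(0.2)   # short step, then re-check the tag
        tag = detect_tag()
    return True
```

Re-checking the tag after every short step keeps drift from compounding over the whole approach, at the cost of a slower drive.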

We accomplished the above steps this week, and a video can be seen below:

Currently the robot is sometimes able to center itself on the laser-tagged item and drive towards it, but this step does not work all of the time. Thus, next week we plan on improving the item detection step and working on grabbing the object from the shelf.
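The centering step might look something like the sketch below: find the bright laser pixels in the camera frame, take their mean column, and turn toward it. The threshold, turn gain, and single-channel input are invented for illustration and are not the team's actual values.

```python
import numpy as np

def laser_column(channel, thresh=200):
    """Mean column index of pixels bright enough to be the laser dot,
    or None if nothing crosses the threshold."""
    _, cols = np.nonzero(channel > thresh)
    return None if cols.size == 0 else float(cols.mean())

def turn_command(col, frame_width, gain=0.1):
    """Degrees to turn: positive means right, negative means left."""
    return gain * (col - frame_width / 2)
```

A failure mode consistent with the inconsistency described above: a bright reflection elsewhere in the frame shifts the mean column, which is why restricting the search region helps.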

 

Bhumika Kapur’s Status Report for 11/20

This week I worked with my teammates on the navigation and retrieval process of the robot. We all worked together on those tasks, and our progress is detailed in the team status report. I also worked on improving the CV component of the project, as some errors occasionally occur under different lighting conditions; I am hoping those will be resolved with more testing soon. Next week I will continue to work with my team on the robot and our final presentation.

Ludi Cao’s Status Report for 11/13

At the beginning of the week, I implemented the motor control code for the wheel chassis of the robot. The following video was recorded during the interim demo.

After the demo, I worked with Esther and Bhumika to integrate our subsystems. I first experimented with the relationship between motor spin time and the distance travelled; the robot can now accurately travel a specific small distance in any direction without much drift. We then incorporated the computer vision subsystem into the various navigation sub-components. The robot can move towards an AprilTag from an initial position at a tilted angle: based on the location and angle coordinates from the camera, it first rotates to face parallel to the tag, centers itself horizontally, and then moves towards the tag. The other subsystem we worked on together as a group is laser pointer recognition: centering the robot on the tagged object and then moving forward towards the object. Next week, we will add an IMU sensor for more precise movement in case of drift, and work on the remaining subcomponents: retrieving the item from the shelf and returning to the basket.
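The spin-time/distance relationship mentioned above amounts to a simple calibration: measure how far the robot travels for several spin durations, fit a line, and invert it to get the spin time for a target distance. The measurements below are made up for illustration.

```python
def fit_line(times, dists):
    """Least-squares fit of distance = m * time + b."""
    n = len(times)
    mean_t = sum(times) / n
    mean_d = sum(dists) / n
    m = sum((t - mean_t) * (d - mean_d) for t, d in zip(times, dists)) / \
        sum((t - mean_t) ** 2 for t in times)
    return m, mean_d - m * mean_t

def spin_time_for(distance_cm, m, b):
    """Invert the fit: how long to spin to cover distance_cm."""
    return (distance_cm - b) / m

# hypothetical calibration runs: spin time (s) -> distance travelled (cm)
m, b = fit_line([0.5, 1.0, 1.5, 2.0], [12.0, 25.0, 37.0, 50.0])
```

A linear fit only holds over a limited range and a fixed battery level, which is consistent with the battery-charge sensitivity noted elsewhere in these reports.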

Esther Jang’s Status Report 11/13

Since my subsystems were completed last week and have been performing consistently since then, I did not make any changes to the linear slide or claw subsystems. All of my work this week was done with the rest of the team to help implement navigation and integration, as described in the team's weekly status report.