Team Status Report for 12/04

For the last two weeks, our team worked together to integrate the various subsystems. Attached is a video demonstrating that our robot can autonomously complete the majority of the task, including navigating to the basket and the shelf, identifying the object indicated by the laser pointer, and physically retrieving the item. Compared to the last status report, the robot behaves much more consistently, and most runs are nearly successful. One fix was adding an IMU sensor to the robot so that it can re-orient itself after traveling some distance, accounting for drift in the wheels. We also experimented with finer tuning of the edge detection code, restricting the camera to the specific area where the object is located; this greatly reduces background noise. In addition, we implemented the last step of returning to the basket and dropping the item.

Our team also ran some preliminary tests this past week and worked on the slides for the final presentation. During this process, we ran into inconsistent closing behavior from the claw, but after tweaking the heights of its two sides, the performance is much better. Since the inconsistencies mostly come from the claw wearing down over many runs, we also ordered a new claw in case the current one malfunctions before the final demo.
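As an illustration of the region-of-interest restriction described above, here is a minimal sketch, assuming OpenCV on the Xavier; the function name, thresholds, and ROI values are illustrative rather than our exact code.

    import cv2

    def object_boxes_in_roi(frame, roi, min_area=500):
        """Run edge detection only inside a hand-tuned shelf region.

        frame: BGR camera image; roi: (x, y, w, h) rectangle around the
        shelf. Cropping first keeps background clutter out of the
        contour stage.
        """
        x, y, w, h = roi
        gray = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        boxes = []
        for c in contours:
            bx, by, bw, bh = cv2.boundingRect(c)
            if bw * bh >= min_area:  # drop tiny noise contours
                boxes.append((bx + x, by + y, bw, bh))  # full-frame coords
        return boxes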


We believe our team is on schedule, and the final deliverable of the project is almost finished. Next week our group will mostly focus on the project report, the final video, and last tweaks of parameters to achieve better consistency.

https://www.youtube.com/watch?v=hN603kKcYBs

Team Status Report for 11/20

This week our team worked on the navigation and retrieval process of the robot. After speaking with Tamal and Tao, we reconsidered our robot's sequence of steps and worked on the code required for the robot's navigation and retrieval. In particular, we accomplished the following:

  1. We wrote code for our robot to rotate 360 degrees in search of the April tag on the basket (a sketch of this search loop follows this list). Once the robot detects the tag on the basket, it rotates to be perpendicular to the basket and drives up to the basket, stopping about 1 foot in front of it.
  2. Next, we wrote code for our robot to rotate and look for the April tag on the shelf. Once the tag on the shelf is detected, the robot rotates so that it is facing the shelf head-on.
  3. Then, the robot moves around the basket and drives up to the shelf, stopping about 1 foot from the shelf.
  4. Then the linear slides extend until they reach their maximum height of about three feet.
  5. Once the slides are extended, the robot searches for the laser pointer.
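As a rough illustration of the search loop in step 1, here is a minimal sketch assuming the pupil_apriltags detector; the rotate and stop motor helpers are hypothetical stand-ins for the commands we send to the Arduino, not our exact interfaces.

    import time
    import cv2
    from pupil_apriltags import Detector

    detector = Detector(families="tag36h11")

    def search_for_tag(camera, rotate, stop, tag_id, max_steps=40):
        """Rotate in place in small increments until the given tag is seen.

        camera: anything with read() -> (ok, BGR frame), e.g. cv2.VideoCapture.
        rotate(seconds) / stop(): hypothetical helpers that pulse the chassis
        motors. Returns the tag detection, or None after a full sweep.
        """
        for _ in range(max_steps):  # max_steps small pulses ~ one 360 sweep
            ok, frame = camera.read()
            if ok:
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                for det in detector.detect(gray):
                    if det.tag_id == tag_id:
                        return det  # found; caller then squares up to it
            rotate(0.25)     # short burst of rotation
            stop()
            time.sleep(0.2)  # let the chassis settle before the next frame
        return None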

We accomplished the above steps this week, and a video can be seen below:

Currently, the robot is sometimes able to center itself on the item marked by the laser and drive towards it, but this step does not work every time. Thus, next week we plan on improving the item detection step and working on grabbing the object from the shelf.


Team Status Report for 11/13

This past week, the team primarily worked collaboratively to start making the robot autonomous. We also had demos during class time, where we were able to show the subsystems and receive feedback.

Originally, the code that runs the chassis, slide, and claw subsystems was separate. When we tried combining them, we realized that the Arduino Servo library disables 2 PWM pins on the board (pins 9 and 10 on the Uno); we discovered this after a strange bug where some motors stopped moving. This meant that we could not run our entire system across 2 boards, since we needed all 12 PWM pins for our 6 motors. We concluded that we either needed a servo shield to connect the servo to the Xavier (the Xavier only has GPIO pins, so the servo cannot be driven directly) or a larger Arduino board. We were also running into slight communication-delay issues on the chassis, with one motor being on a separate Arduino from the others. Hence, we replaced our 2 Arduino Unos with a single Arduino Mega 2560 board. Since the Mega has 54 digital pins and 15 PWM pins, we were able to run all our subsystems on one board and also eliminate the communication issues between the 2 boards.
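For reference, the arithmetic behind the swap: our 6 motors need 6 × 2 = 12 PWM pins, exactly what two Unos provide (6 each), so losing 2 pins to the Servo library left us short; the Mega 2560's 15 PWM pins leave slack even with the servo attached.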

For our navigation code, we first focused on getting the robot to navigate to an April tag autonomously. Currently, we rotate and move the robot by powering the chassis motors for a given amount of time. In our tests, we were able to consistently map 1 second of running the motors to 0.25 m of movement. However, translational movement is prone to some drift and acceleration over larger distances. Hence, we plan to mostly keep our movements incremental and purchase an IMU to help correct the resulting disorientation.

Video of 1 second movement to 0.25m translation
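A minimal sketch of this timed translation, with set_wheels as a hypothetical helper that powers all four chassis motors through the Arduino:

    import time

    SECONDS_PER_METER = 1.0 / 0.25  # measured: 1 s of motor power ~ 0.25 m

    def drive_forward(set_wheels, meters):
        """Translate roughly `meters` forward by timing the motors.

        set_wheels(+1) drives forward, set_wheels(0) stops. We keep
        individual moves small since drift and acceleration build up
        over longer distances.
        """
        set_wheels(+1)
        time.sleep(meters * SECONDS_PER_METER)
        set_wheels(0)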

Similarly, we found that we could fairly consistently rotate the robot through a given angle by proportionally mapping motor power-on time to rotation.
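The rotation analogue, sketched with an illustrative timing constant (the measured seconds-per-degree value is not in this report):

    import time

    SECONDS_PER_DEGREE = 0.01  # illustrative only; found by the same timing tests

    def rotate_in_place(set_spin, degrees):
        """Rotate by proportionally mapping angle to motor power-on time.

        set_spin is a hypothetical helper (+1 = clockwise, -1 = counter-
        clockwise, 0 = stop) sent to the chassis Arduino.
        """
        set_spin(1 if degrees > 0 else -1)
        time.sleep(abs(degrees) * SECONDS_PER_DEGREE)
        set_spin(0)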

We then took these concepts and got our robot to navigate to an April tag in its field of view. The tag detection provides the horizontal and depth distances from the camera center as well as the yaw angle of rotation. Using this information, we wrote an algorithm in which the robot first detects the April tag, rotates so that it squarely faces the tag, translates horizontally until it is in front of the tag, and then translates depth-wise up to the tag. We still ran into a few drift issues that we hope to resolve with an IMU, but the results generally performed well.
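Putting these pieces together, the approach sequence reads roughly like the following sketch; detect_tag, rotate, strafe, and drive are assumed names for our detection and timed-motion helpers, not our exact interfaces.

    def navigate_to_tag(detect_tag, rotate, strafe, drive, stop_dist=0.3):
        """Square up to an April tag, center on it, then drive up to it.

        detect_tag() -> (x_off, depth, yaw): the tag's horizontal and depth
        distances (m) from the camera center and its yaw angle (degrees),
        re-read between phases so each move uses fresh measurements.
        """
        _, _, yaw = detect_tag()
        rotate(-yaw)              # 1. rotate until facing the tag squarely
        x_off, _, _ = detect_tag()
        strafe(x_off)             # 2. translate sideways to center on the tag
        _, depth, _ = detect_tag()
        drive(depth - stop_dist)  # 3. drive up, stopping ~1 ft (0.3 m) short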

Our plan is to put an April tag on the shelf and another on the basket so that the robot can navigate both to and from the shelf this way.

We then focused on scanning a shelf for the laser-pointed object. To do this, the robot uses edge detection to get the bounding boxes of the objects in front of it and runs the laser-point detection algorithm. It can then determine which object is selected and center itself in front of that object for grabbing.
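A sketch of the selection step, assuming bounding boxes from the edge-detection pass and the laser dot's pixel location from the laser detector (function names are illustrative):

    def pick_pointed_box(boxes, laser_xy):
        """Return the bounding box that contains the laser dot, if any."""
        if laser_xy is None:
            return None
        lx, ly = laser_xy
        for (x, y, w, h) in boxes:
            if x <= lx <= x + w and y <= ly <= y + h:
                return (x, y, w, h)
        return None

    def centering_error_px(box, frame_width):
        """Signed pixel offset to translate so the box is centered for grabbing."""
        x, _, w, _ = box
        return (x + w / 2) - frame_width / 2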

We tested this with a setup consisting of 2 styrofoam boards found in the lab to replicate a shelf. We placed one board flat across 2 chairs and the other board vertically at a 90-degree angle behind it.

Video of centering to laser pointed box (difficult to see in the video but the right-most item has a laser point on it):

Our next steps are to get the robot to actually grab the appropriate object and to combine our algorithms. We also plan to purchase a few items that we believe will improve our current implementation, such as an IMU for the drift-related issues and a battery connector converter to account for the Xavier's unconventional battery jack port (we have been unable to run the Xavier off a battery because of this). The camera is currently just taped onto the claw while we finish the navigation implementation, but we will mount it wherever is most appropriate once the implementation is complete. Finally, we plan to continue improving our implementation and to be fully ready for testing by the end of next week at the latest.

Team Status Report for 11/6

This week we finished the mechanical assembly of the robot and are preparing for the interim demo next week. Bhumika has mainly worked on the computer vision area, with edge detection, laser detection, and April tag detection all working decently well. The linear slide system is motorized and can successfully extend up and down. The claw has been fully tested and can successfully grab objects within our target dimensions. The navigation code is under implementation: currently, the four motors can receive commands from the Jetson Xavier and spin in the commanded direction. We will continue to fine-tune our individual areas of focus and start working on integration later in the week.

Team Status Report for 10/30

This week we received all of the parts that we ordered and began building our robot. We set up the Jetson Xavier, and Bhumika has been working on the computer vision aspect of the project using it. Ludi was able to assemble most of the chassis, Esther built the linear slides, and both Ludi and Esther are working on getting the motor shields to work. Next week we will continue assembling the robot and integrating the different components, such as the chassis and the linear slides. Overall, we are satisfied with the progress we have made so far.

Team Status Report for 10/23

This week, our team finalized our parts order and placed it. We were hoping that at least some of our parts would arrive by the end of the week, but unfortunately we did not receive them. We expect to have the parts by the start of the upcoming week and will invest a lot of time in building.

We also had discussions regarding ethics in class and learned a lot about the various ethical issues our project may raise. For example, we had not considered that our laser pointer could be misused, that our robot might damage the objects it grabs, or that recording others could be problematic.

Team Status Report for 10/9

This week our team discussed and finalized the hardware components and composed a bill of materials. With the Intel RealSense taking up almost half the budget, our current bill of materials would exceed the limit. We communicated this challenge to the professors and greatly appreciated their understanding, as well as Tao's kindness in lending us his camera to experiment with.

To satisfy the current-draw requirements, we chose the BTS7960 motor driver (https://www.amazon.com/BTS7960-Stepper-H-Bridge-Compatible-Raspberry/dp/B098X4SJR8/ref=sr_1_1?dchild=1&keywords=BTS7960%20DC&qid=1633569262&sr=8-1), since the motors draw up to 8.5 A at max. Unlike the SparkFun L298N motor driver, each BTS7960 can drive only one motor instead of two, so we would need six motor drivers in total. We analyzed the pin layouts of the Arduino and the motor drivers and realized that each motor driver requires 4 digital pins and 2 PWM pins on the Arduino. Since each Arduino has 6 PWM pins as well as 14 digital pins, we would need at least 2 Arduino boards to connect all of our components. Conveniently, the Jetson Xavier has 2 5V USB outputs, which can connect to at most 2 Arduino boards. We finalized our selection of the battery as well. Our design should meet the technical and physical requirements, and we are ready to compose our design report, which is due next week.
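Spelling out that pin arithmetic: 6 drivers × 2 PWM pins = 12 PWM pins and 6 × 4 = 24 digital pins in total, while a single Uno offers only 6 PWM pins, so two Unos (12 PWM pins combined) are the minimum.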

Team Status Report for 10/2

This week, each team member individually researched their assigned tasks. Ludi and Esther also worked on CAD designs for the chassis and the claw of our robot. We met with Tamal and Tao on Wednesday and received feedback on our proposal presentation. A concept we discussed in depth was using a vacuum grip system rather than a claw for retrieving the object. Esther did a lot of research into this idea, and the team decided to proceed with the claw, as we are still very unsure about the vacuum grip design.

We also worked on the design presentation and used the feedback we received last week to update some of our requirements. We discussed our current requirements, such as the distance between objects, the height of the shelf, and the accuracy measurements, and updated them to reflect the needs that arise from our use case.

Team Status Report for 9/25

This past week, we had project proposal presentations during class time. The team finalized the presentation slides on Sunday, and Bhumika presented for the team on Wednesday. It was very interesting to hear about other groups' projects, especially as a reference for fixing parts of our proposal that we could have done better.

We also received some feedback and realized that we need to put more consideration into our requirements. The team mostly did separate research on their respective components this week and will reconvene sometime this upcoming week with our findings.

Team Status Report for 9/18

Early Sunday this past week, the team created the project abstract. We met with Professor Tamal and our TA Tao on Wednesday to discuss it. During this meeting, we decided to add an element of the robot autonomously navigating to and from the shelf. We also learned about April tags and plan to use them to make this navigation process easier. Aside from this change, the other elements of our project will primarily stay the same. We still have more considerations to make regarding implementation details and what hardware to use.

Aside from our project proposal discussion, we have been working on putting together our presentation slides that are due tomorrow.