Team Status Report 4/26

This week we worked on increasing our accuracy by tightening movement tolerances. We tested each box orientation over multiple iterations, tweaking our timing and move sequencing. We are also working to reduce our overall move/reset time so we can sort more boxes per minute. Overall, we did not make any major design changes this week, and our schedule remains the same.
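As a quick sanity check on the move/reset goal, boxes per minute is just 60 divided by the full cycle time. A minimal sketch (the cycle times below are placeholders, not our measurements):

```python
# Rough throughput check: boxes/minute from a full move-and-reset cycle time.
# The example cycle times are placeholders, not measured values.

def boxes_per_minute(cycle_time_s: float) -> float:
    """Sustained sort rate if every box takes cycle_time_s to grab, place, and reset."""
    return 60.0 / cycle_time_s

if __name__ == "__main__":
    for cycle in (10.0, 8.5, 7.0):  # hypothetical cycle times in seconds
        print(f"{cycle:>4.1f} s/box -> {boxes_per_minute(cycle):.1f} boxes/min")
```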

To minimize the risk of wiring damage or snags, we moved the wires to the inside of the arm and improved the organization of our wiring containers.

Team Status Report for 4/19

This week we continued testing and improving our end-to-end system (robot arm + vision/control modules). We improved the smoothness and time efficiency of box pickups and drops through changes to the main control code. We also fixed some bugs in the PC-Arduino connection, added handling for various edge cases, improved the accuracy of the control logic and the commands sent, and fine-tuned the command sequences to match the new arm changes. We achieved a sort rate of 7 items per minute, reliably sorting a box and fully resetting within 10 s.
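For context on the PC-Arduino interfacing work, a minimal pyserial sketch of sending sequential commands and waiting for an acknowledgment might look like the following. The port name, baud rate, "MOVE <joint> <angle>" format, and "OK"/"ERR" replies are assumptions for illustration, not our actual protocol.

```python
# Sketch of sending sequential commands from the PC to the Arduino over serial.
# Port name, command format, and ack strings are assumptions, not the real protocol.
import serial
import time

def send_command(ser: serial.Serial, cmd: str, timeout_s: float = 2.0) -> bool:
    """Write one newline-terminated command and wait for an acknowledgment line."""
    ser.reset_input_buffer()
    ser.write((cmd + "\n").encode("ascii"))
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        line = ser.readline().decode("ascii", errors="ignore").strip()
        if line == "OK":
            return True
        if line.startswith("ERR"):
            return False
    return False  # timed out: treat as an edge case and retry or abort

if __name__ == "__main__":
    with serial.Serial("/dev/ttyACM0", 115200, timeout=0.1) as ser:
        time.sleep(2)  # give the Arduino time to reset after the port opens
        for joint, angle in [("t1", 90), ("t2", 45)]:
            ok = send_command(ser, f"MOVE {joint} {angle}")
            print(f"MOVE {joint} {angle} -> {'ack' if ok else 'no ack'}")
```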

Next, we plan to rerun our unit tests and tighten the accuracy of each submodule, continue end-to-end testing, and focus closely on the command sequencing and PC-Arduino interfacing.

Team Status Report 4/12

These past two weeks, we worked on getting the robot to pick up a box every time. We integrated the vision/control algorithms with the robot kinematics code and were able to reliably pick up multiple boxes in a row. Physically, we updated the arm so that the arm and the electronic controllers sit on the same 3D-printed structure. We also synced the main PC program's axes/commands almost perfectly with the arm's angle quirks to maximize accuracy. Testing the system on example boxes running on the treadmill, we achieved an 80% pickup success rate (8/10 runs). We have already started experimenting with different pickup move paths and command sequencing, as well as with different refresh rates and power ramping for the servos. Next week we will work on dialing in the arm movement and test the system with real drop-off bins.
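One way to picture the axis/angle syncing described above is a small per-joint calibration that maps ideal kinematic angles onto servo commands. This is only a sketch; the offset and scale numbers are hypothetical, not our measured calibration.

```python
# Sketch: map ideal kinematic angles to servo commands using per-joint offset/scale
# calibration. The calibration numbers are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class JointCal:
    offset_deg: float  # where the servo's reported 0 actually sits
    scale: float       # degrees of real motion per commanded degree
    min_deg: float = 0.0
    max_deg: float = 180.0

    def to_command(self, ideal_deg: float) -> float:
        """Convert an ideal kinematic angle to the value sent to the servo."""
        cmd = (ideal_deg - self.offset_deg) / self.scale
        return max(self.min_deg, min(self.max_deg, cmd))  # clamp to the servo range

# Hypothetical calibration, one entry per joint.
CAL = {
    "t1": JointCal(offset_deg=3.0, scale=0.98),
    "t2": JointCal(offset_deg=-1.5, scale=1.02),
}

if __name__ == "__main__":
    for joint, ideal in [("t1", 90.0), ("t2", 45.0)]:
        print(f"{joint}: ideal {ideal:.1f} deg -> command {CAL[joint].to_command(ideal):.1f} deg")
```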

Team Status Report 3/29

This week we completed an initial integration of all of our components (the Arduino arm/kinematics code and the Python vision/kinematics/control code). The integrated system can use the CV module to track QR codes and determine box position, speed, and height, send the robot arm sequential commands, and pick up the moving packages. We were able to pick up multiple packages in a row.
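A hedged sketch of the track-then-command flow: given a tracked box position and speed, predict when the box reaches the pickup point and decide when the arm move must start. The pickup position and arm travel time below are placeholders, not our measured values.

```python
# Sketch: predict intercept time for a box on the belt and schedule the arm move.
# All numbers are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class TrackedBox:
    x_cm: float       # current position along the belt
    speed_cm_s: float

PICKUP_X_CM = 40.0    # assumed pickup point along the belt
ARM_TRAVEL_S = 1.2    # assumed time for the arm to reach the pickup pose

def seconds_until_pickup(box: TrackedBox) -> float:
    """Time until the box reaches the pickup point at its current speed."""
    return (PICKUP_X_CM - box.x_cm) / box.speed_cm_s

def should_start_move(box: TrackedBox) -> bool:
    """Start the arm early enough that it is in position when the box arrives."""
    return seconds_until_pickup(box) <= ARM_TRAVEL_S

if __name__ == "__main__":
    box = TrackedBox(x_cm=25.0, speed_cm_s=10.0)
    print(f"box arrives in {seconds_until_pickup(box):.2f} s, start move: {should_start_move(box)}")
```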

Our current biggest risk is that inaccuracies in each submodule will add up and make the end-to-end system insufficiently accurate and/or reliable. Currently our vision module is not perfectly accurate, and the arm and kinematics are also slightly imperfect and idiosyncratic. We are ramping up our unit testing to keep these issues to a minimum, and we are also optimizing the PC-to-Arduino command sequencing to reduce this risk.
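One hedged way to reason about this stack-up is a simple error budget: compare both the worst-case sum and the root-sum-square of the submodule errors against the tolerance within which the suction cup can still grab a box. The numbers below are illustrative placeholders, not measurements.

```python
# Sketch: combine per-submodule position errors and compare against the pickup
# tolerance. All values are placeholders, not measured data.
import math

errors_cm = {
    "vision": 0.5,      # box localization error
    "kinematics": 0.4,  # forward/inverse kinematics model error
    "arm": 0.6,         # servo repeatability / backlash
}
CAPTURE_RADIUS_CM = 1.5  # assumed radius within which suction still succeeds

worst_case = sum(errors_cm.values())
rss = math.sqrt(sum(e ** 2 for e in errors_cm.values()))

print(f"worst case: {worst_case:.2f} cm, RSS: {rss:.2f} cm, budget: {CAPTURE_RADIUS_CM:.2f} cm")
print("worst case within budget:", worst_case <= CAPTURE_RADIUS_CM)
print("RSS within budget:", rss <= CAPTURE_RADIUS_CM)
```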

Next steps are adding better move selection and QR tracking. We also plan to more definitively test and map out the arm's exact angle idiosyncrasies, and to do more robust testing of the vision module's real-world mappings. Best case, by the end of the week we hope to have the entire box-grabbing system working reliably and with decent accuracy.
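For the real-world mapping testing, one simple option is a linear pixel-to-centimeter calibration from two hand-measured reference points on the belt. This is only a sketch under that assumption; the reference values are hypothetical.

```python
# Sketch: linear pixel -> belt-coordinate (cm) mapping from two reference points
# measured by hand on the belt. The reference values are hypothetical.

def make_px_to_cm(px_a: float, cm_a: float, px_b: float, cm_b: float):
    """Return a function mapping a pixel x-coordinate to a belt x-coordinate in cm."""
    scale = (cm_b - cm_a) / (px_b - px_a)
    return lambda px: cm_a + (px - px_a) * scale

if __name__ == "__main__":
    px_to_cm = make_px_to_cm(px_a=120, cm_a=0.0, px_b=1080, cm_b=60.0)
    for px in (120, 600, 1080):
        print(f"pixel {px} -> {px_to_cm(px):.1f} cm along the belt")
```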

Team Status Report 3/22

This week we put everything together and got our robot to pick up packages off a moving conveyor belt. This required wiring the robot fully, troubleshooting a number of angle-sensor/actuator issues, and programming the Arduino to run multiple to-position moves. This is a big milestone for us: we now have a working robot that can complete tasks. We also tested the CV code for reading the QR value and determining box height, speed, and position on the complete system with boxes on the conveyor belt. Finally, we tested our timing/speed/latency, and we should have enough time to scan, track, and calculate the box variables and move the arm into position to pick up boxes before they fall off the belt.
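As a hedged sanity check on that timing claim, the pipeline (scan + track + arm move) has to fit inside the time the box spends on the remaining belt length. The belt speed, remaining length, and stage latencies below are illustrative placeholders, not our measured numbers.

```python
# Sketch: check that scan + track + arm-move fits inside the time the box spends on
# the remaining belt length. All numbers are illustrative placeholders.

BELT_SPEED_CM_S = 10.0     # assumed belt speed
REMAINING_BELT_CM = 60.0   # assumed distance from first detection to the belt end

SCAN_S = 0.3               # QR detect + decode
TRACK_S = 0.5              # enough frames to estimate speed
ARM_MOVE_S = 1.5           # arm travel into the pickup position

time_on_belt = REMAINING_BELT_CM / BELT_SPEED_CM_S
pipeline = SCAN_S + TRACK_S + ARM_MOVE_S

print(f"time on belt: {time_on_belt:.1f} s, pipeline: {pipeline:.1f} s")
print("enough time:", pipeline < time_on_belt)
```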

Our next steps are fine-tuning our control algorithms over the coming weeks to pick up multiple packages quickly. We need to dial in our time tolerances and create bins for the robot to place boxes into. We also plan to finish linking our Python and Arduino code, which will let us test the QR scanning software and the robotic arm together. Hopefully we can start testing the entire system next week, since the scanning system can already acquire all the necessary metrics and the robotic arm can pick up, move, and drop boxes.


Team Status Report 3/15

This week, we completed our QR tracking algorithm for real-time tracking down the conveyor belt. This program also includes code for determining the box's speed and depth (y-coordinate). We loosely tested it using our Brio3000 webcam on actual boxes moving on the treadmill, and it appears to work accurately. We also wired the robot arm and got it to pick up packages. We tested the voltage feedback for each component and dialed in the angles in the Arduino code so that each actuator has an accurate 0–180° range. We were able to get t1 and t2 to run specific angle moves with feedback.
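A minimal sketch of this kind of QR tracking loop, using OpenCV's QRCodeDetector and estimating speed from consecutive detections, is shown below. The camera index and the idea of using apparent QR size as a height cue are assumptions for illustration, not a description of our exact program.

```python
# Sketch: track a QR code frame-to-frame with OpenCV and estimate speed from
# consecutive detections. Camera index and the size-based height cue are assumptions.
import time
import cv2
import numpy as np

def main() -> None:
    cap = cv2.VideoCapture(0)
    detector = cv2.QRCodeDetector()
    prev_x, prev_t = None, None

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        data, points, _ = detector.detectAndDecode(frame)
        if points is not None and data:
            corners = points.reshape(-1, 2)
            cx = float(corners[:, 0].mean())  # QR center x, pixels
            # Apparent side length grows as the box top gets closer to the camera
            # (i.e., for taller boxes), so it can serve as a rough height cue.
            side_px = float(np.linalg.norm(corners[0] - corners[1]))
            now = time.time()
            if prev_x is not None:
                speed_px_s = (cx - prev_x) / (now - prev_t)
                print(f"id={data} x={cx:.0f}px side={side_px:.0f}px speed={speed_px_s:.0f}px/s")
            prev_x, prev_t = cx, now
        cv2.imshow("qr-tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```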

Our current main concern is the time it will take the robot arm to grab a package from the treadmill. We plan to extensively test the vision program and the PC-Arduino connection, and to tweak the motors and Arduino responsiveness so that we can measure accurate timings.
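One way to get those timing measurements is to time serial round trips from the PC to the Arduino. This sketch assumes a simple "PING"/"PONG" exchange that would need a matching handler on the Arduino side; the port name and baud rate are placeholders.

```python
# Sketch: measure PC <-> Arduino round-trip latency over serial. Assumes the Arduino
# sketch answers "PING\n" with "PONG\n"; port and baud rate are placeholders.
import statistics
import time
import serial

def measure_round_trips(port: str = "/dev/ttyACM0", n: int = 50) -> None:
    with serial.Serial(port, 115200, timeout=1.0) as ser:
        time.sleep(2)  # let the Arduino finish resetting after the port opens
        samples = []
        for _ in range(n):
            start = time.perf_counter()
            ser.write(b"PING\n")
            reply = ser.readline()
            if reply.strip() == b"PONG":
                samples.append((time.perf_counter() - start) * 1000.0)
        if samples:
            print(f"n={len(samples)} mean={statistics.mean(samples):.1f} ms "
                  f"max={max(samples):.1f} ms")

if __name__ == "__main__":
    measure_round_trips()
```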

Our next steps are to finish wiring the robot arm's motors and to get it running multiple sequential moves accurately and quickly. We also plan to complete the Python/Arduino interfacing code so that we can run our vision software as soon as the arm is ready.

Team Status Report for 2/8/25

The most significant risk that we face is getting the precision of the robotic arm accurate enough to locate the boxes. The forward and inverse kinematic equations need to be spot on, along with the sensor data, to locate each box correctly. If the arm isn't precise enough, the vacuum suction will not be able to pick up the box and our MVP will not be achieved. We are managing this risk by focusing on extensive simulation of the kinematic equation solver and by choosing a vacuum suction cup with enough force to pick up objects even if the arm's position isn't extremely precise. Our contingency plan is to have the robotic arm knock the boxes off the conveyor belt instead of using suction, which won't require as much precision.
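To illustrate the kind of kinematic solver we plan to simulate, here is a sketch of forward and inverse kinematics for a planar two-link arm. This is a simplification of our actual arm geometry, and the link lengths are placeholders.

```python
# Sketch: forward/inverse kinematics for a planar two-link arm. This simplifies the
# real arm's geometry; link lengths are placeholders, angles are in radians.
import math

L1, L2 = 20.0, 15.0  # hypothetical link lengths in cm

def forward(t1: float, t2: float) -> tuple[float, float]:
    """End-effector (x, y) from joint angles t1, t2."""
    x = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
    y = L1 * math.sin(t1) + L2 * math.sin(t1 + t2)
    return x, y

def inverse(x: float, y: float) -> tuple[float, float]:
    """Joint angles (one of the two elbow solutions) placing the end effector at (x, y)."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    t2 = math.acos(c2)
    t1 = math.atan2(y, x) - math.atan2(L2 * math.sin(t2), L1 + L2 * math.cos(t2))
    return t1, t2

if __name__ == "__main__":
    t1, t2 = inverse(25.0, 10.0)
    print("angles (deg):", math.degrees(t1), math.degrees(t2))
    print("round trip:", forward(t1, t2))  # should reproduce (25.0, 10.0)
```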