Raunak’s Status Report for 3/22

This week, we made major progress on integrating all of the working parts. Namely, the robotic arm is now able to pick up objects off of the conveyor belt, which is the MVP for our project (see Slack for a video). My contribution mainly involved writing the Arduino code that controls the arm; this is the piece of the code that enables the arm to pick up boxes and drop them off in their bins. In parallel, I also continued working on one of the extra features from last week (the ML object detection code).

While our robotic arm picks up and drops off boxes correctly, the full pick-and-place cycle takes about 15-20 seconds. We want to bring this overall process down to ~9 seconds, and we will also have to test that the arm doesn't make more mistakes at the increased speed. Additionally, we want to connect the camera system to the robotic arm so that we can pick up objects using the QR code location detection.

Matt’s Status Report for 3/15

This week, I completed the code for determining the box's speed and depth (y-coordinate). This involved continuously tracking the QR codes with the webcam and estimating the motion of each code's center over time. I tested it using our Brio3000 webcam on actual boxes moving on the treadmill, and it appears to work accurately.
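The tracking-and-speed step can be sketched as follows. This is a minimal illustration rather than our actual code: it assumes the per-frame QR corner points come from a detector such as OpenCV's `cv2.QRCodeDetector`, and it estimates belt-direction speed with a least-squares fit over timestamped center positions.

```python
def qr_center(points):
    """Center of a detected QR code from its four corner points.
    In the real pipeline the corners would come from something like
    cv2.QRCodeDetector().detect(frame) on each webcam frame."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (sum(xs) / 4.0, sum(ys) / 4.0)

def estimate_speed(samples):
    """Least-squares slope of x-position vs. time, in pixels/second.
    samples: list of (t_seconds, x_pixels) from successive frames."""
    n = len(samples)
    mt = sum(t for t, _ in samples) / n
    mx = sum(x for _, x in samples) / n
    num = sum((t - mt) * (x - mx) for t, x in samples)
    den = sum((t - mt) ** 2 for t, _ in samples)
    return num / den
```

Fitting over a window of frames, rather than differencing two frames, keeps the estimate stable against per-frame detection jitter.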

I plan on completing the Arduino code and the Python/Arduino interfacing code so that we can immediately run our software as soon as the arm is ready. We will also soon start testing our QR / movement code with the arm.
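One simple shape the Python/Arduino interfacing code could take is a line-based ASCII protocol over USB serial. This is a sketch, not our agreed protocol: the "MOVE" framing is hypothetical, and it assumes pyserial is installed on the control PC with a matching parser on the Arduino side.

```python
def frame_command(theta1, theta2, vacuum_on):
    """Encode one arm command as a newline-terminated ASCII line.
    The "MOVE t1 t2 vac" format here is hypothetical; the Arduino
    sketch would need a matching parser on its end."""
    return "MOVE {} {} {}\n".format(int(theta1), int(theta2), 1 if vacuum_on else 0)

def send_command(port, theta1, theta2, vacuum_on):
    """Ship a framed command to the Arduino over USB serial.
    Assumes the pyserial package is available."""
    import serial
    with serial.Serial(port, 115200, timeout=1) as conn:
        conn.write(frame_command(theta1, theta2, vacuum_on).encode("ascii"))
```

A human-readable protocol like this is easy to debug with the Arduino IDE's serial monitor before the Python side is hooked up.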

Team’s Status Report for 3/15

This week, we completed our QR tracking algorithm for real-time tracking down the conveyor belt. The program also includes code for determining the box’s speed and depth (y-coordinate). We loosely tested it using our Brio3000 webcam on actual boxes moving on the treadmill, and it appears to work accurately. We also wired the robot arm and got it to pick up packages. We tested the voltage feedback for each component and tuned the angle mapping in the Arduino code so that each actuator reads an accurate 0–180° range. We were able to get t1 and t2 to run specific angle moves with feedback.

Our current main concern is the time it takes the robot arm to grab a package from the treadmill. We plan to extensively test the vision program and the PC-Arduino connection, tweak the motor and Arduino responsiveness, and measure accurate timings.

Our next steps are to finish wiring the robot arm’s motors and get it to run multiple sequential moves accurately and quickly. We also plan on completing the Python/Arduino interfacing code so that we can run our vision software as soon as the arm is ready.

Marcus’s Status Report for 3/15

This week I soldered most of the arm wiring and connected it to an Arduino controller. I tested the voltage feedback for each component. The “360°” angle sensors covered more like 180°, so I had to account for dead space in the actuator movement. It took many iterations of the Arduino code to dial in the angles so that each actuator reads an accurate 0–180° range. I then wired the motors for t1 and t2; I had to get the direction of rotation and the limit switches right by flipping wires. I was able to get t1 and t2 to run specific angle moves with feedback. Finally, I wired the vacuum and got the robot to perform a box pick-up move.
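The dead-space handling amounts to clamping the raw feedback reading to the sensor's live range before rescaling it to degrees. Sketched here in Python for clarity (the real version lives in the Arduino C code, and the calibration constants below are hypothetical, found empirically per actuator as described):

```python
def reading_to_angle(raw, raw_min, raw_max):
    """Map a raw feedback reading to degrees in [0, 180], clamping
    readings that fall in the sensor's dead space outside the live
    [raw_min, raw_max] range. raw_min/raw_max are per-actuator
    calibration constants measured at the 0° and 180° stops."""
    raw = max(raw_min, min(raw_max, raw))
    return 180.0 * (raw - raw_min) / (raw_max - raw_min)
```

Clamping first means a reading inside the dead zone reports the nearest end stop instead of an out-of-range angle.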


Raunak’s Status Report for 3/15

This week I helped with integrating the camera system with the treadmill and verifying that QR codes could be scanned. We tested the camera at varying heights and positions over the treadmill and marked the region we found best for positioning (far enough from the head of the treadmill to place boxes, and high enough to see the entire width of the belt).

Since we have already completed much of the kinematics software, I wanted to add an ML component to the project, so I decided to start working on a system that detects objects on the treadmill. My idea is that instead of QR codes labelling the category of each box, which can be tedious in real life, the algorithm could identify the object using CV. This is a reach goal for our project because getting an accurate classifier for a variety of objects and integrating it into the system is a difficult task. Nevertheless, this week I started coding up the classification algorithm using PyTorch and Torchvision.
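As a starting point, the classifier can lean on a pretrained Torchvision model rather than training from scratch. The following is a sketch under that assumption: ResNet-18 is a placeholder model choice, the code assumes torch, torchvision, and Pillow are installed, and `image_path` stands in for a cropped frame from the treadmill camera.

```python
def top_k(scores, labels, k=3):
    """Return the k highest-scoring (label, score) pairs."""
    ranked = sorted(zip(labels, scores), key=lambda p: p[1], reverse=True)
    return ranked[:k]

def classify(image_path):
    """Classify one image with a pretrained Torchvision model.
    Sketch only: assumes torch/torchvision/Pillow are installed,
    and that ImageNet categories are close enough for a first test."""
    import torch
    from torchvision import models
    from PIL import Image

    weights = models.ResNet18_Weights.DEFAULT
    model = models.resnet18(weights=weights)
    model.eval()
    img = weights.transforms()(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        scores = model(img).squeeze(0).softmax(0).tolist()
    return top_k(scores, weights.meta["categories"])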

IMG_2592.jpeg

Matt’s Status Report for 3/8

This week, I completed the QR scanning code and successfully tested it using our Brio3000 webcam. In addition to reading the QR code value, I completed the code for getting the QR’s position and size. I also tested this on real-world QR codes using the Brio3000. This code should allow us to estimate a box’s depth and speed in our robotic arm system.
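The depth estimate follows from the pinhole camera model: a QR code of known physical side length W that appears w pixels wide sits at depth Z = f·W/w, where f is the camera's focal length in pixels. A sketch of that step (the focal length would come from calibrating the Brio3000; the numbers in the usage note are hypothetical):

```python
def depth_from_qr(focal_px, qr_side_m, qr_side_px):
    """Pinhole-camera depth estimate: Z = f * W / w.
    focal_px   -- camera focal length in pixels (from calibration)
    qr_side_m  -- known physical side length of the QR code, meters
    qr_side_px -- measured side length in the image, pixels
    """
    return focal_px * qr_side_m / qr_side_px
```

For example, a 5 cm code seen 100 px wide by a camera with a 1000 px focal length would be estimated at 0.5 m.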

Next week, I plan on testing the QR code on the entire treadmill system. This will involve mounting the webcam above the treadmill and running the QR code scanning, position, and size tests on real boxes that are moving on the treadmill.

Team’s Status Report for 3/8/25

This week we worked on the design report and planned out how our algorithms will connect. We discussed using pixel differences in the QR code to judge the height of the box on the treadmill. We are working to test the camera and the rest of our electrical components. Once the robot arm works, we are going to feed it live position commands calculated from camera data.

Here is the robot arm motor control box at about the halfway point of soldering and wiring.


Marcus’s Status Report for 3/8

These past weeks I worked on the report and on wiring the robot. I was able to wire about half of the robot and run tests on one of the motors and angle sensors, and I created a preliminary Arduino angle feedback control system for one motor. One setback is that the “360°” angle sensors actually covered about 250°, and I think the underlying cause is the ESP32 only supporting 3.3 V rather than 5 V. This problem might require a switch to an Arduino. Otherwise everything is going smoothly; next week I am going to finish wiring the robot arm and do full movement tests.


Raunak’s Status Report for 3/8/25

This week, we got started setting up the computer vision system. Our camera was delivered, and we verified that it was able to scan and track QR codes, which is essential to our system working. We also finished testing the kinematic equation solver and verified that it works completely (via a matplotlib simulation). In addition to all of this, I spent a lot of time on the design report, making sure we included a significant amount of detail for each part. I focused on the Abstract, Introduction, Use-Case Requirements, and Architecture portions of the report.
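A solver of the kind described, reduced here to a planar two-link arm, can be verified by round-tripping targets through forward kinematics before plotting with matplotlib. This is an illustrative sketch; our arm's actual geometry and link lengths differ.

```python
import math

def ik_two_link(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm
    (one elbow branch). Returns joint angles (t1, t2) in radians
    for a reachable target (x, y); link lengths are hypothetical."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    t2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp for float safety
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2), l1 + l2 * math.cos(t2))
    return t1, t2

def fk_two_link(t1, t2, l1, l2):
    """Forward kinematics: end-effector position for joint angles.
    Used to check the solver (and to draw the arm in a simulation)."""
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    return x, y
```

Checking that `fk_two_link(*ik_two_link(x, y, l1, l2), l1, l2)` returns the original target is the kind of verification the matplotlib simulation makes visible.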

Next week I want to start simulating the software on the robotic arm and the camera. For instance, we want to get the kinematic equations working with the robotic arm and see if it is able to pick up objects. We also want to get the camera working with the CV software so that we can take the scanned QR codes, convert them to numbers, and send this information back to the robotic arm so that it knows where to place each box.

Marcus’s Status Report for 2/22

This week I moved a treadmill to a stable location and ran speed tests. Additionally, I took it apart to see the motor and pulley setup. I plan on ordering electronic components in the next week; I need to ask where to get bulk 18–22 AWG wire. I also made a 3D kinematics visualizer and move solver.
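The core math behind a 3D visualizer for a base-rotation joint plus a two-link arm reduces to solving the chain in its vertical plane and then rotating by the base yaw. A minimal sketch of that idea (the joint layout is an assumption, and the real visualizer would feed points like these to matplotlib's 3D axes):

```python
import math

def fk_3d(base_yaw, t1, t2, l1, l2):
    """End-effector (x, y, z) for a base-rotate + two-link arm:
    solve the planar chain for horizontal reach r and height z,
    then rotate the reach by the base yaw about the vertical axis.
    Joint layout and link lengths here are assumptions."""
    r = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)  # horizontal reach
    z = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)  # height
    return (r * math.cos(base_yaw), r * math.sin(base_yaw), z)
```

Computing the intermediate joint position the same way gives the second point needed to draw each arm segment in 3D.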