Matt’s Status Report for 3/15

This week, I completed the code for determining the box’s speed and depth (y-coordinate). This involved tracking the QR codes with the webcam and estimating how each code’s center position moves over time. I tested it using our Brio3000 webcam on actual boxes moving on the treadmill, and it appears to work accurately.
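
Roughly, the tracking loop looks like the sketch below (simplified, assuming OpenCV’s QRCodeDetector; the camera index, axis convention, and the pixel-to-real-world calibration are placeholders, not our final values):

import time
import cv2

detector = cv2.QRCodeDetector()
cap = cv2.VideoCapture(0)  # camera index is machine-dependent

prev_center, prev_time = None, None
while True:  # Ctrl+C to stop
    ok, frame = cap.read()
    if not ok:
        break
    # detect() returns the four corner points of the QR code, if one is visible
    found, points = detector.detect(frame)
    if found and points is not None:
        center = points.reshape(-1, 2).mean(axis=0)  # average the 4 corners -> (x, y)
        now = time.time()
        if prev_center is not None:
            # pixel speed along the belt; a px-to-cm calibration factor
            # would convert this to real-world speed
            speed = (center[0] - prev_center[0]) / (now - prev_time)
            print(f"depth y={center[1]:.0f}px  speed={speed:.1f}px/s")
        prev_center, prev_time = center, now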

I plan on completing the Arduino code and the Python/Arduino interfacing code so that we can run our software as soon as the arm is ready. We will also soon start testing our QR/movement code with the arm.

Team’s Status Report for 3/15

This week, we completed our QR tracking algorithm for real-time tracking of boxes moving down the conveyor belt. This program also includes the code for determining each box’s speed and depth (y-coordinate). We did some initial testing with our Brio3000 webcam on actual boxes moving on the treadmill, and it appears to work accurately. We also wired the robot arm and got it to pick up packages. We tested the voltage feedback for each component and tuned the angle mappings in the Arduino code so that each actuator reads an accurate 0–180° range. We were able to get t1 and t2 to run specific angle moves with feedback.

Our current main concern is the time it will take the robot arm to grab a package from the treadmill. We plan to extensively test the vision program and the PC-Arduino connection, and to tweak the motor and Arduino responsiveness so that we can measure accurate timings.

Our next steps are to finish wiring the robot arm’s motors and to get it running multiple sequential moves, accurately and quickly. We also plan on completing the Python/Arduino interfacing code so that we can run our vision software as soon as the arm is ready.
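
For the interfacing piece, the plan is a simple serial link between the PC and the Arduino. A minimal sketch of what that could look like with pyserial (the port name, baud rate, and comma-separated message format are assumptions, not final decisions):

import serial  # pyserial

arduino = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

def send_angles(t1, t2):
    """Send target joint angles as one comma-separated line."""
    arduino.write(f"{t1},{t2}\n".encode())
    return arduino.readline().decode().strip()  # wait for the Arduino's ack

print(send_angles(90, 45))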

Marcus’s Status Report for 3/15

This week I soldered most of the arm wiring and connected it to an Arduino controller. I tested the voltage feedback for each component. The 360° angle sensors only covered about 180° in practice, so I had to account for the dead space in the actuator movement. It took many iterations of the Arduino code to get the angles dialed in so that each actuator reads an accurate 0–180° range. I then wired the motors for t1 and t2. I had to flip wires to get the direction of rotation and the limit switches right. I was able to get t1 and t2 to run specific angle moves with feedback. I then wired the vacuum and got the robot to do a box pick-up move.
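
The dead-space fix boils down to remapping only the usable part of the sensor’s range onto 0–180°. A sketch of the idea in Python (the raw calibration constants here are made up; the real values came from the voltage-feedback tests, and the actual mapping lives in the Arduino code):

RAW_AT_0, RAW_AT_180 = 212, 811  # illustrative ADC readings at the physical limits

def raw_to_degrees(raw):
    """Map a raw feedback reading to 0-180 deg, clamping out the dead zones."""
    raw = min(max(raw, RAW_AT_0), RAW_AT_180)
    return 180.0 * (raw - RAW_AT_0) / (RAW_AT_180 - RAW_AT_0)

print(raw_to_degrees(511))  # ~90 deg at mid-travel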

Raunak’s Status Report for 3/15

This week I helped integrate the camera system with the treadmill and verify that QR codes could be scanned. We tested the camera at varying heights and positions on the treadmill and marked the region we found best for positioning it (far enough from the head of the treadmill to place boxes, and high enough to see the entire width of the treadmill).

Since we have already completed a lot of the kinematics software, I wanted to add an ML component to the project, so I decided to start working on a system that detects objects on the treadmill. My idea was that instead of QR codes labelling each box’s category, which can be tedious in real life, we could have an algorithm determine the object’s category using CV. This is a reach goal for our project because getting an accurate classifier for a variety of objects and integrating it into the system is a difficult task. Nevertheless, this week I started coding up the classification algorithm using PyTorch and Torchvision.
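
As a starting point, here is a minimal sketch of what the classifier could look like, using a pretrained torchvision backbone (this assumes the torchvision 0.13+ weights API; the model choice, the image path, and the plan to fine-tune on our own package categories are all open decisions):

import torch
from torchvision import models
from PIL import Image

weights = models.MobileNet_V2_Weights.DEFAULT  # pretrained ImageNet weights
model = models.mobilenet_v2(weights=weights).eval()
preprocess = weights.transforms()  # resizes/normalizes to what the model expects

img = Image.open("box_crop.jpg")  # placeholder: a cropped image of one box
with torch.no_grad():
    logits = model(preprocess(img).unsqueeze(0))
label = weights.meta["categories"][logits.argmax().item()]
print(label)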

[Photo: IMG_2592.jpeg]

Raunak’s Status Report for 3/8/25

This week, we got started setting up the computer vision system. Our camera was delivered and we verified that it is able to scan and track QR codes, which is essential to our system working. We also finished testing the kinematic equation solver and verified that it is working completely (via matplotlib simulation). In addition to all of this, I spent a lot of time on the design report, making sure we included a significant amount of detail for each part. I focused on the Abstract, Introduction, Use-Case Requirements, and Architecture portions of the report.
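
The matplotlib check amounts to drawing the arm at the solver’s output angles and confirming that the end effector lands on the target. A simplified 2-link version of that plot (the link lengths and angles shown are illustrative):

import numpy as np
import matplotlib.pyplot as plt

L1, L2 = 10.0, 10.0            # illustrative link lengths (cm)
t1, t2 = np.radians([40, 60])  # candidate angles from the solver

# Forward kinematics for the joint positions
elbow = np.array([L1 * np.cos(t1), L1 * np.sin(t1)])
tip = elbow + [L2 * np.cos(t1 + t2), L2 * np.sin(t1 + t2)]

plt.plot([0, elbow[0], tip[0]], [0, elbow[1], tip[1]], "o-")
plt.scatter(*tip, c="r", label=f"end effector {np.round(tip, 2)}")
plt.axis("equal")
plt.legend()
plt.show()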

Next week I want to start running the software on the robotic arm and the camera. For instance, we want to get the kinematic equations to work with the robotic arm and see if it is able to pick up objects. We also want to get the camera working with the CV software: taking the scanned QR codes, converting them to numbers, and sending this information to the robotic arm so that it knows where to place each box.
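
The QR-to-number step itself should be small. A sketch with OpenCV (the frame path and the convention that each QR payload is just a category number are assumptions):

import cv2

detector = cv2.QRCodeDetector()
frame = cv2.imread("frame.jpg")  # placeholder: one captured webcam frame
# detectAndDecode returns the decoded payload string plus corner points
data, points, _ = detector.detectAndDecode(frame)
if data:
    category = int(data)  # payload encodes the box's category as a number
    print(f"box category: {category}")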

Raunak’s Status Report for 2/22

This week I continued working on the software for our robotic arm. I tested the kinematic equations to make sure they were working as expected. To do this, I created a bunch of test cases for robotic arm positions and their corresponding angles and verified that our equations solved them correctly. Other than that, I started to work on the design report, adding information from our slide deck as well as the feedback we received. We also ordered and received the camera that we will be using for the box localization, which will be a major part of the project.

Next week, I plan to help set up the camera and integrate it into our system. We want to ensure that our computer receives images correctly so that we can eventually pipeline them into the CV algorithm. Other than that, if we have time, I plan on working more on the CV code and testing it with the images we get from our camera. This will serve as an integration test for the entire CV system, ensuring that it works as expected. I also plan on helping finish the design report due next week.

Raunak’s Status Report for 2/15

This week, I made a lot of progress on the software aspects of the project. Specifically, we wrote the basic forward and inverse kinematic equations in Python to derive the joint angles the robotic arm needs in order to reach a particular object’s position. This is the heart of the software that controls the robotic arm, so finishing the basic code for it this week was a huge accomplishment. I also worked with Marcus on the design presentation for next week, making the suggested improvements from the proposal presentation. Specifically, we worked on making the slides less text-heavy, including more diagrams, and formatting the slides better.
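
For the 2-link planar case, the equations we mean are the standard law-of-cosines form. A minimal sketch (the link lengths are illustrative, this picks one of the two elbow solutions, and the real arm has more axes):

import numpy as np

L1, L2 = 10.0, 10.0  # illustrative link lengths (cm)

def forward(t1, t2):
    """Forward kinematics: joint angles (rad) -> end-effector (x, y)."""
    x = L1 * np.cos(t1) + L2 * np.cos(t1 + t2)
    y = L1 * np.sin(t1) + L2 * np.sin(t1 + t2)
    return x, y

def inverse(x, y):
    """Inverse kinematics via the law of cosines."""
    c2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
    t2 = np.arccos(np.clip(c2, -1.0, 1.0))
    t1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(t2), L1 + L2 * np.cos(t2))
    return t1, t2

# Round-trip check: IK followed by FK should recover the target point.
print(forward(*inverse(12.0, 5.0)))  # ~ (12.0, 5.0)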

Next week, I plan on simulating the kinematic equation solver in Python to verify that it works as expected. Matplotlib has plotting tools that can work for this. We basically just need to verify that the angles and position of the robotic arm are as expected. We also might get started on the CV aspect of the project, which involves recognizing QR codes and telling the robotic arm what to do for each category of QR code.

Matt’s Status Report for 2/8/25

I used this week to start coding the inverse kinematics equations for calculating and controlling the robot arm’s movement. I watched a couple of videos on inverse kinematics for robotics (https://www.youtube.com/watch?v=qFE-zuD6jok&ab_channel=EngineerM, https://www.youtube.com/watch?v=nW5FUVzYCKM&ab_channel=KevinMcAleer), particularly on Python implementations. Using what I learned, I coded some initial inverse kinematics in Python, since we plan on using Matplotlib physics simulations for testing while the robot arm is still being built.

Team Status Report for 2/8/25

The most significant risk that we face is getting the precision of the robotic arm to be accurate enough to locate the boxes. We will need the forward and inverse kinematic equations to be spot on, along with the sensor data, to be able to locate each box correctly. If this isn’t precise enough, the vacuum suction will not be able to pick up the box and our MVP will not be achieved. We are going to manage this risk by focusing on significant simulation for the kinematic equation solver and also getting a vacuum gripper with strong enough suction force to pick up objects even if the arm’s location isn’t extremely precise. Our contingency plan might be to have the robotic arm punch the boxes off the conveyor belt instead of using suction, which won’t require as much precision.

Marcus’s Status Report for 2/8/25

This week I modeled and printed a prototype arm. The design is a 4-axis suction arm with 3 motor axes and 1 servo axis. I routed wire paths through the 3D-printed parts and worked on the part tolerances. Currently we don’t have all of the hardware to make the arm function: we still need angle sensors, servos, wire, suction parts, etc.

I would say that we are on schedule to have a working robot arm in the next few weeks.

Next class I want to wire the angle sensors to the Arduino, and potentially the motors that control the arm. I also want to test out the suction gripper. I will continue revising the hardware.