This week, I continued testing for our final demo and made a lot of tweaks and minor improvements to the vision and main modules. For the vision code, I made the speed calculation even more accurate, shrinking its +/- range by about 30% by more accurately factoring in box height. I also made the main module’s initial pickup time-sequencing more accurate by incorporating additional minor variables into the calculation. Additionally, I standardized angle values by updating and using a forward kinematics function. The main improvements I made were to the timing, pathing, and adjustments of the command sequences sent to the Arduino, focusing on improving grab rate and speed.
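A forward kinematics function like the one mentioned above could look like this minimal sketch, assuming a planar two-link arm; the link lengths are hypothetical placeholders, not the real arm’s dimensions:

```python
import math

# Hypothetical link lengths in cm; the real arm's geometry would replace these.
L1 = 20.0
L2 = 15.0

def forward_kinematics(theta1, theta2):
    """Map shoulder/elbow angles (radians) to the gripper's (x, y) position.

    Standardizing on one function like this keeps every module's angle
    convention consistent (same zero reference, same sign direction).
    """
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y
```

Having one shared function means the vision, kinematics, and command modules can all agree on what a given angle pair means in real-world coordinates.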
Matt’s Status Report for 4/19
This week, I made a lot of improvements to the main file and how it uses the QR data and kinematics to send commands to the arm. In particular, I tested and improved the smoothness and time-efficiency of box pickups and drops. I also implemented safeguards and handling for scenarios in which the box is out of reach (the arm can’t quite reach the entire treadmill width). Additionally, I tweaked the bin dropoff locations to be much more accurate and fixed some bugs with the Arduino connection. Since we made some major changes to the arm last week, I also spent a lot of time fine-tuning the command sequences again. Finally, I experimented with changing how the arm picks up boxes, moving it in line with the treadmill during pickup instead of straight up and down to reduce wear on the gripper.
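The out-of-reach safeguard could be sketched as a simple workspace check before any grab is attempted; the reach radius below is a hypothetical stand-in for the arm’s real limit:

```python
import math

# Hypothetical maximum reach in cm, measured from the arm's base.
MAX_REACH_CM = 35.0

def is_reachable(x, y):
    """Return True if the target point lies within the arm's reach radius."""
    return math.hypot(x, y) <= MAX_REACH_CM

def plan_pickup(x, y):
    """Skip boxes the arm physically cannot reach instead of sending a
    command that would stall or strain the servos."""
    if not is_reachable(x, y):
        return None  # caller lets the box pass rather than attempt a grab
    return (x, y)
```

Checking reachability up front keeps a far-side box from triggering a doomed grab sequence.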
Matt’s Status Report for 4/12
This week, I fine-tuned the vision module’s height and speed calculations to more accurately handle edge cases (box on the edge of the treadmill, tall box, etc.). I also fine-tuned the angles of the main file’s inverse kinematics outputs to align with the arm’s idiosyncrasies (the arm plane is slightly tilted). I was able to sync the main program’s axes/commands almost perfectly with the arm’s angle quirks to maximize accuracy. I also tweaked the timing/buffering of how the main file sends commands to the arm to ensure smoothness and reliability during pickup; in particular, it previously had problems moving the box up and over to the bin after grabbing it. We were able to test the system on example boxes running on the treadmill and achieve 80% pickup success (8/10 runs).
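One way to encode this kind of angle alignment is a per-joint offset/scale correction applied to the ideal inverse-kinematics outputs. The joint names and numbers below are purely illustrative, not the arm’s actual calibration:

```python
# Hypothetical per-joint corrections (offset in degrees, scale factor),
# tuned by comparing commanded angles against the arm's actual pose.
JOINT_CORRECTIONS = {
    "base": (2.5, 1.00),
    "shoulder": (-1.0, 0.98),
    "elbow": (0.5, 1.02),
}

def correct_angles(ik_angles):
    """Adjust ideal inverse-kinematics angles for the arm's tilted plane."""
    return {
        joint: (angle + JOINT_CORRECTIONS[joint][0]) * JOINT_CORRECTIONS[joint][1]
        for joint, angle in ik_angles.items()
    }
```

Keeping the corrections in one table makes re-tuning quick whenever the arm’s geometry changes.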
Matt’s Status Report for 3/29
This week, I tested the entire software system with the robotic arm. This involved running the main function, which relied on the QR scanning and kinematics modules to send commands to the Arduino controlling the arm. I was able to successfully verify the general logic of the main function and send accurate, fast commands to the robotic arm’s Arduino. I also extended the main file this week to incorporate the new additions to the QR scanning file and the better understanding I gained of the Arduino control inputs. Finally, I spent a lot of time tweaking and improving the main and QR scanner files, as there were numerous issues with the accuracy and precision of predicting the box’s movements and converting the camera readings into real-world coordinates relative to the arm (pixels -> centimeters).
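The pixel-to-centimeter conversion can be sketched as a linear mapping calibrated against a known real-world distance in the camera frame; every constant below is a hypothetical placeholder, not a measured value:

```python
# Hypothetical calibration: the treadmill spans a known width in both
# centimeters (measured by hand) and pixels (measured in the camera frame).
TREADMILL_WIDTH_CM = 35.0
TREADMILL_WIDTH_PX = 1120.0
CM_PER_PX = TREADMILL_WIDTH_CM / TREADMILL_WIDTH_PX

# Pixel coordinates of the arm's base in the camera frame (hypothetical).
ARM_ORIGIN_PX = (640.0, 360.0)

def pixels_to_arm_cm(px, py):
    """Convert a camera pixel coordinate to centimeters relative to the arm."""
    return ((px - ARM_ORIGIN_PX[0]) * CM_PER_PX,
            (py - ARM_ORIGIN_PX[1]) * CM_PER_PX)
```

A single scale factor like this assumes the camera looks straight down; a tilted camera would need a full homography instead.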
I hope to continue testing the software system that runs on PC (vision, kinematics, and command modules) with the arm. I plan to do some more robust testing of the vision module’s real-world mapping as well by placing boxes of different heights at specific locations on the treadmill running at specific speeds. Hopefully I’ll be able to verify that the real-world position/speed mappings are accurate now that I’ve revamped that code. I also hope to at least gain a better understanding of the exact arm angles relative to the command inputs. Best case, by the end of the week, I will have the entire box-grabbing system working in a reliable and decently accurate manner.
Matt’s Status Report for 3/22
This week, I tested the QR scanning code for getting the QR value, determining the box height, and determining the box speed/position on our complete system. After creating physical boxes and attaching various QR codes to them, I ran them on the conveyor belt at different speed settings with our Brio3000 webcam mounted above. I was able to successfully verify the correctness and accuracy of the code in determining the box value, height, speed, and position.
I hope to finish linking our Python and Arduino code, which will enable us to test the QR scanning software and the robotic arm together. Hopefully we can also start testing the entire system next week, since the scanning system can currently acquire all the necessary metrics, and the robotic arm can currently pick up, move, and drop boxes.
Matt’s Status Report for 3/15
This week, I completed the code for determining the box’s speed and depth (y-coordinate). This involved continuously tracking the QR codes with the webcam and estimating the movement of each code’s center position over time. I tested it using our Brio3000 webcam on actual boxes moving on the treadmill, and it seems to work accurately.
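The speed estimate essentially boils down to differencing the tracked center positions over time. A minimal sketch, assuming the per-frame positions have already been converted to centimeters and timestamped:

```python
def estimate_speed(samples):
    """Estimate belt speed (cm/s) from timestamped center positions.

    `samples` is a list of (timestamp_s, y_cm) pairs taken from successive
    frames; differencing the endpoints keeps the estimate simple and
    averages out per-frame tracking jitter.
    """
    (t0, y0), (t1, y1) = samples[0], samples[-1]
    return (y1 - y0) / (t1 - t0)
```

Using the first and last samples rather than adjacent frames trades responsiveness for noise rejection, which suits a belt running at a constant speed.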
I plan on completing the Arduino code and the Python/Arduino interfacing code so that we can immediately run our software as soon as the arm is ready. We will also soon start testing our QR / movement code with the arm.
Matt’s Status Report for 3/8
This week, I completed the QR scanning code and successfully tested it using our Brio3000 webcam. In addition to reading the QR code value, I completed the code for getting the QR’s position and size. I also tested this on real-world QR codes using the Brio3000. This code should allow us to estimate a box’s depth and speed in our robotic arm system.
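Given the four corner points a QR detector returns (for example, OpenCV’s `cv2.QRCodeDetector.detectAndDecode`), the code’s position and apparent size fall out of simple geometry. A minimal sketch of that step:

```python
import math

def qr_center_and_size(corners):
    """Given the four corner points of a detected QR code, return its
    pixel center and apparent side length (mean of the four edges).

    The center gives the box's position; the side length shrinks with
    distance, so it can be used to estimate box height under the camera.
    """
    cx = sum(x for x, _ in corners) / 4.0
    cy = sum(y for _, y in corners) / 4.0
    side = sum(
        math.dist(corners[i], corners[(i + 1) % 4]) for i in range(4)
    ) / 4.0
    return (cx, cy), side
```

Averaging all four edges rather than using one makes the size estimate robust to slight perspective skew.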
Next week, I plan on testing the QR scanning code on the entire treadmill system. This will involve mounting the webcam above the treadmill and running the QR value, position, and size tests on real boxes moving on the treadmill.
Matt’s Status Report for 2/22
This week, I completed the initial QR code scanning code and successfully tested it on an online QR code dataset. I also improved the main.py file to integrate both the vision and kinematics modules and send corresponding placeholder commands to the Arduino.
Matt’s Status Report for 2/15
This week, I completed the code for the kinematics equations in Python, wrote a code outline for connecting to and communicating with the Arduino, and helped Raunak start the code for QR code scanning. I also contributed to creating the design review presentation; specifically, I helped with writing the low-level breakdowns and images.
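An Arduino-communication outline like the one mentioned might look like the following. Both the pyserial dependency and the newline-terminated `joint:angle` text protocol are assumptions for illustration; the real protocol depends on the Arduino sketch:

```python
def format_command(joint, angle_deg):
    """Encode one servo command as a newline-terminated ASCII line,
    e.g. b"base:90.0\\n" (hypothetical protocol)."""
    return f"{joint}:{angle_deg:.1f}\n".encode("ascii")

def send_angles(port, angles):
    """Open the serial port and stream one command per joint.

    pyserial is imported lazily so the rest of the software can be
    tested on a machine without the Arduino attached.
    """
    import serial  # pip install pyserial
    with serial.Serial(port, baudrate=9600, timeout=1) as conn:
        for joint, angle in angles.items():
            conn.write(format_command(joint, angle))
```

Keeping the encoding in its own function means the Python side and the Arduino parser can be tested against the same format independently.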
I’m working to finish the QR scanning code and test it on real QR code datasets. Ideally, I also hope to get it working on real-life QR codes that we print out, since our camera should arrive next week.
Matt’s Status Report for 2/8/25
I used this week to start working on coding the inverse kinematics equations for calculating/controlling the robot arm movement. I watched a couple videos on inverse kinematics for robotics (https://www.youtube.com/watch?v=qFE-zuD6jok&ab_channel=EngineerM, https://www.youtube.com/watch?v=nW5FUVzYCKM&ab_channel=KevinMcAleer), particularly on Python implementations. Using what I learned, I coded some initial inverse kinematics in Python, since we plan on using Matplotlib physics simulations to test while the robot arm is still being completed.
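For a planar two-link arm, the inverse kinematics reduce to the law of cosines plus an angle offset. A minimal sketch along those lines, with hypothetical link lengths standing in for the real arm’s dimensions:

```python
import math

# Hypothetical link lengths in cm; the real arm's dimensions go here.
L1 = 20.0
L2 = 15.0

def inverse_kinematics(x, y):
    """Solve the two-link planar arm for (shoulder, elbow) angles in
    radians that place the end effector at (x, y).

    Uses the law of cosines for the elbow, then subtracts the elbow's
    contribution from the angle to the target to get the shoulder.
    """
    d2 = x * x + y * y
    cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(
        L2 * math.sin(elbow), L1 + L2 * math.cos(elbow)
    )
    return shoulder, elbow
```

The `acos` branch picks one of the two mirror solutions; flipping the sign of `elbow` (and the second `atan2` term) would give the other, which matters when avoiding obstacles near the treadmill.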