Raunak’s Status Report for 4/26

This week, I helped tune some of the timing delays and add a new servo. I finished writing the report and added information about testing each subcomponent as well as the overall system. I also finalized the bill of materials and Gantt chart.

We are ready for our demo next week to demonstrate our working robotic arm box sorter. We will also try to get some initial feedback on our report.

We did extensive unit testing of our computer vision QR tracker and the robotic arm. Our tests covered the reliability of QR code detection, the speed of the robotic arm, and the reliability of placing packages in the correct location, along with end-to-end testing of the entire system. We ran each test about 20 times to ensure that our system was reliable.
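
The per-test tallying is simple enough to sketch. This is illustrative rather than our exact test code; `run_trial` is a stand-in for any one of our pass/fail tests:

```python
from typing import Callable

def reliability(run_trial: Callable[[], bool], trials: int = 20) -> float:
    """Run one pass/fail trial repeatedly and return the success rate."""
    successes = sum(1 for _ in range(trials) if run_trial())
    return successes / trials

# Example with a stand-in trial; in practice run_trial would perform one
# QR-detection or pick-up attempt and report whether it succeeded.
rate = reliability(lambda: True, trials=20)
```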

Raunak’s Status Report for 4/19

This week, I helped with the end-to-end testing to make sure that we are ready for the demo. With the rest of the team, I helped verify that the arm was able to pick up boxes off the treadmill at 0.5 mph and place them in the correct bin. I helped time the runs, and we found that the arm was able to sort at 10 boxes per minute, so we were happy with that. Other than that, I continued making progress on the final report by adding the system implementation, ethics, and summary sections.

Next week, we will focus on the presentation. We are planning to demo the robotic arm, so we need to make sure that it goes smoothly. Once that’s done, we will work on the final poster, video, and report for the following week. We are pretty much done in terms of implementation and are mostly looking to wrap things up well.

Raunak’s Status Report for 4/12

This week, I worked on integration testing with the QR code scanner system and the robotic arm. We realized that the robotic arm was sometimes too slow and sometimes too fast at picking up the packages, so we had to make sure that it was reliable. After tweaking the code, I tested the robotic arm’s pick-up reliability: 9 times out of 10, it was able to pick up the moving box. In addition to this, I worked extensively on the final report. We received a lot of constructive feedback on our midway reports, so I wanted to start early on the final report to work on the areas that need improvement. I have written about half of the report so far, including the introduction, use-case requirements, and design and implementation sections.

Next week, we plan on doing more testing with the entire system. Specifically, we 3D printed a few boxes of different heights and we want to verify that the system picks up all three boxes reliably. I will also continue working on the final report, taking the feedback from the midway report into account.

Our testing mostly consists of integration tests with the entire system. This involves running an end-to-end test by placing a box on the treadmill and seeing if the robotic arm is able to pick up the box correctly. We prefer this over software-only tests because it tells us more about what we need to improve. Our measured result is the number of trials in which the box is picked up successfully. For example, we run the end-to-end process 10 times and count how many times the robotic arm does its job. Initially, it was only getting it right 1 or 2 times out of 10, but now it gets it right pretty much every time.

Raunak’s Status Report for 3/29

This week, I worked on integrating the QR code detection algorithm with the robotic arm. Previously, we had to send commands manually to tell the robotic arm when to pick up a box. But this week, we hooked the QR code system up to the arm, so now it picks up the box when the QR code is detected. I worked on debugging some of the code that was causing the arm to be too delayed in picking up the box. I also worked with the rest of the team to test our system with real boxes moving on the treadmill. We were able to get the arm to reliably pick up moving boxes based on the QR code detection.

Next week, I plan on working on getting the robotic arm to place boxes in different containers based on the QR code. Currently, the robotic arm places all objects in the same location. So, I need to add a bit of logic to tell the arm to place each box in a different location depending on its category. Once that’s done, all of our main goals for this project should be covered. I also want to get a good chunk of the final report drafted since we have a good sense of what we need to include at this point.
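
The logic I have in mind is essentially a lookup from QR payload to drop-off position. The category names and coordinates below are hypothetical placeholders, not our actual calibrated values:

```python
# Hypothetical drop-off coordinates (x, y, z in cm) for each QR category.
BIN_LOCATIONS = {
    "fragile":   (20.0, -15.0, 5.0),
    "standard":  (20.0,   0.0, 5.0),
    "oversized": (20.0,  15.0, 5.0),
}

def drop_target(qr_payload: str):
    """Map a decoded QR payload to the arm's drop-off coordinates,
    falling back to the standard bin for unrecognized payloads."""
    return BIN_LOCATIONS.get(qr_payload, BIN_LOCATIONS["standard"])
```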

Raunak’s Status Report for 3/22

This week, we made huge progress on combining all of the working parts together. Namely, the robotic arm is now able to pick up objects off of the conveyor belt, which is the MVP for our project (see Slack for the video). My contribution to this mainly involved writing Arduino code that controls the arm. This was the piece of the code that enabled our arm to pick up and drop off boxes in their bins. I also continued working on some of the extra features I wanted to add from last week (the ML object detection code) in parallel.

While our robotic arm picks up and drops off boxes correctly, the entire process of picking up and dropping a box takes about 15-20 seconds. We want to increase the speed of this overall process and bring it down to ~9 seconds. We will also have to test that the arm doesn’t make more mistakes because of the increased speed. Additionally, we want to connect the camera system to the robotic arm so that we can pick up objects with the QR code location detection.

Raunak’s Status Report for 3/15

This week I helped with integrating the camera system with the treadmill and verifying that QR codes could be scanned. We tested the camera at varying heights and positions on the treadmill and marked the region that we found was the best for positioning (far enough from the head of the treadmill to place boxes and high enough to see the entire width of the treadmill).

Since we have completed a lot of the kinematics software already, I wanted to add an ML component to the project, so I decided to start working on a system that detects objects on a treadmill. My idea was that instead of QR codes labelling the category of each box, which can be tedious in real life, we could have the algorithm determine the object using CV. This is a reach goal for our project because getting an accurate classifier for a variety of objects and integrating it into the system is a difficult task. Nevertheless, this week I started coding up the classification algorithm using PyTorch and Torchvision.


Raunak’s Status Report for 3/8/25

This week, we got started setting up the computer vision system. Our camera was delivered and we verified that it was able to scan and track QR codes, which is essential to our system working. We also finished testing the kinematic equation solver and verified that it is working completely (via matplotlib simulation). In addition to all of this, I spent a lot of time on the design report, making sure we included a significant amount of detail for each part. I focused on the Abstract, Introduction, Use-Case Requirements and Architecture portions of the report.

Next week I want to start simulating the software on the robotic arm and the camera. For instance, we want to get the kinematic equations to work with the robotic arm and see if it is able to pick up objects. We also want to get the camera working with the CV software and be able to take the scanned QR codes, convert them to numbers and send this information back to the robotic arm so that it knows where to place the box.

Raunak’s Status Report for 2/22

This week I continued working on the software for our robotic arm. I tested the kinematic equations to make sure they were working as expected. To do this, I created a bunch of test cases for robotic arm positions and their corresponding angles and verified that our equations solved them correctly. Other than that, I started to work on the design report, adding information from our slide deck as well as the feedback we received. We also ordered and received the camera that we will be using for the box localization, which will be a major part of the project.

Next week, I plan to help set up the camera and integrate it into our system. We want to ensure that images arrive at our computer correctly so that we can eventually pipe them into the CV algorithm. Other than that, if we have time, I plan on working more on the CV code and testing it with the images we get from our camera. This will serve as a sort of integration test for the entire CV system to ensure that it is working as expected. I also plan on helping finish the design report due next week.

Raunak’s Status Report for 2/15

This week, I made a lot of progress on the software aspects of the project. Specifically, we wrote the basic forward and inverse kinematic equations in Python to derive the positions and angles the robotic arm needs to be in for a particular object. This is the heart of the software that controls the robotic arm, so finishing the basic code for it this week was a huge accomplishment. I also worked on the design presentation with Marcus for next week, making the suggested improvements from the proposal presentation. In particular, we worked on making the slides less text-heavy, including more diagrams, and formatting the slides better.
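
The equations are along these lines. This is a minimal sketch assuming a 2-link planar arm; the link lengths are illustrative, and the real arm’s geometry and joint count differ:

```python
import math

L1, L2 = 10.0, 10.0  # link lengths in cm (illustrative values)

def forward(theta1: float, theta2: float):
    """End-effector (x, y) from joint angles in radians."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

def inverse(x: float, y: float):
    """Joint angles (one of the two solution branches) that reach (x, y)."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2
```

A quick sanity check is the round trip: `forward(*inverse(12.0, 5.0))` should recover (12.0, 5.0) up to floating-point error.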

Next week, I plan on simulating the kinematic equation solver in Python to verify that it works as expected. Matplotlib has plotting tools that can serve as a simple simulator for this. We basically just need to verify that the angles and position of the robotic arm are as expected. We also might get started on the CV aspect of this project, which involves recognizing QR codes and telling the robotic arms what to do for each category of QR code.
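
The kind of visual check I have in mind looks roughly like this, again assuming a 2-link planar arm with illustrative link lengths and a sample pose:

```python
import math

import matplotlib
matplotlib.use("Agg")  # render off-screen so this also runs headless
import matplotlib.pyplot as plt

L1, L2 = 10.0, 10.0        # illustrative link lengths (cm)
theta1, theta2 = 0.6, 0.8  # a sample joint configuration (radians)

# Joint positions from the forward kinematics.
x1, y1 = L1 * math.cos(theta1), L1 * math.sin(theta1)
x2 = x1 + L2 * math.cos(theta1 + theta2)
y2 = y1 + L2 * math.sin(theta1 + theta2)

fig, ax = plt.subplots()
ax.plot([0, x1, x2], [0, y1, y2], "o-")  # base -> elbow -> end effector
ax.set_aspect("equal")
ax.set_title("2-link arm pose")
fig.savefig("arm_pose.png")
```

Eyeballing a few of these rendered poses against hand-computed positions is the basic verification step.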

Raunak’s Status Report for 2/8/25

This week I was responsible for presenting the project proposal, so I focused a lot on that. I spent a significant amount of time researching various aspects of the project, including the software we plan on using for the forward and inverse kinematic equations for the arm movement as well as the computer vision models for QR code detection. For instance, I found this very useful blog post on the implementation of kinematics in C++ (https://medium.com/geekculture/inverse-kinematics-solver-in-c-e999f1b7f353), which is what we plan on working with for the project. I then practiced presenting this information concisely for the proposal presentation. After the presentation, I thought about and documented some of the feedback we received. One critical piece of feedback I documented was that we need to think about whether we want our boxes to have variable or fixed height, which we will have to take into consideration for the kinematic equations. I discussed this with the team, and we believe that our scope will be fixed-height boxes, as our focus is going to be on getting the robotic arm to work correctly.

Next week we want to begin programming the kinematic equation solver. We believe that this will be the most challenging aspect of the software, so we want to begin as early as we can. Ideally, for next week, we hope to have the forward kinematic solver completed and unit tested. I plan on working with Matt on this implementation and testing the forward solver as well. This won’t be a final test for the functionality as unit-testing can only do so much, but I think this will be a solid first step toward the software aspect of this project. If we get extra time outside of this, we might also start looking into QR code data sets that we can use in our CV model.