Raymond Ngo’s Status Report 4/30

This was a slower week work-wise, focused more on final testing and the final presentation. Unfortunately, we have slipped behind schedule due to last-minute changes that arose from the difficulty of integration and the environment. For example, after receiving feedback we spent time looking at grills with flat bottoms, we looked at different types of grills (or whether we should use a grill at all), and we considered different power supplies because of the demo environment.

For my part, much of the week was spent integrating the robotic arm with the computer vision, which required many discussions with Joe and a lot of trial and error. I also completed my battery of tests, but have not yet done integrated testing with all components working together.

Jasper Lessiohadi’s Status Report for 4/23

This week I was not able to do much because I got Covid, and most of what I needed to do had to be done in person in the lab. Luckily, the rest of my team was able to pick up the slack and integrate several parts of our project together. What I did do, though, was start work on the slides for our final presentation. We have made good progress on them and will put on the finishing touches tomorrow.

Team Status Report for 4/23

This week, the team ran into a few roadblocks. Jasper got Covid, making it difficult to integrate all the parts together, and the Jetson Xavier we were using broke down. Fortunately, after a few days, we were able to reflash the Xavier and get it up and running again. We lost a bit of progress, but we made up for it over the course of the week. Additionally, despite some difficulties, we successfully integrated the meat-detecting computer vision algorithm with the UI. The inverse kinematics for the robotic arm are also going well, allowing the arm to move through the desired range of motion. Finally, the team has been making good headway on the slides for the final demo.

We are now working on more extensive testing to make sure all the parts are working as smoothly as planned. As was expected, some aspects are not quite as polished as we would like for the final product, but that was something we planned and made time for in our schedule. We are currently right on track with where we need to be and we are optimistic about our project as a whole.

Raymond’s Status Report 4/23

Unfortunately, Jasper was out this week due to Covid, but with his input the integration between the UI and the computer vision, as well as between the computer vision and the cooking timer, is complete. The integration progress we hoped to make was delayed because the Xavier broke halfway into the week, requiring several days and a reflash to fix. Some progress was lost as a result, but we have since made it up with additional work.

Furthermore, testing has begun on my computer vision modules for the final presentation. Most of the metrics look fine; however, the edge detection might be a bit off in its predictions, probably due to camera quality factors that are hard to resolve. This is unfortunately one tradeoff that cannot be avoided, since OpenCV has a max resolution and any increase in image resolution severely degrades everything else in the CV pipeline. The solution is to increase the brightness even more, since there is only so much an edge detection algorithm can do when noise is present.
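For reference, a minimal sketch of that brighten-then-detect idea is below, assuming an OpenCV pipeline; the brightness boost and Canny thresholds are placeholder values, not the tuned numbers from our actual modules.

```python
import cv2

def detect_edges(frame, brightness_boost=40):
    """Illustrative only: brighten, denoise, then run Canny edge detection."""
    # Raise brightness; convertScaleAbs clamps the result to [0, 255].
    brightened = cv2.convertScaleAbs(frame, alpha=1.0, beta=brightness_boost)

    # Convert to grayscale and blur lightly to suppress sensor noise.
    gray = cv2.cvtColor(brightened, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)

    # Canny edge detection; noisy, low-resolution frames need forgiving thresholds.
    return cv2.Canny(blurred, 50, 150)
```

Blurring before Canny is the usual way to keep noise from producing spurious edges, and boosting brightness first helps keep the real edges above the detection thresholds.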

Jasper Lessiohadi’s Status Report for 4/16

This week, we tried to get all the major components of our project integrated together. We were able to get the Jetson Xavier up and running with the blob detection and the UI running separately, but we had trouble putting the two together. For some reason, we were able to share our internet via an ethernet cable early in this process, but it suddenly stopped working. Because of this, we were unable to actually put everything together. We eventually managed to connect to the internet through the CMU wifi router (again using an ethernet cable), but since it was so late in the week, I did not have time to come in and finish the integration. I plan to do this next week, as I will have much more time to come in and do work. Beyond that, I have everything set up on the UI side of things, so once we have it all combined, we will be in a really good spot.

Team Status Report for 4/16

This week, we began planning our overall system integration and the specifics of the KBBQ environment. Joseph has been working on improving the inverse kinematics of the robotic arm, and on fixing some electrical and mechanical issues the arm had. Raymond has continued to add images for the CV algorithm but has yet to connect all cameras to the Jetson AGX Xavier. Jasper has continued to work on the UI of the system. We have also bought a wifi router to get wifi working on the Jetson AGX Xavier.

Joseph Jang’s Status Report for 4/16

I have fixed the mechanical issues with the robotic arm by tightening the screws and filling the gaps between the servo motor arm and the plastic/metal arm, but I have run into some software and electrical issues. Sometimes the current draw of the servo motors spikes when the motors are under stall torque, causing the supply voltage to drop. This leads the motors to stutter or sometimes not move at all. While I don’t think this is a problem with the servo motor controller board (PCA9685), it could be an issue with how much current the lead-acid battery can supply. I am holding off on buying a lithium-ion battery capable of a higher current draw to power the servo motor controller board, because this voltage drop and current spike only appeared after multiple hours of use of the lead-acid battery, so the issue might simply be that the battery is not charged enough.
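One software-side mitigation worth noting, sketched below under the assumption that the PCA9685 is driven through the adafruit-servokit Python library, is to ramp commanded angles in small steps instead of jumping straight to the target, so the servos are less likely to hit stall-level current draw all at once; the channel number, step size, and delay are placeholders rather than our actual settings.

```python
import time

from adafruit_servokit import ServoKit  # common Python driver for the PCA9685

kit = ServoKit(channels=16)

def move_servo_gently(channel, target_angle, step=2, delay=0.02):
    """Ramp a servo toward target_angle in small increments.

    Smaller jumps mean a gentler load, which should soften the current
    spikes seen when a motor jerks from rest toward a far-away angle.
    """
    current = kit.servo[channel].angle
    if current is None:  # angle is unknown until it has been commanded once
        current = target_angle
    direction = 1 if target_angle >= current else -1
    for angle in range(int(current), int(target_angle), direction * step):
        kit.servo[channel].angle = angle
        time.sleep(delay)
    kit.servo[channel].angle = target_angle
```

This does not raise the battery’s current limit, of course; it just spreads the demand out so the arm asks for peak current less often.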

However, for the inverse kinematics, the accuracy of the system is still sometimes off by a couple of inches near the bounds of the robot’s range of motion. The MATLAB library seems to be more complicated than I need, so I have also tried a Python inverse kinematics library called IKPy. This requires me to make a URDF file, the robot description format commonly used in ROS applications. In both approaches I am getting the angles I expect, but I have yet to test the IKPy library on the physical robotic arm. I am hoping that the use of ROS and the URDF file, which is very standardized, can help me achieve better accuracy and precision. I also think I might have to simulate the entire KBBQ robot environment (dishes, grill, etc.), which will be easier with ROS.
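For context, this is roughly how IKPy gets wired up once the URDF exists; the file name and target coordinates below are placeholders, not our actual arm description.

```python
import numpy as np
from ikpy.chain import Chain

# Build the kinematic chain from a URDF description of the arm.
# "kbbq_arm.urdf" is a placeholder file name.
arm_chain = Chain.from_urdf_file("kbbq_arm.urdf")

# Target end-effector position in meters, in the base frame (placeholder values).
target = [0.15, 0.10, 0.05]

# Solve for the joint angles (in radians) that reach the target.
joint_angles = arm_chain.inverse_kinematics(target)

# Sanity check: run forward kinematics and measure the residual error.
reached = arm_chain.forward_kinematics(joint_angles)[:3, 3]
print("residual error (m):", np.linalg.norm(np.array(target) - reached))
```

Comparing the forward-kinematics result against the target like this is also a quick way to tell whether the couple-of-inches error near the edge of the workspace comes from the solver or from the hardware.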

In other minor updates, I have measured, planned out, and set the locations of the dishes, grill, robot, etc. This way, the robot will have an easier time being precise within its range of motion.

Raymond’s Status Report 4/16

I updated my dataset with more images, including hard-to-find images of arms over the grill, in response to feedback from the ethics class about potential harm from the project. The newly created network works. The metrics might need to be adjusted a bit in the final report, however, since the category for anything that is not meat is so broad, and the dataset so limited, that there will be limits on accuracy. I will see what I can do.

Jasper and I got the ethernet working; however, with the revelation that the final demo will take place in the UC, orders have been placed for both a router and a wifi card. We placed both orders to give ourselves flexibility depending on how things turn out.

I have begun integrating the cooking time algorithm with the computer vision. I will need Jasper to integrate the user interface. Unfortunately, Joseph not completing his work in time for integration has bottlenecked progress, and he will need to step up for my part of the scheduled tasks to be completed.

Joseph Jang’s Status Report 4/10

This week I continued to work on the inverse kinematics of the robot. To help debug the issue, I disassembled parts of the robot to make sure each joint turns to the proper angle, and I found several electrical and embedded software bugs. I do not think the inverse kinematics issue is caused by MATLAB’s implementation of the IK solver; however, I will also be making use of the ROS IK libraries in Python. Another issue I found is that the stepper motor heats up rather quickly even when it is not in action: because it stays energized even when no commands are being sent to it, it heats up quickly. Therefore, I made use of the driver’s enable pin, so that the base of the robot stays in place but the motor is not energized when it is unnecessary. That way the stepper motor will not be overloaded and burned out. On several occasions the motor had become very hot, so this was a proper risk-mitigation fix.
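As a sketch of that enable-pin fix, assuming the Jetson.GPIO library and a stepper driver with an active-low enable input (the pin number below is a placeholder, not our actual wiring):

```python
import Jetson.GPIO as GPIO

ENABLE_PIN = 18  # placeholder board pin number

GPIO.setmode(GPIO.BOARD)
GPIO.setup(ENABLE_PIN, GPIO.OUT, initial=GPIO.HIGH)  # start with the driver disabled

def stepper_enabled(enabled):
    """Drive the stepper driver's enable pin.

    Assumes an active-low enable input, as on many driver boards:
    LOW energizes the coils, HIGH lets the motor idle and cool down.
    """
    GPIO.output(ENABLE_PIN, GPIO.LOW if enabled else GPIO.HIGH)

# Only energize the coils around an actual move.
stepper_enabled(True)
# ... send step pulses here ...
stepper_enabled(False)
```

Pulling the pin high between moves is what lets the motor cool off instead of sitting at full holding current.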