Jasper Lessiohadi’s Status Report for 4/23

This week I was not able to do much because I got Covid, and most of what I needed to do had to be done in person in the lab. Luckily, the rest of my team picked up the slack and integrated several parts of our project. What I did do, though, was start work on the slides for our final presentation. We have made good progress on them and will put on the finishing touches tomorrow.

Team Status Report for 4/23

This week, the team ran into a few roadblocks. Jasper got Covid, making it difficult to integrate all the parts together, and the Jetson Xavier we were using broke down. Fortunately, after a few days, we were able to flash the Xavier and get it up and running again. We lost a bit of progress, but we made up for it over the course of the week. Additionally, despite some difficulties, we successfully integrated the computer vision algorithm that detects meat with the UI. Work on the inverse kinematics for the robotic arm is also going well, allowing the arm to move with the desired range of motion. Finally, the team has been making good headway on the slides for the final demo.

We are now working on more extensive testing to make sure all the parts work as smoothly as planned. As expected, some aspects are not quite as polished as we would like for the final product, but we planned and made time for that in our schedule. We are currently right on track with where we need to be, and we are optimistic about the project as a whole.

Jasper Lessiohadi’s Status Report for 4/16

This week, I tried to get all the major components of our project integrated. We got the Jetson Xavier up and running with the blob detection and the UI running separately, but we had trouble putting the two together. For some reason, we were able to share our internet over an Ethernet cable early in this process, but it suddenly stopped working, so we could not actually put everything together. We eventually managed to connect to the internet through the CMU WiFi router (again over an Ethernet cable), but since it was so late in the week, I did not have time to come in and finish the integration. I plan to do this next week, as I will have much more time to come in and work. Besides that, I have everything set up on the UI side, so once it is all combined, we will be in a really good spot.

Jasper Lessiohadi’s Status Report 4/10

For our interim demo last Wednesday, I worked on the UI a bit more. It was not exactly where it needed to be, but I will work on that and integrating all of the pieces of our project in the next few days. Besides that, since we just had Carnival, I have made no other progress.

Jasper Lessiohadi’s Status Report 4/2

As I wrote in the team status report, this week I finished work on the UI and the cooking-time algorithm. The UI is hopefully clear enough to provide a clean and intuitive experience for the user. It displays the detected thickness of a new piece of meat, how long the system decides it needs to cook, and which section of the grill it will go on. There is also an overhead camera feed of the grill that labels each section, letting the user confirm that the system is working as intended. I plan to add the ability to change the remaining cooking time for any section the user selects, but I do not have that working yet.
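The report does not spell out how the cooking-time algorithm maps detected thickness to a cook time, so the following is only a minimal sketch of the idea; the meat names, rates, and constants are placeholder assumptions, not values from our system.

```python
# Hypothetical sketch of a thickness-based cooking-time estimate.
# All per-meat rates and base times below are made-up placeholders.

def estimate_cook_seconds(thickness_mm: float, meat_type: str = "beef") -> int:
    """Estimate a cook time (seconds) from the detected thickness."""
    seconds_per_mm = {"beef": 12, "chicken": 20, "pork": 16}  # assumed rates
    base_seconds = {"beef": 60, "chicken": 120, "pork": 90}   # assumed minimums
    rate = seconds_per_mm.get(meat_type, 15)
    return base_seconds.get(meat_type, 90) + int(rate * thickness_mm)

print(estimate_cook_seconds(20, "beef"))  # 60 + 12*20 = 300
```

A lookup like this would let the UI show the remaining time per grill section and adjust it when the user overrides a value.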

Team Status Report 4/2

This week, the team has been diligently working to be ready for our project demo on Wednesday 4/6. We seem to be in a very good spot, with all of our subsystems at a presentable point.

Joseph can demonstrate the robotic arm we plan to use to flip and transport the meats being cooked. Its inverse kinematics have been difficult to implement, but at minimum we can show that it moves with the range of motion we need for our end goal. We have insulated the wiring and other heat-sensitive parts so they will be protected from the high temperatures of the stove. We tested this with a soldering iron, which had no effect on the arm's more delicate internals.
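The report does not describe the arm's actual geometry or solver, but the core of an inverse-kinematics computation for a simplified two-link planar arm can be sketched as follows; the link lengths and the elbow-down convention here are assumptions for illustration only.

```python
import math

# Hypothetical two-link planar IK sketch (not the real arm's solver).
# Given a target (x, y) and link lengths l1, l2, solve the joint angles
# with the law of cosines, choosing the elbow-down solution.

def ik_two_link(x: float, y: float, l1: float, l2: float):
    """Return (shoulder, elbow) joint angles in radians reaching (x, y)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)  # elbow-down branch
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

A quick sanity check is to run the angles back through forward kinematics (`x = l1*cos(s) + l2*cos(s+e)`, similarly for `y`) and confirm the original target comes back out.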

Raymond has blob detection working, with the CV able to detect meats with decent accuracy. He has switched from image classification to object recognition to handle scenarios where a user places two different types of meat on a plate. He has also tuned the parameters of the original blob detection algorithm so that it is better at identifying meats rather than other objects in the scene, and so that it performs better in low-light conditions.
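Raymond's actual detector and its tuned parameters are not shown in this report, but the essence of blob detection, thresholding followed by connected-component grouping with an area filter, can be sketched in plain Python; the threshold and minimum-area values below are illustrative placeholders, not his tuned numbers.

```python
from collections import deque

# Illustrative threshold-plus-flood-fill blob detector. The real pipeline
# and its tuned parameters differ; these numbers are placeholders.

def find_blobs(gray, threshold=128, min_area=3):
    """Return the areas of bright connected regions in a 2D grayscale image."""
    h, w = len(gray), len(gray[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for sy in range(h):
        for sx in range(w):
            if seen[sy][sx] or gray[sy][sx] < threshold:
                continue
            # Flood-fill the 4-connected bright region starting here.
            area, queue = 0, deque([(sy, sx)])
            seen[sy][sx] = True
            while queue:
                y, x = queue.popleft()
                area += 1
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                            and gray[ny][nx] >= threshold):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if area >= min_area:  # area filter, like a tuned min blob size
                blobs.append(area)
    return blobs
```

Tuning in practice means adjusting the threshold (which helps in low light) and the size filter so that meat-sized regions pass while glare spots and utensils are rejected.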

Jasper has finished work on the UI and the cooking-time algorithm. The UI is hopefully clear enough to provide a clean and intuitive experience for the user. It displays the detected thickness of a new piece of meat, how long the system decides it needs to cook, and which section of the grill it will go on. There is also an overhead camera feed of the grill that labels each section, letting the user confirm that the system is working as intended. We plan to add the ability to change the remaining cooking time for any section the user selects, but we do not have that working yet.

Overall, the project is keeping up with the proposed schedule from earlier in the semester, and the team has made great progress.

Jasper Lessiohadi’s Status Report for 3/26

This week, progress was a bit slow due to lots of work from other classes and other interference from life. Additionally, I am not as familiar with Python as I previously thought. I have been using the Python library tkinter and have something basic working, but it still doesn't look great. I am confident that it will be fine by the time we have our interim demo, though. I will have to put in a bit of extra work this upcoming week, but it is nothing I won't be able to handle.

Jasper Lessiohadi’s Status Report for 3/19

We received all of the supplies we requested, so we can now see how the parts fit together. I am still working on the UI because I am not yet satisfied with how everything is arranged. I am playing around with button shapes and locations, and with how best to show the sections of the grill through the camera feed. Getting the camera feed itself to show up has been a bit of an issue as well, but I do not think it will be too complicated to fix. I think the main issue will be having the UI, robotic arm, and CV interact correctly with each other, but we will make it work by the time the demo comes around.
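One piece of the section-overlay problem can be sketched independently of the toolkit: dividing the overhead frame into a fixed grid and mapping a pixel to its grill-section index. The 3x2 layout and frame size below are assumptions for illustration, not the UI's actual configuration.

```python
# Hypothetical helper for the grill-section overlay: map a pixel in the
# overhead camera frame to a 0-based section index. The 3x2 grid and
# 640x480 frame here are assumed, not taken from the real UI.

def section_of(x: int, y: int, frame_w: int, frame_h: int,
               cols: int = 3, rows: int = 2) -> int:
    """Return the index of the grill section containing pixel (x, y)."""
    col = min(x * cols // frame_w, cols - 1)  # clamp edge pixels
    row = min(y * rows // frame_h, rows - 1)
    return row * cols + col

print(section_of(50, 50, 640, 480))    # top-left pixel -> section 0
print(section_of(630, 470, 640, 480))  # bottom-right pixel -> section 5
```

The same grid boundaries can be drawn over the camera feed so that the section the mapping reports is the one the user sees outlined.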