Kobe Individual Status Report 3/23

This week I was able to create the stereo camera to YOLOv8 detections pipeline. Specifically, I made an Isaac ROS node responsible for interfacing with the camera using OpenCV. I translated the OpenCV frames into ROS 2 image messages via cv_bridge, published them, and the YOLOv8 TensorRT node ran object detection on those messages. The visualizer node showed us that the object detection was working well. Casper and I are currently creating a central control node that will take the results from object detection and other sensors and use them to coordinate the behaviors of the hexapod. A lot of the time this week was spent debugging and finding alternatives to deprecated packages, incompatible libraries, etc.
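For reference, here is a minimal sketch of the camera-publishing side of the pipeline (capture with OpenCV, convert with cv_bridge, publish as a ROS 2 Image). The node and topic names, camera index, and frame rate below are placeholders, not the actual values from our stack:

```python
import cv2
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge


class StereoCameraNode(Node):
    def __init__(self):
        super().__init__('stereo_camera_node')  # placeholder node name
        self.publisher = self.create_publisher(Image, '/image_raw', 10)  # placeholder topic
        self.bridge = CvBridge()
        self.capture = cv2.VideoCapture(0)  # assumed camera index
        # Publish at roughly 30 Hz; adjust to match the camera's actual frame rate.
        self.timer = self.create_timer(1.0 / 30.0, self.publish_frame)

    def publish_frame(self):
        ok, frame = self.capture.read()
        if not ok:
            self.get_logger().warn('Failed to read frame from camera')
            return
        # Convert the OpenCV BGR frame into a ROS 2 Image message via cv_bridge.
        msg = self.bridge.cv2_to_imgmsg(frame, encoding='bgr8')
        msg.header.stamp = self.get_clock().now().to_msg()
        self.publisher.publish(msg)


def main():
    rclpy.init()
    node = StereoCameraNode()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

The YOLOv8 TensorRT node then subscribes to the published image topic and runs detection, with the visualizer node used to confirm the detections downstream.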

Now that the object detection pipeline is working, I'm closer to being on schedule, but still a bit behind since we should be implementing the search algorithm right now. This shouldn't be too big a hurdle to overcome since we are starting with a simple search algorithm. Over the next week I'm hoping to get the hexapod to move toward detected search targets, such as a human.
