Kobe Individual Report 3/16

This week I worked extensively with our camera and our Isaac ROS environment. I first focused on interfacing with our USB-connected stereo camera: I found which video input channels gave us the visual and depth streams, and then tried to use GStreamer to build a video pipeline for YOLOv8 detection. After consulting with professors who have used GStreamer before, I decided to switch instead to a V4L2-backed OpenCV approach, where a callback captures image frames at a fixed frequency and publishes them to the topic. I created a separate package for camera image publishing in our ROS environment, both to further modularize our system and to let us use C++ for a speedup. Once I understood how the USB camera interfacing works, I converted it into a C++ node that communicates with the YOLOv8 node over the images topic.
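For reference, the capture-and-publish pattern looks roughly like the sketch below. It is written in Python for brevity (our actual node is C++), and the device index, topic name, and 10 Hz rate are placeholders rather than our real configuration.

import cv2
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

class CameraPublisher(Node):
    def __init__(self):
        super().__init__('camera_publisher')
        # Open the camera through the V4L2 backend; device index 0 is a placeholder.
        self.cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
        self.pub = self.create_publisher(Image, 'images', 10)
        self.bridge = CvBridge()
        # Timer callback fires at a fixed frequency (10 Hz here) to grab frames.
        self.timer = self.create_timer(0.1, self.capture_frame)

    def capture_frame(self):
        ok, frame = self.cap.read()
        if ok:
            self.pub.publish(self.bridge.cv2_to_imgmsg(frame, encoding='bgr8'))

def main():
    rclpy.init()
    rclpy.spin(CameraPublisher())
    rclpy.shutdown()

if __name__ == '__main__':
    main()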

Our progress is admittedly behind because we went down several paths before figuring out the best way to set up our system. While we are behind, we have plans to catch back up by pivoting on some design choices. By next week I hope to have our Jetson’s ROS environment designed with nodes for each of our subsystems, and I hope to have our object detection fully working.

 

Casper’s Individual Status Report Mar 16 2024

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

I encountered a few blockers this week, which unfortunately meant I did not get as much done as I wanted. Since we are still finalizing the controls for the robot, I decided to hold off on implementing SLAM. Instead, I have been helping Akash tune the hexapods with the ROS library that we found, but we have decided to pivot because the library does not seem very reliable. I have also been learning about Docker containers and ROS, since I am not very familiar with them, by completing assignments from relevant robotics courses on my own Jetson Nano.

Additionally, since we are using stereo cameras on our robots now, I designed and printed some retrofit mounts for the camera, which can be seen in the photos.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I think we are falling a bit behind schedule, as we will need to implement SLAM fairly soon. However, we have redefined our MVP to be simpler, such that we do not strictly need SLAM for the product to work; this means the Medbot will not be able to track other bots. We will still endeavour to have SLAM done by the demo.

What deliverables do you hope to complete in the next week?

This week, I will help Akash finalize the controls algorithm, which involves redesigning the Freenove library. Once that is complete, I can begin working on SLAM.

Once we decide on the power bank / battery pack for the Jetson, I will also design a mount for that.

Team Status Report 3/16/24

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

One very significant risk is that our scope might be too large. Getting the controls, object detection, and SLAM working on a single robot is a larger feat than we imagined. In particular, it will be difficult to fully implement SLAM in our system. We are managing this risk by talking with our autonomous-robotics professor, who is experienced with the Jetson Orin Nano and Isaac ROS. For example, he guided us through the setup of the object detection library and told us about the need for an SSD. Depending on how much time remains before our final submission, the contingency plan is to scale down our localization and use AprilTags in a small, defined environment rather than a generalizable search algorithm that works anywhere.

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

Not a change as such, but we have realized that we need the Raspberry Pi and cannot eliminate it, because the libraries that could have replaced it do not work effectively. This realization came after we had spent significant time working towards eliminating the Pi, and so it was a costly mistake in terms of time. However, since we already possess three Pis, using them requires no extra expenditure.

 

Akash’s Individual Status Report 3/16/24

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the week (12+ hours).

After spending many hours trying to get the ROS control library to work (too many, including throughout spring break), I got it completely working. However, we soon realized that it was not good enough for our control requirements: it was jittery and buggy, because the library was made as a hobbyist project by a full-time SWE. This was devastating, but we concluded that it would be better to refactor the existing Freenove library and build our controls on it, since we know that library works.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I would say I am behind because of the weeks lost chasing the faulty ROS library. This week I have to compensate by working hard to get the existing control library running with a server-client architecture, and also start on the SLAM aspect, since the demo date is coming up very soon!

What deliverables do you hope to complete in the next week?

Get the controls library completely functional, and work with Casper to set up the SLAM ROS library. A rough sketch of the server-client split we have in mind follows.
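As a rough sketch only: a small TCP server on the Pi wraps the refactored gait code, while the Jetson connects as a client and sends simple text commands. The Control object, its run_gait() and stop() methods, and the port are all hypothetical placeholders, not the real Freenove API.

import socket

def serve(control, host='0.0.0.0', port=5000):
    # `control` stands in for the refactored Freenove gait object (hypothetical API).
    # Accept one client (the Jetson) and execute newline-delimited commands.
    with socket.create_server((host, port)) as server:
        conn, _ = server.accept()
        with conn, conn.makefile() as stream:
            for line in stream:
                cmd = line.split()                     # e.g. "walk forward 5"
                if cmd and cmd[0] == 'walk':
                    control.run_gait(cmd[1], int(cmd[2]))
                elif cmd and cmd[0] == 'stop':
                    control.stop()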

Casper’s Individual Status Report Mar 9 2024

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

Over the past week(s), I was able to deploy Isaac ROS YOLOv8 on the Jetson Orin Nano. It took a while to set up all the dependencies and to understand how to use Docker containers. Isaac ROS YOLOv8 runs within a second or so on a simple photo, which is significantly faster than YOLOv7 on the Jetson Nano.

I was also able to extend the simulated search algorithm to account for multiple hexapods, and to profile the expected speedups through multiple simulated runs. The example below shows that we should expect over a 2x speedup with three hexapods. The run on the left simulates the hexapods without optimized behaviour (walking randomly), and the run on the right simulates them with optimized behaviour (avoiding places they have previously visited); a simplified sketch of the two policies follows.
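For illustration, a minimal version of the two policies on a grid looks like the sketch below. The grid size, movement rules, and shared starting position are simplifications, not our actual simulator.

import random

MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]

def steps_to_cover(n=10, bots=3, optimized=True, seed=0):
    # Count steps until `bots` walkers jointly visit every cell of an n x n grid.
    rng = random.Random(seed)
    pos = [(0, 0)] * bots
    visited = {(0, 0)}
    steps = 0
    while len(visited) < n * n:
        steps += 1
        for i, (x, y) in enumerate(pos):
            nbrs = [(x + dx, y + dy) for dx, dy in MOVES
                    if 0 <= x + dx < n and 0 <= y + dy < n]
            if optimized:
                fresh = [p for p in nbrs if p not in visited]
                nbrs = fresh or nbrs      # prefer cells not yet visited
            pos[i] = rng.choice(nbrs)
            visited.add(pos[i])
    return steps

# Average a few runs of each policy to estimate the speedup:
# sum(steps_to_cover(optimized=False, seed=s) for s in range(20)) / 20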

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I think I am more or less on our original schedule. However, since we failed to account for the work we had to do in SLAM and map merging, I think I will need to put in more effort for the second half of the semester.

What deliverables do you hope to complete in the next week?

For this week, I will take the initiative and begin implementing the SLAM and map merging algorithms on the robots, which first involves installing the cameras on the robots and interfacing them with the Jetson Orin Nanos.

Team Status Report for Mar 9 2024

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?
The main risk we are facing right now is implementing some form of SLAM to allow our hexapods to localize themselves and determine where the other hexapods are in the area. Our contingency plan for this risk is to use the prebuilt Isaac ROS AprilTag library for pose estimation of our hexapods instead of a full SLAM implementation. We think this would be much easier to implement, though we would have to constrain our scope a bit more; a rough sketch of tag-based pose estimation is below.
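To give a rough idea of what tag-based pose estimation looks like, here is a sketch using the standalone pupil_apriltags package rather than the Isaac ROS node we would actually deploy. The image filename, camera intrinsics, and tag size are made-up placeholders; real values would come from calibration.

import cv2
from pupil_apriltags import Detector  # stand-in for the Isaac ROS AprilTag node

detector = Detector(families='tag36h11')
# 'frame.png' is a placeholder for a captured camera frame.
gray = cv2.cvtColor(cv2.imread('frame.png'), cv2.COLOR_BGR2GRAY)
# camera_params = (fx, fy, cx, cy) and tag_size (metres) are placeholder values.
detections = detector.detect(gray, estimate_tag_pose=True,
                             camera_params=(600.0, 600.0, 320.0, 240.0),
                             tag_size=0.05)
for det in detections:
    # pose_t is the tag's translation in the camera frame.
    print(det.tag_id, det.pose_t.ravel())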

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?
One of the changes made to the existing design of the system is that we are now using Isaac ROS on our Jetson Orin Nano. This is because Isaac ROS supports our YOLOv8 object detection algorithm and is already hardware accelerated through the NITROS nodes it provides. This hardware acceleration is very important for us because we want our object detection to be fast.

Part A: … with consideration of global factors. Global factors are world-wide contexts and factors, rather than only local ones. They do not necessarily represent geographic concerns. Global factors do not need to concern every single person in the entire world. Rather, these factors affect people outside of Pittsburgh, or those who are not in an academic environment, or those who are not technologically savvy, etc.
With consideration of global factors, the main “job to be done” for our design is to provide accessible search-and-rescue options that countries around the world can deploy. With the rise of natural disasters and conflicts around the globe, it is important that we are armed with appropriate, cost-effective, and scalable solutions. Our hexapod swarm will also be trained on a diverse dataset, ensuring that we can account for all kinds of people from around the world. The hexapod’s versatility and simplicity will allow it to be deployed around the world by people with limited technical ability. (Written by Kobe)

Part B: … with consideration of cultural factors. Cultural factors encompass the set of beliefs, moral values, traditions, language, and laws (or rules of behavior) held in common by a nation, a community, or other defined group of people.
With consideration of cultural factors, it is part of our shared moral belief and instinct to help those in times of need, so that we too can receive help when we are vulnerable. Our product is designed to serve this common moral value across different cultures and backgrounds. We will train the object detection models to recognize people of all cultures and backgrounds, and design the product generally so that it can be deployed in many different places. (Written by Casper)

Part C: … with consideration of environmental factors. Environmental factors are concerned with the environment as it relates to living organisms and natural resources.
With respect to environmental factors, we chose a hexapod robot so that it can traverse many different terrains; we will have to do physical testing to ascertain this. We will also account for domestic pets like dogs and cats in our training dataset so that we can identify and rescue them as well. In addition, we will test our model with dolls and humanoid lookalikes to check that it does not get confused by them. Finally, we have a fault-tolerance requirement to account for changing environments that could compromise the robots, such as water damage or collapsing infrastructure. (Written by Akash)

Kobe’s Individual Status Report 3/9/2024

I worked on understanding the YOLOv8 structure and how to use it within our Isaac ROS environment. Specifically, I looked deeper into the launch files and the scripts written for visualizing the image detection that we got working before the break. I took the Python visualization script and began translating it to C++, since we want the majority of our YOLOv8 code running in C++ for speed. This script creates a visualizer class that subscribes to the detections_output topic and registers a callback for updates on that topic, which allows the visualizer to display the bounding boxes. My next step is to write a new file creating a publisher node that takes images from a camera and publishes them to the “images” topic that our visualizer takes as input. The subscriber side looks roughly like the sketch below.
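For reference, the subscription-and-callback pattern looks roughly like this, shown in Python for brevity (the real node is the C++ translation). The exact bounding-box fields vary across vision_msgs versions, so treat the field access as illustrative.

import rclpy
from rclpy.node import Node
from vision_msgs.msg import Detection2DArray

class DetectionListener(Node):
    def __init__(self):
        super().__init__('detection_listener')
        # Callback fires whenever the YOLOv8 decoder publishes new detections.
        self.create_subscription(Detection2DArray, 'detections_output',
                                 self.on_detections, 10)

    def on_detections(self, msg):
        for det in msg.detections:
            bbox = det.bbox  # center + size, drawn over the matching image
            self.get_logger().info(f'box {bbox.size_x:.0f}x{bbox.size_y:.0f}')

def main():
    rclpy.init()
    rclpy.spin(DetectionListener())
    rclpy.shutdown()

if __name__ == '__main__':
    main()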

Akash’s Individual Status Report 3/9/2024

I mainly worked on setting up the controls aspect of the robot using a Raspberry Pi 4. I set up Ubuntu and ROS1 on it to test whether the library works. There were a lot of setbacks: we were initially using a Pi 3, which does not support Ubuntu 20.04 Desktop because it has too little RAM, and there was a bunch of extra setup because Ubuntu 20.04 Desktop isn’t supported by default on the Pi 4, but it is up and working now.

In the future we will either use it as is, or port the ROS library onto the Jetson and change the ROS drivers package to be compliant with the Jetson hardware.

Kobe Individual Report 2/24

This week I was mainly focused on setting up and benchmarking different YOLO versions on the Jetson Nano and now the Jetson Orin. The setup process for the Jetson Nano took longer because it cannot support Python 3.7 or higher with JetPack 4.6. To get around this, I did a separate setup that avoided the Ultralytics library, and I also added a virtual environment. I successfully set up YOLOv7, but when running it I found that it took 30 seconds to detect a few horses in a single image, which is concerning (a rough sketch of this kind of timing measurement appears at the end of this report). We decided to pivot and use a Jetson Orin Nano instead, which is 80x faster than the regular Jetson Nano, and I spent the rest of the time setting it up. Here is an image of horses, very cool:

Progress is a tiny bit behind due to our change to the Jetson Orin Nano, but I think it’s a necessary delay. Next week I want to get the camera and YOLOv8 operational.
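As an aside, the single-image timing mentioned above can be measured with a harness as simple as the sketch below; detect_fn is a hypothetical placeholder for whatever inference entry point is under test, not a real YOLOv7 API.

import time

def time_inference(detect_fn, image, runs=5):
    # Average wall-clock time of `runs` calls to the model under test.
    start = time.perf_counter()
    for _ in range(runs):
        detect_fn(image)
    return (time.perf_counter() - start) / runs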