Casper’s Individual Status Report Apr 6 2024

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

We made a lot of progress this past week before our interim demo! With our reflashed Jetson, Akash and I created shell scripts that automate the process of setting up our ROS environments in the Docker containers. With this code in our GitHub, we can easily set up new Jetsons with the correct dependencies for YOLOv8.

Additionally, I wrote a simple tracking algorithm that turns the robot toward detected humans and follows them up to a certain distance. After mounting the battery packs and Jetson onto the Hexapod so that it is fully mobile, I tested that this algorithm works (as can be seen in the group update video)!
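The core of the follow logic can be sketched roughly as below: center the person's bounding box in the frame, then walk forward until close enough. The names and thresholds here are illustrative stand-ins, not our exact values.

```python
# Minimal sketch of the tracking/follow logic, assuming a detected
# person's bounding box (in pixels) and an estimated distance to them.
# Thresholds below are illustrative, not the tuned values on the robot.

FRAME_WIDTH = 640        # camera image width in pixels
CENTER_TOL = 40          # pixel deadband before we bother turning
STOP_DISTANCE_M = 1.0    # stop approaching once this close to the person

def follow_command(bbox, distance_m):
    """Return (turn, forward) in {-1, 0, 1} for a person detection.

    bbox is (x_min, y_min, x_max, y_max) in pixels; distance_m is the
    estimated depth to the person in meters.
    """
    x_center = (bbox[0] + bbox[2]) / 2
    error = x_center - FRAME_WIDTH / 2

    if error < -CENTER_TOL:
        turn = -1          # person is to the left: turn left
    elif error > CENTER_TOL:
        turn = 1           # person is to the right: turn right
    else:
        turn = 0           # roughly centered

    # Walk forward only when centered and still farther than the cutoff.
    forward = 1 if (turn == 0 and distance_m > STOP_DISTANCE_M) else 0
    return turn, forward
```

In practice the bounding box comes from the YOLOv8 detections and the distance from the depth camera; the deadband keeps the hexapod from oscillating left and right around a centered target.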

What deliverables do you hope to complete in the next week?

Over this coming week, I hope to have fully implemented VSLAM on our Jetson, in order to determine whether it is feasible or not. If we decide to proceed with VSLAM, I also hope to add to the shell scripts to automate the setup for this package.

I will also help Kobe with the overall state control and search algorithm, since I wrote the simulation code for it. Additionally, I will manufacture and assemble our remaining Hexapods so that we can demonstrate swarm SAR.

Team Status Report 4/6

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

We have made good progress with the Hexapod for the interim demonstration, where it can track and follow people. However, we have yet to implement a proper search/path planning algorithm to find people in the first place, which is an important component of our project. To mitigate this risk, we have several search algorithm ideas of varying difficulty, so that we can always fall back on a simple implementation if needed. Our ideas range from using SLAM map data to track explored and unexplored areas, to simply walking straight when possible and turning in a random direction when blocked.
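The simplest fallback mentioned above can be sketched in a few lines. In practice "blocked" would come from the ultrasound sensor; here it is just a boolean, and the command names are illustrative.

```python
import random

# Sketch of the simplest fallback search: walk straight until blocked,
# then turn in a random direction. The command strings are hypothetical
# placeholders for the real movement commands sent to the hexapod.

def next_search_action(blocked, rng=random):
    """Return the next movement command for the random-walk fallback."""
    if not blocked:
        return "forward"
    # Obstacle ahead: pick a random turn and try again from there.
    return rng.choice(["turn_left", "turn_right"])
```

This gives no coverage guarantees, which is why it is the fallback rather than the plan; the SLAM-based frontier idea would replace the random choice with a turn toward unexplored space.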

For localization and mapping, we are almost finished with setting up VSLAM using live data from the RealSense cameras. However, there is still a concern that VSLAM may be too demanding, both in compute and in memory (offline map data can range from gigabytes to terabytes), so it might not be feasible. As a contingency plan, we could instead build a map from the movement commands we send, since the drift on the Hexapod robots appears to be tolerable.

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

As discussed above, we might need to remove SLAM from our system requirements if we deem it computationally infeasible. This would make the robot's localization less accurate due to drift, but would free up computing power for object detection and other algorithms.

We also made some additions and changes to our hardware, such as powering the RPi from an additional battery pack so that all the 18650s are dedicated to powering the servo motors, for longer runtime.

Provide an updated schedule if changes have occurred.

Our schedule is actively updated in our Gantt chart, which can be seen in this spreadsheet.

Now that you have some portions of your project built, and entering into the verification and validation phase of your project, provide a comprehensive update on what tests you have run or are planning to run. In particular, how will you analyze the anticipated measured results to verify your contribution to the project meets the engineering design requirements or the use case requirements?

Controls test (already done): After setting up the hardware and software for the Hexapods, we tested that the movement and controls algorithm works as intended.

Search algorithm test: We will test that the hexapods are able to navigate around a room with obstacles without colliding. We can do so by cordoning off a section of a room and adding obstacles in the robots' paths.

People detection test: We will test that our object classifier is able to detect people in a variety of poses, lighting conditions, and other environmental conditions. We will use test data that we captured ourselves or sourced online.

Mock real-world test: We will create a 5m x 5m space/room, with walls and other obstacles, as a simulation of a real-world scenario the hexapods might find themselves in. We will test that the hexapods are able to navigate through the room and find all humans (i.e., human figurines or images).

Casper’s Individual Status Report Mar 30 2024

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

As seen in our group update, we unfortunately had our SD card corrupted, so we had to reset our Jetson OS and environment for running everything. Fortunately, we had most of our code pushed to GitHub, so we only had to make minor recoveries to the code.

Before the corruption, I was able to set up the Intel RealSense camera and create a Python script to read color and depth images from it. We decided to transition away from the Eys3D camera due to its lack of documentation for getting stereo images. Since the SD corruption, I have spent a few hours with the team recovering our progress, particularly learning how to use and back up our Docker containers so that we can avoid losing progress if this happens again.
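The camera script itself needs pyrealsense2 and the physical camera, but the key detail when reading depth images is the unit conversion: RealSense depth frames arrive as 16-bit integers that must be scaled into meters. A minimal sketch of that step (the 0.001 m/unit scale is typical for D400-series cameras; the real value is queried from the device):

```python
import numpy as np

# Illustrative sketch: convert one pixel of a raw RealSense-style
# uint16 depth frame into meters. DEPTH_SCALE is an assumed typical
# value; on hardware it comes from the depth sensor itself.

DEPTH_SCALE = 0.001  # meters per raw depth unit (device-reported in practice)

def depth_at(depth_raw, row, col):
    """Return depth in meters at one pixel of a uint16 depth frame.

    A raw value of 0 means the camera could not measure that pixel.
    """
    raw = int(depth_raw[row, col])
    return None if raw == 0 else raw * DEPTH_SCALE
```

This is the piece that lets the tracking code turn a bounding box plus a depth frame into "the person is N meters away."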

Additionally, I have been in charge of designing and 3D printing the structural harness to mount the Jetson, battery packs, and Intel camera onto the robot, so that it can be fully portable without wall plugs. Since the battery harness is very large, it took 17 hours to print a full version, and I had to level the printer bed several times (see failed print on the right :/).

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

The SD card corruption did set us back a few days. Our team has been putting in a lot of additional hours to catch up, but we are still a bit behind schedule.

I think one of the most difficult parts of this project has been planning what to do efficiently. Since much of what we are doing is new to us (this is our first time working with ROS and with an NVIDIA embedded computer), we have spent a lot more time than anticipated figuring out exactly what to do and how to achieve it.

What deliverables do you hope to complete in the next week?

Since this coming week is the interim demo, we are hoping to have one Hexapod as close to finished as possible. On my side, this involves mounting everything onto the Hexapod and resoldering the power source for the Raspberry Pi.

I will try applying what I have learned to tailor the NVIDIA Docker script to our specific use case, and also experiment more with VSLAM using the Intel camera.

Kobe Individual Status Report 3/30

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).
At the beginning of this week I spent most of my time trying to integrate the original Eys3D stereo camera into our system, by getting its depth data and using it for VSLAM. We soon found that this camera had really poor documentation and did not work well for what we wanted to do, so we pivoted to setting up a RealSense camera in its place. I mostly worked on turning the code for camera and hexapod communication into ROS nodes that work with our YOLOv8 visualizer. On Wednesday, however, we realized that our SD card had gotten corrupted, which set us back a lot. I spent the rest of the week setting up our environment again and restoring our code. I finished integrating the RealSense camera with the image pipeline after some trial and error.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

The corrupted SD card set us back a few days, but we have recovered now. On the bright side, we are still not too far behind schedule, and restoring the SD card let us learn more about Docker, using SSDs, and our environment in general.

What deliverables do you hope to complete in the next week?
Finish a hexapod that can do basic maneuvering, object detection, and object following.

Akash’s Individual Status Report 3/30

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).
A lot of accomplishments this week: we got robot controls completely done and integrated, and soldered our power connector to power on our hexapod. Our SD card got corrupted, so we grinded to get our code back to its previous state, on an SSD instead. Definitely put in 40+ hours this week lol. We also figured out a lot of our previous weird Docker container behavior, such as aliasing causing us problems.
Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

We are barely on track right now, since we only just got our code back to the state it was in pre-SD card corruption. Just need to keep grinding.
What deliverables do you hope to complete in the next week?
Finish one complete hexapod and get the inter-hexapod communication code working to an extent.

Team Status Report 3/30

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?
The most significant risk is not being able to integrate multiple robots in time. We will manage this risk next week: after we finish one robot, some of us will work on the communication code while the others build the remaining robots.
Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

Yes, for one we added an SSD to the Jetson. This was necessary because our SD card got corrupted and we lost our data :(. We learned the hard way not to run our Docker containers off an SD card. Another deviation from our previous design is that we kept the Raspberry Pi and use web sockets to command the Pi from the Jetson. We have this completely integrated, so we can control the robot very effectively.

Provide an updated schedule if changes have occurred.
One rough change is aiming to get inter-hexapod coordination working between two hexapods within 2 weeks. This should be doable because, thanks to our setbacks, we have become pretty fast at setting up our Jetson and Raspberry Pi to work with our code.

Casper’s Individual Status Report Mar 23 2024

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

Over this past week, I was able to get Isaac ROS VSLAM running on our Jetson Orin Nano, using example data provided in a rosbag (see image below). We have verified through the visualizations that VSLAM does work, but have yet to verify this with our own data.

Furthermore, I have been working together with Kobe to develop our master controller node, which handles the overall flow logic of the robot (i.e., transitioning from the search state to the find state, and the actions to perform within each state).
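The flow logic amounts to a small state machine. A tiny sketch, with state names and transition conditions as illustrative stand-ins for the real node (which reacts to YOLOv8 detections published over ROS):

```python
# Illustrative sketch of the master controller's flow logic as a
# two-state machine: search until a person is detected, then switch to
# the found/track state, and fall back to search if the person is lost.
# The state names here are hypothetical, not the node's actual ones.

SEARCH, FOUND = "search", "found"

def next_state(state, person_detected):
    """Advance the controller state given the latest detection result."""
    if state == SEARCH and person_detected:
        return FOUND
    if state == FOUND and not person_detected:
        return SEARCH
    return state
```

Each state then maps to actions: the search state drives the search algorithm, while the found state drives the tracking/follow behavior.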

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

We are still a bit behind schedule, as we had originally planned to begin testing and integration by now. So, we will have to eat up some of our slack time.

What deliverables do you hope to complete in the next week?

Over this next week, I hope to complete the controller node so that all the software (excluding VSLAM for now) is fully integrated and the robot is able to move around and search for objects.

Akash’s Individual Status Report 03.23.24

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the week (12+ hours).

Got the controls library working and fixed the batteries. I figured out the battery issue was caused by the unprotected batteries discharging too much, so I created a pseudo-BMS (battery management system). We are following a server-client architecture, with a client on the Jetson that sends commands to the Raspberry Pi over TCP.
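The command link can be sketched with bare sockets: a server on the Pi receives newline-delimited commands and a client on the Jetson sends them. The port and command strings are illustrative; the real system wraps this in the hexapod controls library.

```python
import socket

# Minimal sketch of the Jetson -> Pi TCP command link. This is a
# simplified stand-in, not the actual controls code: one connection,
# one newline-terminated command. Port 5613 is an arbitrary choice.

def serve_one_command(host="127.0.0.1", port=5613):
    """Pi side: accept one connection and return the command received."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            return conn.makefile().readline().strip()

def send_command(cmd, host="127.0.0.1", port=5613):
    """Jetson side: connect to the Pi's server and send one command."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall((cmd + "\n").encode())
```

A persistent connection with a command loop (and reconnect logic for flaky WiFi) is the natural next step beyond this one-shot version.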
Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

On track right now, since we are very close to integrating the controls with the overall functionality. We need to test on campus WiFi soon and finish the integration next week.
What deliverables do you hope to complete in the next week?
Finish integration and get ready for the demo.

 

Team Status Report 3/23

     The most significant risk that could jeopardize the success of the project is still our method of doing obstacle avoidance in combination with SLAM. We have multiple plans to pivot away from SLAM if the data proves too difficult to use. One possible solution would be to still gather the SLAM data but rely on our ultrasound sensor for obstacle avoidance. In that case, the SLAM data could be collected and brought back to a central computer used by a human SAR team member; the data could then be used to visualize the target area, allowing human team members to better traverse the terrain. We are a bit behind schedule, but we hope to catch up soon as we start implementing the search algorithm on our actual robot. We made a good amount of progress this past week, getting object detection and the hexapod controls working; these are discussed further in the individual reports.

Kobe Individual Status Report 3/23

This week I was able to create the stereo camera to YOLOv8 detections pipeline. Specifically, I made an Isaac ROS node in charge of interfacing with the camera using OpenCV. I translated the OpenCV frames into ROS 2 Image messages via CvBridge and published them, and the YOLOv8 TensorRT node ran object detection on those messages. The visualizer node showed us that the object detection was working well. Casper and I are currently creating a central control node that will take the results from object detection and other sensors in order to coordinate the behaviors of the hexapod. A lot of the time spent this week was debugging and finding alternatives to deprecated packages, incompatible libraries, etc.

Now that the object detection pipeline is working, I'm closer to being on schedule, but still a bit behind, since we should be implementing the search algorithm right now. This should not be too big a hurdle to overcome, since we are starting with a simple search algorithm. Over the next week I'm hoping to get the hexapod to move toward search targets such as a human.