As seen in our group update, we unfortunately had our SD card corrupted, so we had to reset our Jetson's OS and environment for running everything. Fortunately, we had most of our code pushed to GitHub, so only minor recovery work was needed for the code itself.
Before the corruption, I was able to set up the Intel RealSense camera and write a Python script to read color and depth images from it. We decided to transition away from the Eys3D camera due to its lack of documentation for retrieving stereo images. Since the SD corruption, I have spent a few hours with the team recovering our progress, particularly learning how to use and back up our Docker containers so that we can avoid losing progress if this happens again.
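For reference, below is a minimal sketch of the kind of RealSense capture script described above, assuming the pyrealsense2 and numpy packages are installed; the resolutions, frame rate, and frame count are illustrative, not our exact settings.

```python
# Minimal sketch: read color and depth frames from an Intel RealSense camera.
# Assumes pyrealsense2 and numpy are installed; stream settings are illustrative.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)

pipeline.start(config)
try:
    for _ in range(100):  # grab a handful of frames as a smoke test
        frames = pipeline.wait_for_frames()
        depth_frame = frames.get_depth_frame()
        color_frame = frames.get_color_frame()
        if not depth_frame or not color_frame:
            continue
        # Convert to numpy arrays for downstream processing (e.g., OpenCV or VSLAM input)
        depth_image = np.asanyarray(depth_frame.get_data())
        color_image = np.asanyarray(color_frame.get_data())
        print(depth_image.shape, color_image.shape)
finally:
    pipeline.stop()
```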
Additionally, I have been in charge of designing and 3D printing the structural harness that mounts the Jetson, battery packs, and Intel camera onto the robot, so that it can run fully portable without a wall plug. Since the battery harness is very large, a full print took 17 hours, and the printer bed had to be releveled several times (see the failed print on the right :/).
Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?
The SD card corruption did set us back a few days. Our team has been putting in a lot of extra hours to catch up, but we are still slightly behind schedule.
I think one of the most difficult parts of this project has been planning our work efficiently. Much of what we are doing is very new to us, as this is our first time working with ROS and on an NVIDIA embedded computer, so we have spent far more time than anticipated figuring out exactly what to do and how to achieve it.
What deliverables do you hope to complete in the next week?
Since the interim demo is this coming week, we are hoping to have one Hexapod as close to finished as possible. On my side, this involves mounting everything onto the Hexapod and resoldering the power supply for the Raspberry Pi.
I will also apply what I have learned to tailor the NVIDIA Docker script to our specific use case, and experiment further with VSLAM using the Intel camera.