Team Status Report for 02/19/2022

As a team, we think we are behind schedule. Most of our critical parts have arrived, so we hope to verify the setup and environment next week, and to have a working development environment with all the necessary libraries and other software installed by the end of next week. If our parts turn out to be damaged or something else goes wrong, our contingency plan is to focus on the software side, since that does not require any hardware. Our design and specification have changed from last week: the scope of the project has narrowed to using only LIDAR sensors. We tried to verify that the sensor was working but ran into issues; we aim to have it set up and verified by the end of next week as well.

Keshav Sangam’s status report for 2/19/2022

This week was primarily aimed at setting up development environments. Since ROS is not compatible with versions of macOS beyond 10.15 (Catalina), I had to dual-boot my computer to install Windows. See here for an explanation why: https://discourse.ros.org/t/macos-support-in-ros-2-galactic-and-beyond/17891

The Windows setup process is ongoing, so there are no results yet. Thankfully, we know the Xavier runs a Linux distro, and ROS has complete Linux support.

The LIDAR also arrived yesterday, but the software developed by Slamtec, RoboStudio, comes with its own host of problems. The Slamtec server is based in China, and for some reason that prevents the RoboStudio application from reaching the plugin manager necessary to install LIDAR support. Thus, I can't actually verify that the LIDAR is working; at the very least, it is spinning. I would upload a video, but I'm getting an upload error (it happens with both .mp4 and .mov file types).

I believe we are a bit behind schedule. By next week, I would like to have ROS installed on the Xavier and the RPLIDAR demo application running.
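As a possible workaround for the RoboStudio plugin-manager issue, we could try talking to the sensor directly from Python using the community `rplidar` package instead of Slamtec's software. A minimal sketch, assuming the sensor enumerates as `/dev/ttyUSB0` (the port name and the package itself are assumptions we would need to confirm on the Xavier):

```python
# Sanity-check the RPLIDAR without RoboStudio, using the community
# `rplidar` Python package. Scan points arrive as (quality, angle_deg,
# distance_mm) tuples; distance 0 means no return for that sample.
def valid_points(scan, min_quality=10):
    """Keep (quality, angle_deg, distance_mm) tuples with usable returns."""
    return [p for p in scan if p[0] >= min_quality and p[2] > 0]

# With the sensor attached (hardware required):
# from rplidar import RPLidar
# lidar = RPLidar("/dev/ttyUSB0")          # hypothetical port name
# print(lidar.get_info(), lidar.get_health())
# for i, scan in enumerate(lidar.iter_scans()):
#     print("scan", i, "usable points:", len(valid_points(scan)))
#     if i >= 4:
#         break
# lidar.stop(); lidar.stop_motor(); lidar.disconnect()
```

If `get_health()` reports a good status and each scan yields a few hundred usable points, that would be enough to call the sensor verified for now.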

Raymond Xiao’s status report for 02/19/2022

We pivoted slightly this week: due to budget constraints, our project now has a smaller scope, and the robot will use LIDAR exclusively. I am reading further into the NVIDIA Jetson Xavier NX series data sheet: (https://developer.download.nvidia.com/assets/embedded/secure/jetson/Xavier%20NX/Jetson-Xavier-NX-Series-Data-Sheet_DS10184001v1.8.pdf?RwvrEpSebwaOytLQ66MD9K5ghU9ONWDM977MctsH8irfeqa5z2t5m_6PhbChnOWBGtMBuG0euTPVDpZT9sysELocAQnBVkATGDmTIvpHpM_GXdvoy6w6C5ga3aQCXmlp6pIBdy6kOV56YAQ1NOMsoPrgw5q2ym4TqjyBdhpiEaCKl5SGN3RmhAqzB81a0OphuGlYipjtTcvkamrSTO8&t=eyJscyI6ImdzZW8iLCJsc2QiOiJodHRwczpcL1wvd3d3Lmdvb2dsZS5jb21cLyJ9).

The serial interfaces on this SoC include UART and I2C, so we have multiple options to explore. For I2C, I explored libraries that can help us define generic interfaces. One promising option is https://luma-core.readthedocs.io/en/latest/interface.html, which lets you define a ready-to-go I2C object and handles all the setup for you. We are still waiting on some parts that have not yet arrived since we pivoted (SD card converter, rechargeable battery, and LIDAR sensor). I would say we are behind schedule right now, but once the above components arrive, we can finally begin working with the Jetson and Roomba. To catch back up to schedule, I am focusing more on the software and trying to gain a better understanding of the Jetson to make setup easier and more efficient.
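For reference, here is a minimal sketch of what using luma-core's I2C interface might look like. The bus number and device address are placeholders (0x3C is a common OLED address); the real values depend on what we actually wire to the Jetson, so the luma call is shown commented since it needs the package and hardware present:

```python
# Helper for sanity-checking addresses before handing them to luma-core.
def valid_i2c_address(addr):
    """7-bit I2C addresses 0x08-0x77 are assignable (0x00-0x07 are reserved)."""
    return 0x08 <= addr <= 0x77

# On the Jetson (requires the luma.core package and an attached device):
# from luma.core.interface.serial import i2c
# serial = i2c(port=1, address=0x3C)  # ready-to-go I2C object on bus 1
# # hand `serial` to whatever luma device driver matches our peripheral
```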

Jai Madisetty’s Status Report for 2/12/2022

This week, I mainly worked on refining the details for our SAR robot. Specifically, I thought through which components we need for the robot itself and for testing the robot’s competency. We have ordered all base components so far (NVIDIA Jetson Xavier and Roomba); however, we still need an SD card and batteries. We also need to order a smoke machine and liquid smoke for testing. We have yet to figure out where exactly we will test the robot so as to not set off any fire alarms; our goal is to figure this out by next week. I am mainly responsible for the software-related aspects of this project, such as implementing SLAM and the A* search algorithm. This past week, I’ve been learning about CUDA, as we plan to write fast, parallel code on the NVIDIA Jetson. I have also been looking into an unofficial Roomba SDK that we can use to control the Roomba. Due to the absence of supplies, I am a bit behind schedule. Once we’ve received the batteries and an SD card, we can start looking more into the specifics of implementation. We plan to obtain these materials early next week, so we can put in some extra time to set up and get back on track. By next week, I hope to have the Jetson interfacing with the Roomba and to be able to control the robot via the Jetson.
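Since A* is on my plate, here is a small sketch of the planner on a 2D occupancy grid, the kind of map SLAM would eventually give us. All names are illustrative, not final; the grid, connectivity, and cost model will change once we know the real map format:

```python
# A* on a 4-connected occupancy grid: cells are (row, col),
# 0 = free, 1 = obstacle. Manhattan distance is an admissible
# heuristic for 4-connected unit-cost grids.
import heapq

def a_star(grid, start, goal):
    """Return a list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    def h(cell):
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
    frontier = [(h(start), 0, start)]   # entries are (f = g + h, g, cell)
    came_from = {start: None}
    best_g = {start: 0}
    while frontier:
        _, g, cell = heapq.heappop(frontier)
        if cell == goal:
            path = []
            while cell is not None:      # walk parents back to start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        if g > best_g[cell]:
            continue                     # stale heap entry, skip
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    came_from[(nr, nc)] = cell
                    heapq.heappush(frontier, (ng + h((nr, nc)), ng, (nr, nc)))
    return None
```

On the real robot this would run over the SLAM map, and we may end up porting the inner loop to CUDA if planning at map resolution turns out to be slow.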

Keshav Sangam’s Status Report for 2/12/2022

This week was primarily research oriented. My work revolves around processing sensor data, so since the LIDAR and mmWave sensors hadn’t been delivered, I focused on learning more about the Kalman filter and its various offshoots. The extended version of the filter works better in non-linear systems, and thus makes sense for our purposes. However, while researching sensor fusion techniques, we came across an article that uses a neural network to interpolate the mmWave data for robust mapping. The network also has additional features, such as a radar-based semantic recognizer for object recognition, but it is unlikely we need them. The final trained network was also available on GitHub, so we will have to test its efficacy on our robot to see if we can avoid creating, testing, and optimizing an extended Kalman filter ourselves. I ordered the mmWave sensor board mentioned in the paper to further maximize the chances that the network works for us. My progress is on schedule, but it would be extremely helpful if the sensors arrived ASAP so we could work on sensor-Jetson interfacing, and I could play around with and test the sensors personally. Deliverables for next week depend on whether the sensors arrive. If they do, I hope to have the Jetson reading both sensors’ data. If not, I will focus on helping Jai with ROS and controlling the robot from the Jetson.
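To make the EKF discussion concrete: the extended filter handles a non-linear measurement model h(x) by linearizing it with its Jacobian at the current estimate. Below is a toy single-landmark sketch (state is a 2D position, the measurement is the range sqrt(x² + y²), e.g. one range return to a known landmark); it shows only the measurement update, with the prediction step omitted, and is illustrative rather than our final filter:

```python
# One EKF measurement update for a scalar range observation.
import numpy as np

def ekf_update(x, P, z, R):
    """x: state estimate (2,), P: covariance (2x2), z: measured range,
    R: measurement noise covariance (1x1). Returns (x_post, P_post)."""
    rng = np.sqrt(x[0] ** 2 + x[1] ** 2)        # h(x): predicted range
    H = np.array([[x[0] / rng, x[1] / rng]])    # Jacobian of h at x (1x2)
    y = z - rng                                 # innovation
    S = H @ P @ H.T + R                         # innovation covariance (1x1)
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain (2x1)
    x_post = x + (K * y).ravel()                # corrected state
    P_post = (np.eye(2) - K @ H) @ P            # corrected covariance
    return x_post, P_post
```

A full filter for us would add a motion-model prediction step and stack LIDAR (and possibly mmWave) measurements, which is exactly the engineering the pretrained network might let us skip.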

Raymond Xiao’s status update for 02/12/2022

We have most of our main components ordered already. We are still looking at peripherals for the Jetson, such as an SD card, a mouse, and a monitor. This week, I have been learning how to interface with and program the NVIDIA Jetson Xavier we received. I am responsible for setting up all the connections between the Jetson and the sensors + Roomba. Specifically, I am trying to set up the foundation (ROS and OpenCV) on the Jetson. I am a bit behind in progress since I am still waiting on necessary supplies. To catch up to schedule, I plan to devote more time next week. By next week, I hope to have an initial ROS environment set up and some code that can interface with the Roomba over UART.
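For the UART interfacing, the Roomba speaks iRobot's published Open Interface (OI) protocol: single-byte opcodes followed by big-endian arguments (128 = Start, 132 = Full mode, 137 = Drive with int16 velocity in mm/s and int16 turn radius in mm). A sketch of the byte packing, where the serial device name is a guess for the Jetson and would need confirming:

```python
# Encode Roomba Open Interface (OI) commands for the UART link.
import struct

START, FULL, DRIVE = 128, 132, 137
STRAIGHT = 0x7FFF  # special Drive radius meaning "drive straight"

def drive_packet(velocity_mm_s, radius_mm):
    """OI Drive command: opcode 137 + int16 velocity + int16 radius, big-endian."""
    return bytes([DRIVE]) + struct.pack(">hh", velocity_mm_s, radius_mm)

# On the robot (requires pyserial and the physical UART wiring):
# import serial
# port = serial.Serial("/dev/ttyTHS1", baudrate=115200)  # hypothetical Jetson UART
# port.write(bytes([START, FULL]))      # wake the OI and enter Full mode
# port.write(drive_packet(-200, 500))   # arc backward at 200 mm/s, 500 mm radius
```

Once this works at the byte level, wrapping it in a small ROS node should be straightforward.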

Project Summary

There are currently very few robots designed for smoky, fire-filled environments. Our goal is to design a SAR robot that can accurately detect a human presence in a smoke-filled environment. The scope of our project is to prove that an autonomous robot can speedily and accurately detect humans in dangerous environments.