Ethan’s Status Report for 4/30

This week I mainly worked on figuring out how to power the Nvidia Jetson from a battery pack. We settled on a battery setup with sufficient capacity and will use it to take the system mobile for outdoor testing with actual cars. Next week we will conduct more tests, and I will work on additional functionality to distinguish between cars and people.

Ethan’s Status Report for 4/23

This week I finished writing the algorithm that takes both the LIDAR input from the ROS node and the car positions from the camera, and displays a warning on the GUI if an object is detected within a certain range. I got the algorithm working with real LIDAR data and hard-coded car position data standing in for the camera.

Shown above, the red bar indicates the position and range of a car detected within the camera and LIDAR’s field of view.
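As a rough sketch of how such a warning bar can be drawn (the names, threshold, and scaling here are illustrative assumptions, not our exact code), the overlay reduces to a red rectangle whose width tracks the detection and whose height grows as the car gets closer:

```python
# Illustrative sketch of the red warning bar overlay (assumes OpenCV;
# the threshold and scaling are placeholder values, not our final tuning).
import cv2

WARN_RANGE_M = 5.0  # only warn about cars closer than this (placeholder)

def draw_warning(frame, x_left, x_right, range_m):
    """Overlay a red bar spanning the detected car's horizontal extent
    (x_left/x_right are pixel columns). Closer cars draw a taller bar."""
    if range_m > WARN_RANGE_M:
        return frame
    h, _ = frame.shape[:2]
    closeness = 1.0 - min(range_m, WARN_RANGE_M) / WARN_RANGE_M  # 0 far, 1 close
    bar_h = int(h * (0.1 + 0.2 * closeness))
    cv2.rectangle(frame, (x_left, 0), (x_right, bar_h), (0, 0, 255), -1)  # BGR red
    return frame
```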

Next, I will work with the team to integrate our systems together for a full working system.

Team’s Status Report for 4/23

After integrating the different subsystems we’ve been working on, we now have a complete, fully functional system. We combined the object detection of cars with the range detection from the LIDAR system, and created a GUI that displays red boxes on the screen indicating the relative direction and range of approaching cars. Additionally, we tested the GUI with the AR glasses, and the system works end to end. Next, we will test the system to collect data on detection accuracy and on how quickly it can warn the user.

Shown above: for the sake of testing, we configured the YOLO algorithm to detect people instead of cars. The camera detects a person in the frame, determines the range to them, and displays the warning as a red square. The system can detect and display multiple objects at once, and will only display a warning if an object of interest is detected by the camera. For example, if a chair is in range of the LIDAR, the system will not warn the user, because a chair is not an object of interest.
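A minimal sketch of the "object of interest" filter (the class names assume the standard COCO label set that YOLO ships with; the confidence threshold and tuple layout are illustrative assumptions):

```python
# Only keep detections whose class we actually want to warn about.
# For this test we swapped the set to {"person"}; normally it holds vehicles.
CLASSES_OF_INTEREST = {"car", "truck", "bus"}
MIN_CONFIDENCE = 0.5  # placeholder threshold

def filter_detections(detections, labels):
    """`detections` is assumed to be (class_id, confidence, bbox) tuples;
    `labels` maps a class id to its COCO name."""
    return [d for d in detections
            if labels[d[0]] in CLASSES_OF_INTEREST and d[1] >= MIN_CONFIDENCE]
```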

Ethan’s Status Report for 4/16

This week I worked on developing a ROS node that subscribes to two topics: one supplying the LIDAR data, and one supplying location data from the camera that tells the node where the car is located in the frame. I worked on integration between the two systems. Next week, now that the camera is working, I will test combining the data from the two systems.
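The skeleton of the node looks roughly like this (the topic names and the camera message type are assumptions for illustration; LaserScan is the standard sensor_msgs type):

```python
#!/usr/bin/env python
# Sketch of the fusion node: cache the latest LIDAR sweep, and combine it
# with each incoming car position from the camera. Topic names are placeholders.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import PointStamped  # assumed type for the car position

latest_scan = None

def scan_cb(msg):
    global latest_scan
    latest_scan = msg  # keep only the most recent sweep

def car_cb(msg):
    # Combine the camera's car position with the cached scan here.
    if latest_scan is not None:
        rospy.loginfo("car at x=%.2f, scan has %d points",
                      msg.point.x, len(latest_scan.ranges))

if __name__ == "__main__":
    rospy.init_node("fusion_node")
    rospy.Subscriber("/scan", LaserScan, scan_cb)
    rospy.Subscriber("/car_position", PointStamped, car_cb)
    rospy.spin()
```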

Ethan’s Status Report for 4/10

Unfortunately, I was sick for a few days this week, so I was unable to make as much progress on the LIDAR system as I wanted. After I recovered on Monday, I started writing a ROS node that accepts incoming LIDAR messages and uses them in conjunction with the object recognition system to determine the distance to cars. I also tested the LIDAR and collected some range data, shown below. Next week I will work with my teammates to integrate the LIDAR and camera.
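The core of that combination is a range lookup: convert the car's bearing (inferred from its position in the camera frame) into an index into the scan's ranges array. A sketch using the standard sensor_msgs/LaserScan fields (the pixel-to-bearing conversion is assumed to happen elsewhere):

```python
# Look up the LIDAR range at a given bearing using LaserScan's angle fields.
def range_at_bearing(scan, theta):
    """Return the range in meters at angle `theta` (radians), or None."""
    i = int(round((theta - scan.angle_min) / scan.angle_increment))
    if 0 <= i < len(scan.ranges):
        r = scan.ranges[i]
        if scan.range_min <= r <= scan.range_max:  # drop invalid readings
            return r
    return None
```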

Team’s Status Report for 4/2

This week we split up the work on our project: we worked on getting the object detection algorithm running on one of the Nvidia Jetsons and on getting the LIDAR working on the other. Since the object detection algorithm is so heavyweight, we are considering using both Nvidia Jetsons: one will do the object detection, and the other will combine the detection output with data from the LIDAR and display the result on the user’s glasses. Next week we will work on integrating the two systems.

Ethan’s Status Report for 4/2

This week I mostly worked on getting the LIDAR to work with ROS. I was able to install ROS on one of the two Nvidia Jetsons and write a ROS node that reads the LaserScan data from the LIDAR. Next week I will start writing the code that combines the information from the object detection and the LIDAR and displays it on the HUD glasses.
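In its simplest form, the node is just a subscriber on the driver's scan topic (assumed here to be /scan, the rplidar_ros default):

```python
#!/usr/bin/env python
# Minimal LaserScan reader: print the nearest valid return from each sweep.
import rospy
from sensor_msgs.msg import LaserScan

def scan_cb(msg):
    valid = [r for r in msg.ranges if msg.range_min <= r <= msg.range_max]
    if valid:
        rospy.loginfo("nearest object: %.2f m", min(valid))

rospy.init_node("lidar_reader")
rospy.Subscriber("/scan", LaserScan, scan_cb)
rospy.spin()
```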

Ethan’s Status Report for 3/26

This week I mostly researched how to set up the LIDAR with ROS. We needed an additional Nvidia Jetson so that we could work on the image detection and range detection at the same time, so I spent a lot of time configuring and updating the Jetson so that it could connect to the internet and download the libraries we need to run the LIDAR. Next week, I will install the LIDAR packages and implement a program in ROS to get LIDAR data.

Ethan’s Status Report for 3/19

This week I mostly did research on how to interface with and get data from the LIDAR. I worked on setting up a Linux environment on my MacBook to connect to the LIDAR, but this proved difficult. Instead, I checked out another Nvidia Jetson so that I can work directly on the board we will be using, while my other group members work on getting the camera running on the other Nvidia Jetson. Next week I will use ROS to interface with the LIDAR and start recording initial measurements on the Jetson.

Team’s Status Report for 2/26

This past week we mainly focused on finalizing the details of our design and implementation for the design review presentation. We finalized and ordered the components: the A1M8 LIDAR, the Nvidia Jetson Nano, the HUD glasses, and a camera made specifically for the Nvidia Jetson.

Next week we will have the parts we ordered, so we can begin actually tinkering with and integrating the different sensors and components of our system. We will start taking measurements with our LIDAR and camera sensors and continue researching our object detection algorithm. Currently, we are exploring lightweight versions of the YOLO v3 algorithm, as sketched below.
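One candidate is YOLOv3-tiny, which can be loaded through OpenCV's DNN module; a sketch, assuming the standard Darknet config and weights files are on hand:

```python
# Load YOLOv3-tiny with OpenCV's DNN module (file names assume the standard
# Darknet release; the CUDA backend requires an OpenCV build with CUDA support).
import cv2

net = cv2.dnn.readNetFromDarknet("yolov3-tiny.cfg", "yolov3-tiny.weights")
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_CUDA)  # run on the Jetson's GPU
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CUDA)

def detect(frame):
    """One forward pass; returns the raw YOLO output tensors."""
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    return net.forward(net.getUnconnectedOutLayersNames())
```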