Angie’s Status Report for 3/11

What did you personally accomplish this week on the project?

I met with Professor Swarun Kumar and students at CyLab the week before break to discuss and test a 120 GHz radar. Although the 120 GHz radar offers better angular and range resolution, its effective range for detecting humans drops to about a meter, far below our use-case requirements. A day later, Matthew O’Toole responded that a DCA1000EVM (the green board) was available for our use, so we are switching back to the AWR1843. With the team, I also helped generate labels of the target location in range-azimuth space for the neural network and contributed to the design report.

I set up real-time data streaming directly from the AWR1843 without the green board by reading its serial data output, then collected a small range-azimuth and range-Doppler dataset of myself moving at different ranges and azimuths. The moving target shows clearly in the range-Doppler plot, even when partially obscured by 1 cm-wide metal bars spaced 5 cm apart, but makes very little difference in the range-azimuth plot, which only shows zero-Doppler returns. A visualization of part of the dataset is shown below:

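As a reference for the serial streaming setup described above, here is a minimal sketch that reads the AWR1843’s UART data stream without the green board. It assumes the TI mmWave demo firmware’s output format (frames delimited by the magic word below); the port name and baud rate are illustrative assumptions, not necessarily our exact configuration.

```python
# Minimal sketch: read frames from the AWR1843's UART data port (pyserial),
# assuming the TI mmWave demo firmware's packet format. Port and baud rate
# are assumptions, not our final configuration.
import serial

# The demo firmware prefixes each frame with this magic word.
MAGIC = bytes([0x02, 0x01, 0x04, 0x03, 0x06, 0x05, 0x08, 0x07])

def stream_frames(port="/dev/ttyACM1", baud=921600):
    """Yield raw frame payloads delimited by the magic word."""
    ser = serial.Serial(port, baud, timeout=0.1)
    buf = bytearray()
    while True:
        buf += ser.read(4096)
        start = buf.find(MAGIC)
        if start < 0:
            continue
        end = buf.find(MAGIC, start + len(MAGIC))
        if end < 0:
            continue  # wait until the next frame boundary arrives
        yield bytes(buf[start:end])
        del buf[:end]
```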
However, the data available without the green board is not sufficient for our purposes due to:

  • Very low Doppler resolution compared to similar studies in the literature
  • No localization of Doppler-shifted returns, only single points for detected objects
  • Inability to separate Doppler shifts of returns at the same range but different azimuths

Now that we can use the green board, we will collect 3D range-azimuth-Doppler maps that mitigate these issues and allow us to use a 3D-CNN architecture as originally intended, avoiding the significant information loss we incurred the week before by reconstructing the scene from just the range-Doppler and range-azimuth maps, which were the only radar data available in the dataset. A rough sketch of the intended architecture is below.
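As an illustration of a 3D-CNN over range-azimuth-Doppler cubes, here is a minimal sketch. The framework (PyTorch), layer sizes, and cube dimensions are assumptions for illustration, not our finalized architecture.

```python
# Minimal sketch of a small 3D-CNN over range-azimuth-Doppler cubes.
# Layer sizes and cube dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class RadarCube3DCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            # Input: (batch, 1, range, azimuth, Doppler)
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # global pooling: cube size can vary
        )
        self.classifier = nn.Linear(16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Example: one hypothetical 64x32x16 range-azimuth-Doppler cube.
# logits = RadarCube3DCNN()(torch.randn(1, 1, 64, 32, 16))
```

Global average pooling keeps the network agnostic to the exact cube dimensions, which is convenient while the radar configuration is still in flux.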

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is slightly behind because it was unclear whether the green board was available, which created extra work characterizing and comparing two different modules. With the green board and its real-time raw data, I can iterate more quickly and catch up.

What deliverables do you hope to complete in the next week?

  • Set up and collect higher-resolution 3D data from the AWR1843 with the green board
  • Test the latency of streaming data from all the sensors over WiFi and adjust data rates accordingly (see the sketch after this list)
  • Obtain a Part 107 drone license
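For the latency test, here is a minimal sketch of the planned measurement, assuming the base station runs a simple UDP echo server; the address, port, packet size, and sample count are placeholders.

```python
# Minimal sketch of a WiFi latency test: send timestamped UDP packets to the
# base station and measure round-trip time. Assumes the base station runs a
# simple UDP echo server; address and sizes are placeholders.
import socket
import time

def measure_rtt(base_station=("192.168.1.10", 5005),
                payload_bytes=1024, n=100):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    rtts = []
    for _ in range(n):
        t0 = time.monotonic()
        sock.sendto(b"x" * payload_bytes, base_station)
        sock.recvfrom(65536)  # wait for the echoed packet
        rtts.append(time.monotonic() - t0)
    return sum(rtts) / len(rtts)  # mean round-trip time in seconds
```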

Team Status Report for 2/25

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

The most significant risk is the radar’s limitation without the DCA1000EVM: it cannot quickly process and stream raw radar data in real time. This requires a workaround in how the data is stored and sent, such as recording data to a file and streaming it afterwards, which increases the total time from data collection, through wireless transmission to the base station computer, to classification by the neural network. To fit the time constraint of 3 seconds, lower frame rate data may need to be sent, reducing the quality of the data and possibly the F1 score of the neural network.

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

Changes were made to the circuit after assessing available parts, mostly concerning communication between components; these changes do not affect cost. Communication between the radar and the Raspberry Pi was changed from SPI to USB 3, since the radar evaluation module has a USB port. Communication between the GPS and the Raspberry Pi was changed from I2C to UART because, on the GPS/IMU module we bought, the GPS communicates separately from the IMU (which still uses I2C). Since no buck converters were available at TechSpark and there is no estimate of when they will be restocked, a linear regulator, which is less expensive, is used instead to step the 9 V battery down to 5 V; this decreases battery life due to lower efficiency compared to a buck converter (a rough estimate follows). Finally, instead of connecting the temperature sensor output to a transistor gate, 0.33 V was simply added to the output voltage for simplicity, since the expected output voltages (0.3-1.8 V) fall within the range accepted by the Raspberry Pi’s GPIO pins (0-3.3 V).
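As a rough estimate of the efficiency cost (assuming an ideal linear regulator and ignoring quiescent current), the regulator’s efficiency is about V_out / V_in = 5 V / 9 V ≈ 56%, versus roughly 85-95% for a typical buck converter, so battery life for the same load drops to around two-thirds of what a buck converter would provide.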

Provide an updated schedule if changes have occurred.

One week is taken off the circuit schedule and replaced with work on preprocessing the radar data for input to the neural network.


How have you adjusted your team work assignments to fill in gaps related to either new design challenges or team shortfalls?

Since we are communicating internationally with the authors of the Smart Robot drone radar dataset for clarification on preprocessing steps such as cubelet extraction, which adds time, we are building other parts of the project, such as the circuit, earlier; the overall work is rearranged rather than delayed.

Angie’s Status Report for 2/25

What did you personally accomplish this week on the project? 

This week I worked on and presented the design review presentation, finalized the circuit design for the attachment, and changed how the data will be sent and preprocessed. The AWR1843 radar cannot stream real-time data without the DCA1000EVM module, but it can record and save data, so the plan changed to writing the data to one of two files, alternating every two seconds, while streaming the other file, providing non-real-time but acceptably fast results (a sketch of this scheme is below). I also discussed with Linsey the preprocessing and format of the inputs to the neural network.
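Here is a minimal sketch of the two-file alternation (“ping-pong”) scheme described above; the file names and TCP transport are illustrative assumptions, and record_chunk stands in for the radar’s record-to-file command.

```python
# Minimal sketch of double-buffered capture: record ~2 s of radar data into
# one file while the previously recorded file streams to the base station.
# File names and the TCP transport are assumptions; record_chunk is a
# placeholder for the radar's record-to-file command.
import socket
import threading

FILES = ["radar_buf_0.bin", "radar_buf_1.bin"]
RECORD_SECONDS = 2

def stream_file(path, sock):
    """Send a finished capture file to the base station."""
    with open(path, "rb") as f:
        sock.sendall(f.read())

def capture_loop(record_chunk, base_station_addr):
    sock = socket.create_connection(base_station_addr)
    active = 0
    sender = None
    while True:
        # Record the next chunk into the active file.
        record_chunk(FILES[active], RECORD_SECONDS)
        # Make sure the previous send finished before we reuse that file.
        if sender is not None:
            sender.join()
        # Stream the file we just finished while the next chunk records.
        sender = threading.Thread(target=stream_file,
                                  args=(FILES[active], sock))
        sender.start()
        active ^= 1  # swap buffers
```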

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I went a week ahead on the circuit schedule and a week behind on the preprocessing schedule, so I am on schedule overall. To catch up on preprocessing, I will work on processing the radar data into inputs for the neural network instead of working on the circuit.

What deliverables do you hope to complete in the next week?

Next week, I hope to obtain both the GPS and IMU so that I can finalize the plastic chassis design and 3D-print it right afterwards. I also plan to do more preprocessing of the data and to implement near-real-time wireless streaming of radar data through the Raspberry Pi.

Angie’s Status Report for 2/18

This week, I acquired parts for the standalone attachment, including the Raspberry Pi and temperature sensor. Once the GPS module is ordered and arrives, all required parts for the circuit (the rest were sourced from TechSpark and previous coursework) will be on hand and the whole circuit can be integrated. Literature also confirms that a suitable radome for patch antennas can be 3D-printed with PLA, a material available at CMU. The knowledge to build the circuit was learned in 18-220. I will incorporate the simple slotted radome below into our 3D-printed chassis (Karthikeya et al.).

I also confirmed that I can use the drone, but I will still build the standalone attachment for the MVP, which adds time to the schedule. Because I focused on the circuit instead of the radar data, parts of my schedule have swapped: I previously planned to build the circuit after working on processing radar data, so I am behind schedule on data processing. For the next week, I plan to finish implementing real-time data collection from the radar and to generate range-Doppler maps as input to the neural network; a sketch of that computation is below.
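As a sketch of the range-Doppler map computation (a range FFT along each chirp followed by a Doppler FFT across chirps), assuming complex ADC frames and Hann windows; the exact windowing and scaling in our pipeline may differ.

```python
# Minimal sketch of range-Doppler map generation from raw FMCW radar data,
# assuming one frame of complex ADC samples per antenna. Window choices and
# log scaling are illustrative assumptions.
import numpy as np

def range_doppler_map(frame: np.ndarray) -> np.ndarray:
    """frame: complex samples, shape (num_chirps, num_samples_per_chirp)."""
    # Range FFT along fast time (samples within each chirp).
    win_r = np.hanning(frame.shape[1])
    rng = np.fft.fft(frame * win_r, axis=1)
    # Doppler FFT along slow time (across chirps), centered at zero Doppler.
    win_d = np.hanning(frame.shape[0])[:, None]
    rd = np.fft.fftshift(np.fft.fft(rng * win_d, axis=0), axes=0)
    # Log-magnitude map for visualization or network input.
    return 20 * np.log10(np.abs(rd) + 1e-12)
```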

Angie’s Status Report for 2/11

After consulting with Akarsh Prabhakara from CyLab about our project, we received an AWR1843Boost radar that we could immediately test with. I also consulted Professor Sebastian Scherer from AirLab about drones to use for the project, and we received relevant literature about detecting vital signs with drone-based radar. Before capturing any data that will be used for training, I tested basic functionality by setting up an indoor scene with a 3 m by 3 m cardboard wall and recording a moving human (me) in front of and behind it. Below are the point clouds obtained using constant false alarm rate (CFAR) detection, and range-Doppler plots clearly indicating the moving human both in front of and behind the cardboard obstruction (a sketch of the CFAR step follows). Our progress is on schedule. Next week, I hope to finalize the drone situation, which will inform the decision of whether to order parts. I will continue to collect data and also work to isolate a human’s radar signature from moving background clutter.
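For reference, here is a minimal sketch of 1D cell-averaging CFAR (CA-CFAR) over a range power profile, the kind of detection step that produced the point clouds above; the guard/training window sizes and false-alarm probability are illustrative assumptions.

```python
# Minimal sketch of 1D cell-averaging CFAR (CA-CFAR) along the range axis.
# Window sizes and the false-alarm probability are illustrative.
import numpy as np

def ca_cfar(power: np.ndarray, guard: int = 2, train: int = 8,
            pfa: float = 1e-3) -> np.ndarray:
    """Return a boolean detection mask for a 1D power profile."""
    n_train = 2 * train
    # Threshold scale factor for CA-CFAR under exponential noise.
    alpha = n_train * (pfa ** (-1.0 / n_train) - 1.0)
    mask = np.zeros_like(power, dtype=bool)
    for i in range(train + guard, len(power) - train - guard):
        # Average the training cells on both sides, skipping guard cells.
        left = power[i - train - guard : i - guard]
        right = power[i + guard + 1 : i + guard + 1 + train]
        noise = (left.sum() + right.sum()) / n_train
        mask[i] = power[i] > alpha * noise
    return mask
```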

Introduction and Project Summary

Our project explores human detection and tracking through drone-based radar imagery. In emergency situations, humans are often hidden from view by fire, fog, and rock, eluding even the piercing gaze of night vision and thermal cameras. Enter radar, which can penetrate these obstructions for a better chance to detect and track stationary and moving humans in any weather condition. Owing to their maneuverability and low cost, drones make an excellent platform for first responders to safely inspect a search area at close range and return imagery from which humans can be automatically detected.