Angie’s Status Report for 4/29

What did you personally accomplish this week on the project?

  • This week, I worked with Linsey and Ayesha to integrate the GPS with the Raspberry Pi. The GPS could not acquire a satellite fix indoors, so we placed it outdoors, where it took nearly an hour to get a fix. The initial results show that the GPS data is precise (most readings fall within 2 meters of the average location, with a standard deviation of 1.5 m) but inaccurate, reporting a position about 20 miles from the actual location. To improve accuracy, we will compensate for this constant offset and filter the latitudes and longitudes over time to reduce the influence of outliers (a sketch of this filtering follows this list).

  • I collected 3,600 more samples of data to train the neural network in hopes of increasing the F1 score. Heeding feedback from the final presentation, I recorded the movement of non-human subjects such as a wasp. The original human data over-represented relatively stationary movements such as deep breathing, which dropped the F1 score back down to 0.33, so I collected new data in which the humans move dramatically (including behind barriers), for example by waving their arms.
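
As a concrete sketch of the planned filtering (hypothetical Python; it assumes fixes arrive as latitude/longitude pairs and that the constant shift has been measured against a known ground-truth point), a rolling median suppresses outlier fixes while a fixed offset compensates for the observed shift:

```python
from collections import deque

class GPSFilter:
    """Median-filter recent GPS fixes and apply a fixed offset correction.

    The offsets (in degrees) are hypothetical calibration constants measured
    against a known ground-truth location; the window size is illustrative.
    """

    def __init__(self, window=10, lat_offset=0.0, lon_offset=0.0):
        self.lats = deque(maxlen=window)
        self.lons = deque(maxlen=window)
        self.lat_offset = lat_offset
        self.lon_offset = lon_offset

    @staticmethod
    def _median(values):
        ordered = sorted(values)
        mid = len(ordered) // 2
        if len(ordered) % 2:
            return ordered[mid]
        return (ordered[mid - 1] + ordered[mid]) / 2

    def update(self, lat, lon):
        """Add a new fix and return the filtered, offset-corrected position."""
        self.lats.append(lat)
        self.lons.append(lon)
        return (self._median(self.lats) + self.lat_offset,
                self._median(self.lons) + self.lon_offset)
```

A median is used in this sketch rather than a mean so that a single wild fix cannot drag the reported position.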

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is behind on GPS integration due to the lack of a connection with CMU-DEVICE and trouble getting a GPS fix. I will catch up by resolving the GPS-specific issues of slow fix times and the module not remembering its position between reboots.

What deliverables do you hope to complete in the next week?

  • Resolve problems with getting a GPS fix quickly (e.g., by downloading almanac data from the internet)
  • Test system on CMU-DEVICE
  • 3D print the chassis and mount all parts inside it

Angie’s Status Report for 4/22

What did you personally accomplish this week on the project?

  • This week, I collected a new dataset of 3,600 samples (1,800 with humans, 1,800 without humans), which was used to train the neural network. Compared to the old dataset, the new dataset has twice the velocity resolution and half the maximum range (5 meters), which suits our use case since data beyond 5 meters is superfluous and only adds latency.
  • I collected 600 samples of test data (300 with humans, 300 without humans), which is held out from training and used only to gauge the network's performance on unseen data.
  • The above test data, along with the real-time data, is preprocessed as shown in the picture below:

  • Ayesha and I wrote code to send radar, GPS, and IMU data from the Raspberry Pi through HTTP requests.
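
A minimal sketch of that sender, assuming the `requests` library and a hypothetical `/sensor-data` endpoint on the web app (the actual route and payload fields may differ):

```python
import time
import requests

SERVER_URL = "http://example-webapp.local:8000/sensor-data"  # hypothetical endpoint

def send_frame(radar_frame, gps_fix, imu_sample):
    """POST one frame of radar, GPS, and IMU data as JSON."""
    payload = {
        "timestamp": time.time(),                    # collection time on the Pi
        "radar": radar_frame,                        # e.g. flattened range-doppler values
        "gps": {"lat": gps_fix[0], "lon": gps_fix[1]},
        "imu": {"accel": imu_sample["accel"], "gyro": imu_sample["gyro"]},
    }
    response = requests.post(SERVER_URL, json=payload, timeout=3)
    response.raise_for_status()                      # surface any server-side error
```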

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is on schedule.

What deliverables do you hope to complete in the next week?

  • Work with Ayesha to write code to send temperature data
  • 3D print the chassis to contain the system
  • Collect metrics on latency for the integrated system

Angie’s Status Report for 4/8

What did you personally accomplish this week on the project?

  • During the interim demo, I demonstrated the radar subsystem collecting range-doppler data in real time.
  • Together with Linsey and Ayesha, I collected and labeled radar data consisting of indoor scenes with and without humans in different orientations, at different positions, in different poses (standing, sitting, lying on the ground), and performing different tasks (hand waving, jumping, deep breathing). In total, we collected 25 ten-second scenes at four frames per second, leaving us with 1,000 labeled frames.
  • Ayesha and I started integrating the GPS and IMU data with the Raspberry Pi by writing and testing socket code that lets the Raspberry Pi send data over WiFi and the web app receive it.
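
A minimal sketch of that socket link, assuming a plain TCP connection carrying newline-delimited JSON (the framing and addresses are placeholders, not our final protocol):

```python
import json
import socket

HOST, PORT = "192.168.0.10", 5000   # hypothetical web-app address and port

def send_samples(samples):
    """Raspberry Pi side: stream JSON-encoded sensor samples over TCP."""
    with socket.create_connection((HOST, PORT)) as sock:
        for sample in samples:
            sock.sendall((json.dumps(sample) + "\n").encode())

def receive_samples():
    """Web-app side: accept one connection and yield decoded samples."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("", PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn, conn.makefile("r") as stream:
            for line in stream:
                yield json.loads(line)
```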

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is on schedule, according to the updated schedule presented at the interim demo.

What deliverables do you hope to complete in the next week?

  • Integrate web app with radar, location, and temperature data and conduct corresponding integration tests for latency, data rate, etc.
  • Train neural network on our own dataset
  • Continue exploring radar preprocessing such as denoising and mitigation of multipath returns

What tests have you run and are planning to run?

  • Radar subsystem testing: The radar meets our use case requirements, clearly registering a human waving their arms from up to 5 meters away. It can do so even when the human is obstructed by glass or plastic, or partially obstructed by metal chairs. However, we should collect more radar data of moving non-human objects to test how well the radar specifically distinguishes humans from non-humans, as measured by accuracy and F1 score.
  • Integration testing: The radar and GPS modules are able to stream data directly to the Raspberry Pi via UART at 4 frames per second. To meet the use case requirements, the web app must be able to receive all that data at the same rate via WiFi within 3 seconds of collection. This metric will be tested after the server is written to receive data from our system.
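
A sketch of how that 3-second budget could be checked, assuming each received sample carries the timestamp assigned on the Raspberry Pi at collection time and that the Pi and server clocks are synchronized (e.g., via NTP):

```python
import time

def check_latency(samples, budget_s=3.0):
    """Compare collection timestamps against the receive time on the server."""
    latencies = [time.time() - sample["timestamp"] for sample in samples]
    worst = max(latencies)
    mean = sum(latencies) / len(latencies)
    print(f"max latency {worst:.2f} s, mean latency {mean:.2f} s")
    return worst <= budget_s   # True if every sample arrived within budget
```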

Angie’s Status Report for 3/25

What did you personally accomplish this week on the project?

This week, I tested integration of the Raspberry Pi with each peripheral sensor separately (the GPS, IMU, and temperature sensor) by connecting them and collecting data. I also began training to fly the drone.
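
As an illustration of the GPS read path (a sketch only, assuming the module outputs standard NMEA sentences over a serial port and that the `pyserial` and `pynmea2` packages are available; the device path and baud rate are hypothetical):

```python
import serial        # pyserial
import pynmea2       # NMEA sentence parser

GPS_PORT = "/dev/ttyUSB0"   # hypothetical device path; depends on wiring

def read_gps_fixes(port=GPS_PORT, baud=9600):
    """Yield (lat, lon) fixes parsed from NMEA GGA sentences."""
    with serial.Serial(port, baud, timeout=1) as gps:
        while True:
            line = gps.readline().decode("ascii", errors="ignore").strip()
            if line.startswith("$GPGGA") or line.startswith("$GNGGA"):
                msg = pynmea2.parse(line)
                if msg.gps_qual and msg.gps_qual > 0:   # skip sentences with no fix
                    yield msg.latitude, msg.longitude
```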

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is mostly on schedule, except that I tested the other sensors individually without fusing their data with the radar data; I will do the sensor fusion next week.

What deliverables do you hope to complete in the next week?

  • Decide on Monday what the interim demo will encompass
  • Make sure the system is able to detect humans for the interim demo
  • Collect more 3D range-doppler-azimuth data in different scenarios for more training

Angie’s Status Report for 3/18

What did you personally accomplish this week on the project?

This week, I acquired the DCA1000EVM board and tested it with the AWR1843 radar. This allows real-time ADC samples to be collected easily and processed more flexibly into 3D range-azimuth-doppler maps that fit our ML architecture. After passing the exam, I also obtained a Part 107 license to fly the drone. Although our system can be tested without a drone, the project is designed for drones and we would like to use one for the demo. It is also important to verify that the system can detect humans from a drone's perspective: even when hovering in place, the drone experiences more motion perturbations than a ground-based test, it looks down at humans from several meters above, and it is exposed to Doppler noise from wind and moving objects that is not present at ground level.
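
For reference, a sketch of the standard FFT processing chain that turns raw ADC samples into such a map (it assumes the DCA1000EVM capture has already been reshaped into a complex cube of chirps x receive channels x samples; calibration and windowing are omitted):

```python
import numpy as np

def range_doppler_azimuth(adc_cube):
    """Convert raw FMCW ADC samples into a 3D range-doppler-azimuth map.

    adc_cube: complex array of shape (num_chirps, num_rx, num_samples).
    Returns a magnitude array indexed as (range bin, doppler bin, azimuth bin).
    """
    range_fft = np.fft.fft(adc_cube, axis=2)                            # range per chirp
    doppler_fft = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)  # across chirps
    azimuth_fft = np.fft.fftshift(np.fft.fft(doppler_fft, axis=1), axes=1)  # across antennas
    cube = np.abs(azimuth_fft)
    return np.transpose(cube, (2, 0, 1))    # reorder to (range, doppler, azimuth)
```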

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is on schedule after receiving the green board.

What deliverables do you hope to complete in the next week?

  • Sensor fusion of GPS and IMU data with radar data
  • Test methods of increasing resolution of radar data
  • Begin integrating radar with drone

Angie’s Status Report for 3/11

What did you personally accomplish this week on the project?

I met with Professor Swarun Kumar and students at CyLab the week before break to discuss and test a 120 GHz radar. Although its angular and range resolution were higher, its effective range for detecting humans dropped to about a meter, which is well below our use case requirements. A day later, Matthew O'Toole responded that a DCA1000EVM (the green board) was available for our use, so we are switching back to the AWR1843. With the team, I also helped generate labels of the target location in range-azimuth space for the neural network and contributed to the design report.

I set up real-time data streaming directly from the AWR1843 without the green board by reading the serial data directly, and collected a small range-azimuth and range-doppler dataset of me moving at different ranges and azimuths. The moving human shows up clearly in the range-doppler plot, even when partially obscured by 1 cm-wide metal bars spaced 5 cm apart, but makes very little difference in the range-azimuth plot, which only shows zero-doppler returns. A visualization of part of the dataset is shown below:

However, the data available without the green board is not sufficient for our purposes due to:

  • Very low doppler resolution compared to similar studies in the literature
  • Lack of localization for doppler-shifted returns (only single points of detected objects are localized)
  • Inability to separate the doppler shifts of returns at the same range but different azimuths

Now that we can use the green board, we will collect 3D range-azimuth-doppler maps that mitigate these issues and let us use a 3D-CNN architecture as originally intended, without the significant information loss we saw last week when reconstructing the scene from only the range-doppler and range-azimuth maps, which were the only radar data available without the board.
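
A minimal sketch of such a 3D-CNN in PyTorch (the layer sizes are illustrative placeholders, not our final architecture), treating the range-doppler-azimuth map as a single-channel volume and producing a human/no-human logit:

```python
import torch
import torch.nn as nn

class RadarCNN3D(nn.Module):
    """Illustrative 3D-CNN over (range, doppler, azimuth) volumes."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool3d(1),   # collapse remaining spatial dimensions
            nn.Flatten(),
            nn.Linear(16, 1),          # single logit: human present or not
        )

    def forward(self, x):              # x: (batch, 1, range, doppler, azimuth)
        return self.classifier(self.features(x))
```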

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is slightly behind due to the uncertainty about whether the green board was available, which created extra work characterizing and comparing two different radar modules. With the green board and its real-time raw data, I can iterate more quickly and catch up.

What deliverables do you hope to complete in the next week?

  • Set up and collect higher resolution 3D data from the AWR1843 with green board
  • Test latency of streaming data from all the sensors through WiFi and adjust data rates accordingly
  • Obtain a part 107 drone license

Angie’s Status Report for 2/25

What did you personally accomplish this week on the project? 

This week I worked on and presented the design review presentation, finalized the circuit design for the attachment, and changed how the data will be sent and preprocessed. The AWR1843 radar cannot stream real-time data without the DCA1000EVM module, but it can record and save data, so the new plan is to alternate between two files: write data into one for two seconds while streaming the other, then swap. This gives non-real-time but acceptably fast results. I also discussed with Linsey how the inputs to the neural network will be preprocessed and formatted.
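
A sketch of that double-buffering scheme (hypothetical file names and callables; `record_chunk` and `send_file` stand in for the actual radar capture and upload code):

```python
import threading

BUFFERS = ["radar_buf_0.bin", "radar_buf_1.bin"]   # hypothetical file names

def record_and_stream(record_chunk, send_file):
    """Alternate between two files: record into one while streaming the other.

    record_chunk(path): records roughly two seconds of radar data into path.
    send_file(path):    transmits a finished recording (runs in a thread).
    """
    active = 0
    record_chunk(BUFFERS[active])              # prime the first buffer
    while True:
        previous, active = active, 1 - active
        sender = threading.Thread(target=send_file, args=(BUFFERS[previous],))
        sender.start()                         # stream the finished file...
        record_chunk(BUFFERS[active])          # ...while recording the next chunk
        sender.join()                          # swap roughly every two seconds
```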

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I went a week ahead on the circuit schedule and a week behind on the preprocessing schedule, so I am on schedule overall. To catch up on preprocessing, I will work on processing the radar data into neural network inputs instead of working on the circuit.

What deliverables do you hope to complete in the next week?

Next week, I hope to obtain both the GPS and IMU so that I can finalize the plastic chassis design and 3D print it right afterwards. I also plan to do more preprocessing of the data and near-real-time wireless streaming of radar data through the Raspberry Pi.

Angie’s Status Report for 2/18

This week, I acquired parts for the standalone attachment, including the Raspberry Pi and temperature sensor. Once the GPS module that is being ordered arrives, all required parts for the circuit (the others are sourced from TechSpark and previous coursework) will be on hand and the whole circuit can be integrated. Literature also confirms that a suitable radome for patch antennas can be 3D printed with PLA, which is available at CMU. The knowledge to build the circuit was learned in 18-220. I will incorporate the simple slotted radome shown below into our 3D-printed chassis (Karthikeya et al.).

I also confirmed that I can use the drone, but I will still build the standalone attachment for the MVP, which adds time to the schedule. Because I focused on the circuit instead of the radar data, parts of my schedule have swapped: I had planned to build the circuit after processing the radar data, so I am behind schedule on data processing. Next week, I plan to finish implementing real-time data collection from the radar and generating range-doppler maps as input to the neural network.

Angie’s Status Report for 2/11

After consulting with Akarsh Prabhakara from CyLab about our project, we received an AWR1843Boost radar that we could test with immediately. I also consulted Professor Sebastian Scherer from AirLab about drones to use for the project, and we received relevant literature about detecting vital signs with drone-based radar. Before capturing any data that will be used for training, I tested basic functionality by setting up an indoor scene with a 3 m by 3 m cardboard wall and recording a moving human (me) in front of and behind it. Below are the point clouds obtained using constant false alarm rate (CFAR) detection, and range-doppler plots clearly indicating the moving human both in front of and behind the cardboard obstruction. Our progress is on schedule. Next week, I hope to finalize the drone situation, which will inform the decision of whether to order parts. I will also continue to collect data and work to isolate a human's radar signature from moving background clutter.
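
For reference, a minimal sketch of one-dimensional cell-averaging CFAR, the detection scheme named above (the guard/training window sizes and threshold scale are illustrative, not the values used by the AWR1843 demo firmware):

```python
import numpy as np

def ca_cfar_1d(power, guard=2, train=8, scale=3.0):
    """Return a boolean detection mask using cell-averaging CFAR.

    power: 1D array of power values (e.g. one range profile).
    For each cell under test, the noise level is estimated from `train` cells
    on each side, excluding `guard` cells immediately around the cell.
    """
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    for i in range(train + guard, n - train - guard):
        left = power[i - guard - train : i - guard]
        right = power[i + guard + 1 : i + guard + 1 + train]
        noise = np.mean(np.concatenate((left, right)))
        detections[i] = power[i] > scale * noise    # adaptive threshold
    return detections
```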