Ayesha’s Status Report for 4/1

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

This week I spent a lot of time reading into the REST framework to integrate the web app with the ML algorithm. I installed the framework into the web application and have been working on moving my files over to the ECE machines, since that is the only place with enough storage to run the ML algorithm. Understanding how the framework would allow the two software portions to integrate took a lot of time, because there is a very specific structure to follow and I had to make sure the framework would work with the model we implemented. I also worked with my team to gather data for training the ML algorithm and testing the radar image capture functionality. We met up and attached a Swiffer handle to our radar so that we could hold it up at a height of about 5 meters; the goal was to test the capture abilities of the radar while one of us lay on the ground and waved our arms to be detected via the Doppler shift. Unfortunately, the radar would not connect to the computer, so we will instead meet on Sunday to redo this using the radar that is with Angie.
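As a rough illustration of the kind of structure the framework enforces, the integration comes down to an agreed-on request/response contract between the web app and the ML side. Below is a minimal sketch assuming a JSON payload; the field names ("cube", "human_detected", "confidence") are placeholders for illustration, not our actual schema.

```python
import json

# Hypothetical request/response contract between the web app and the ML
# service; field names here are illustrative assumptions, not our real schema.
def build_detection_request(frame_id, cube):
    """Package a flattened range-Doppler-azimuth cube for the ML endpoint."""
    return json.dumps({"frame_id": frame_id, "cube": cube})

def parse_detection_response(payload):
    """Pull the detection flag and confidence out of the ML service reply."""
    data = json.loads(payload)
    return data["human_detected"], data["confidence"]
```

Keeping the contract this explicit is what lets the two software portions be developed and tested independently before integration.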

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Personally, my progress is on schedule, but I need to spend more time helping gather data so that I can begin integrating; otherwise, both the individual parts and the integration will fall behind.

What deliverables do you hope to complete in the next week?

In the next week, I hope to gather more radar data so that the machine learning algorithm can train on more examples. I also hope to finish integrating the ML portion with the web app so that the software works cohesively. Lastly, as a stretch goal, if all goes well with capturing data, I hope to work with Angie on getting the GPS data so that I can start on the marker display functionality, which I otherwise have to hardcode until I can get the input data from somewhere.

Team Status Report for 3/25

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

As we begin to test our system more extensively, the most significant risks include damage to the system during testing. This is especially risky for our project because we use borrowed components that cost over $1000 in total. Even though no components except the temperature sensor are exposed to high temperatures, environmental conditions may hasten damage such as corrosion of the antenna, which is already visible on the AWR1642 radar module that came with the green board. To prevent the same from happening to the AWR1843, we have chosen to enclose the system in a radome when testing in high-moisture conditions such as fog. As a contingency plan, we have two radar modules available in case one fails.

Another risk is the dataset not being sufficient to train the neural network to detect a human. Right now, the neural network has only been trained on the publicly available Smart Robot dataset, which detects a corner reflector, a target with a very different radar signature from a human. To mitigate this risk, our contingency plan is to train the neural network on our own dataset of 3D range-Doppler-azimuth data of actual humans, which we will continue to collect throughout the semester.

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc.)?

No changes were made this week to the existing design of the system.

Angie’s Status Report for 3/25

What did you personally accomplish this week on the project?

This week, I tested integration of the Raspberry Pi with each of the other peripheral sensors separately (the GPS, IMU, and temperature sensor) by connecting them and collecting data. I also began training to fly the drone.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is mostly on schedule, except that I tested the other sensors separately, without fusing their data with the radar data; I will do the sensor fusion next week.

What deliverables do you hope to complete in the next week?

  • Decide on Monday what the interim demo will encompass
  • Make sure the system is able to detect humans for the interim demo
  • Collect more 3D range-doppler-azimuth data in different scenarios for more training

Ayesha’s Status Report for 3/25

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

This week I accomplished everything I wanted to. I integrated the HERE Maps API into my project and tested the map display and the marker addition functionalities. I spent a lot of time reading into how HERE works and how to use the different features, as well as adjusting it to the code I already had. I also implemented the zoom and scroll features on the map. Here is a picture to display the map with an example marker. In addition, I spent a lot of time adjusting the style to make sure that the map fit in the layout I had set up on the website.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

My progress is now on schedule, which is great.

What deliverables do you hope to complete in the next week?

In the next week, I hope to integrate the web app with the ML portion so that the detection can be displayed where the blue box currently is.

Linsey’s Status Report for 3/25

Although I spent a lot of time on the project this week, I am frustrated because I didn’t make any progress. After trying many times unsuccessfully to migrate my code to AWS, I realized that for it to work I would have to use a tier that requires money. I then learned that this course doesn’t provide AWS credits. Therefore, Tamal pointed me towards the ECE machines and lab computers. I first tried the ECE machines, because I can SSH into them remotely. I accomplished this. However, the size of the data I am working with is too large for the Andrew directories, so it is necessary to use the AFS directories. After reading the guide online, I wasn’t able to successfully change directories. I emailed IT services, and they responded with some help that didn’t work for me, and they didn’t respond to my most recent follow-up, which leaves me with the option of working in person on the lab computers. Tomorrow, I plan on trying this option.

My progress is behind. This is not at all where I planned on being. By working in the lab tomorrow, I plan on getting the architecture training and finally catching up.

This week, my deliverables will include a trained architecture and integration with at least one of the subsystems (either the frontend or the hardware).

Angie’s Status Report for 3/18

What did you personally accomplish this week on the project?

This week, I acquired the DCA1000EVM board and tested it with the AWR1843 radar, allowing real-time ADC samples to be collected easily and processed more flexibly into the 3D range-azimuth-Doppler maps that fit our ML architecture. After passing the exam, I also obtained a Part 107 license to fly the drone. Although our system can be tested without a drone, our project is designed for drones, and we would like to use one for the demo. It is also important to verify that the system can detect humans from the perspective of a drone looking down at them from several meters above: although stationary, a hovering drone experiences more perturbations in motion than a ground test does, along with Doppler noise from wind and moving objects in the environment that are not present at ground level.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is on schedule after receiving the green board.

What deliverables do you hope to complete in the next week?

  • Sensor fusion of GPS and IMU data with radar data
  • Test methods of increasing resolution of radar data
  • Begin integrating radar with drone

Linsey’s Status Report for 3/18

This week I worked to migrate my code to AWS. Previously, making the training and target datasets locally was working fine. However, when I tried to concatenate those two datasets and split them into training and validation sets, it stopped working locally because I ran out of memory. Therefore, I migrated everything to AWS to overcome those issues. However, this has proven much more difficult than I thought. I spent hours trying to fix the SSH pipeline in VS Code and installing the necessary packages in the AWS environment. SSH was finicky and sometimes wouldn’t connect at all; fixing that took a lot of Stack Overflow and digging through Git forums. Once I got into the AWS environment, everything was successfully copied over: the code files and the data. However, every time I tried to install PyTorch in the AWS environment, it would disconnect me. Additionally, once it logged me out of the SSH window, it wouldn’t let me back in a second time. I’ve looked at many pages for this and am still struggling to get it to work.
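One memory-friendly way around the concatenation problem, sketched here under the assumption that each example can be loaded lazily by index, is to shuffle and split indices rather than the data itself, so no concatenated copy of both datasets ever has to fit in memory:

```python
import random

# Sketch of an index-based train/validation split: we shuffle integer
# indices instead of the data, so nothing large is ever materialized.
def split_indices(n_total, val_fraction=0.2, seed=0):
    """Return (train_indices, val_indices) over n_total lazily-loadable examples."""
    indices = list(range(n_total))
    random.Random(seed).shuffle(indices)  # seeded for reproducibility
    n_val = int(n_total * val_fraction)
    return indices[n_val:], indices[:n_val]
```

Each training step then loads only the examples whose indices it needs, which is the same idea PyTorch's lazy `Dataset`/`Subset` mechanism is built around.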

My progress is behind. I am very frustrated by the AWS migration. I hope I can figure this out soon, because after that, running and training the architecture will be very simple.

By the end of the week, I hope to have successfully run the architecture on AWS and started integration with either the web application or radar.

Team Status Report for 3/18

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

The most significant risks we currently face come from image capture. This process will take the longest because there are many parameters to tune, and all of the adjustments that need to be made will add time. The risk is that this delays our integration: specifically, we need the radar images to test our machine learning algorithm and begin attempting human detection. This risk is being managed by running the training and all other parts of the project in parallel with image capture, so that the software components are fully ready by the time the images are ready for training. Switching the radar is one contingency in place to help capture images better.

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc.)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

As mentioned in Angie’s status report from last week, the main change made to the existing design was the radar. We have switched back to the original radar since a green board became available from someone in CyLab that Angie had been communicating with; having the green board helped steer our design choice. Another design element that changed was the maps API. Instead of the Google Maps API, we will use the HERE Maps API, because there were too many payment issues with Google Maps. HERE has the same marker functionality we wanted from Google Maps, so it is still a very usable API for us, and it has been used by past projects, so we know it is doable. The last change is that we are deploying the machine learning algorithm on EC2 instead of locally because of storage constraints. No extra costs are incurred.

Provide an updated schedule if changes have occurred.

No changes have occurred.

Ayesha’s Status Report for 3/18

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

This week I worked on four main things. I worked on the ethics assignment and evaluated different features that could be added to our project. I also communicated with many different people about how to get reimbursed for the Google Maps API and looked into that. This took a really long time because there was a lot of back and forth, and I was later told that I could not pursue many of the options I was looking into. This is what I spent the majority of my time on. The third thing I worked on was looking into the HERE Maps API, which the professors suggested to me. After compiling a set of resources that could be useful for HERE, I made my free account. Throughout all of this, I also made tweaks to the layout of the site. Specifically, I set up exactly which sections will display map images or information, so that everything is formatted correctly once I start using the API and I will not have to waste time readjusting the layout.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am less behind than I was last week since I was finally able to resolve the payment issues for the API. By next week, I should be completely caught up after implementing the API basics and testing the map functionality.

What deliverables do you hope to complete in the next week?

In the next week, I will incorporate the API and test the marker functionality. This is the main deliverable I want to complete to understand how the API works and get familiar with adding markers, since this will be the main function I use when I receive the GPS data.

Angie’s Status Report for 3/11

What did you personally accomplish this week on the project?

I met with Professor Swarun Kumar and students at CyLab the week before break to discuss and test a 120 GHz radar. Although angular and range resolution were increased, effective range was greatly decreased to about a meter for detecting humans, which is well below our use-case requirements. A day later, Matthew O’Toole responded that a DCA1000EVM (the green board) was available for our use, so we are switching back to the AWR1843. With the team, I also helped generate labels of the target location in range-azimuth space for the neural network and contributed to the design report.
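The label generation can be sketched as mapping the target's measured range and azimuth to a single bin in a fixed range-azimuth grid. The grid dimensions, maximum range, and field of view below are illustrative assumptions, not our actual radar parameters:

```python
# Sketch of generating a training label in range-azimuth space: a grid of
# zeros with a 1.0 at the bin containing the target. All numeric defaults
# (64x32 grid, 10 m max range, 120 degree FOV) are assumptions for illustration.
def make_label(range_m, azimuth_deg, n_range=64, n_azimuth=32,
               max_range_m=10.0, fov_deg=120.0):
    """Return an n_range x n_azimuth grid with a single positive cell."""
    r_bin = min(int(range_m / max_range_m * n_range), n_range - 1)
    # Shift azimuth from [-fov/2, +fov/2] to [0, fov] before binning
    a_bin = min(int((azimuth_deg + fov_deg / 2) / fov_deg * n_azimuth),
                n_azimuth - 1)
    label = [[0.0] * n_azimuth for _ in range(n_range)]
    label[r_bin][a_bin] = 1.0
    return label
```

A one-hot grid like this pairs naturally with a per-cell classification loss over the network's range-azimuth output.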

I set up real-time data streaming directly from the AWR1843 without the green board by accessing the serial data directly. Using this, I collected a small range-azimuth and range-Doppler dataset that includes me moving at different ranges and azimuths. My motion shows clearly in the range-Doppler plot, even when I am partially obscured by 1 cm-wide metal bars spaced 5 cm apart, but makes very little difference in the range-azimuth plot, which only plots zero-Doppler returns. A visualization of part of the dataset is shown below:

However, the data available without the green board is not sufficient for our purposes due to:

  • Very low Doppler resolution compared to similar studies in the literature
  • Lack of localization for Doppler-shifted returns, rather than just single points of detected objects
  • Inability to separate Doppler shifts of returns at the same range but different azimuths

Now that we can use the green board, we will collect 3D range-azimuth-Doppler maps that mitigate these issues and allow us to use a 3D-CNN architecture as originally intended, without the significant information loss we saw the week before, when we reconstructed the data from just the range-Doppler and range-azimuth maps, the only radar data available in the dataset.
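For reference, the direct serial capture described above depends on finding frame boundaries in the byte stream: TI's mmWave demo firmware prefixes each output packet with a fixed magic word, so a captured stream can be split into packets roughly as follows (a sketch of the idea, not our exact parsing code):

```python
# Hedged sketch of splitting a raw serial capture into packets at each
# occurrence of the mmWave demo magic word (assumed byte order below).
MAGIC = b"\x02\x01\x04\x03\x06\x05\x08\x07"

def split_frames(stream):
    """Split a captured byte stream into packets, each starting with MAGIC."""
    chunks = stream.split(MAGIC)
    # The first chunk precedes any magic word and is a partial frame; drop it.
    return [MAGIC + c for c in chunks[1:]]
```

Each returned packet would then be parsed further according to the firmware's header and TLV layout.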

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is slightly behind due to uncertainty about whether the green board was available, which created extra work characterizing and comparing two different modules. With the green board, I can iterate more quickly and catch up thanks to the availability of real-time raw data.

What deliverables do you hope to complete in the next week?

  • Set up and collect higher resolution 3D data from the AWR1843 with green board
  • Test latency of streaming data from all the sensors through WiFi and adjust data rates accordingly
  • Obtain a Part 107 drone license