Team’s Status Report

We achieved many milestones over these two weeks:

  • Successfully conducted manual test flights with a remote control, which confirms that we fixed our connection issues with the drone.
  • Wrote automated scripts that launch on startup on the Raspberry Pi. These automatically start a video stream and publish images over ROS, and shut the stream down in response to a simple ROS message (see the streaming sketch after this list).
  • Created a draft user interface with buttons on a breadboard to change drone flight modes. This involved programming interrupts on the Jetson TX1 that trigger the sending of ROS messages to the drone (see the button sketch after this list).
  • Tested a new camera, which finally works as expected, and calibrated both the color filters for target detection and the intrinsic parameters with a chessboard (see the calibration sketch after this list).
  • Implemented two different motion planners and verified performance in simulation.
  • Successfully controlled the drone entirely from code today. The code only commanded the drone to climb to a set height, but this means we fixed all remaining communication issues and verified that we can both stream commands to the drone and receive pose updates from it fast enough (see the takeoff sketch after this list).
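
Streaming sketch: a minimal rospy node of the kind our startup scripts launch, assuming a V4L2 camera and cv_bridge. The topic names (/camera/image_raw, /camera/shutdown) and the frame rate are placeholders, not necessarily what our scripts use.

    #!/usr/bin/env python
    # Sketch of the Pi's streaming node: publish camera frames over ROS and
    # exit cleanly when a shutdown message arrives. Topic names are placeholders.
    import rospy
    import cv2
    from cv_bridge import CvBridge
    from sensor_msgs.msg import Image
    from std_msgs.msg import Empty

    def on_shutdown_msg(_msg):
        # A single empty message on this topic stops the stream.
        rospy.signal_shutdown("shutdown requested over ROS")

    def main():
        rospy.init_node("pi_camera_stream")
        pub = rospy.Publisher("/camera/image_raw", Image, queue_size=1)
        rospy.Subscriber("/camera/shutdown", Empty, on_shutdown_msg)
        bridge = CvBridge()
        cap = cv2.VideoCapture(0)  # default V4L2 camera on the Pi
        rate = rospy.Rate(30)      # target 30 fps
        while not rospy.is_shutdown():
            ok, frame = cap.read()
            if ok:
                pub.publish(bridge.cv2_to_imgmsg(frame, encoding="bgr8"))
            rate.sleep()
        cap.release()

    if __name__ == "__main__":
        main()

Launching a node like this on boot can be done with a systemd unit or an rc.local entry.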
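
Button sketch: a rough outline of the interrupt-driven button interface, assuming NVIDIA's Jetson.GPIO library. The pin numbers, flight-mode strings, and topic name are illustrative placeholders.

    #!/usr/bin/env python
    # Sketch of the breadboard interface: a falling edge on a button pin fires
    # a callback that publishes a flight-mode ROS message. Pins, topic name,
    # and mode strings are placeholders.
    import rospy
    import Jetson.GPIO as GPIO
    from std_msgs.msg import String

    BUTTON_PINS = {18: "LOITER", 22: "FOLLOW", 24: "LAND"}  # BOARD pin -> mode

    def main():
        rospy.init_node("flight_mode_buttons")
        pub = rospy.Publisher("/drone/flight_mode", String, queue_size=1)

        def on_press(pin):
            pub.publish(String(data=BUTTON_PINS[pin]))

        GPIO.setmode(GPIO.BOARD)
        for pin in BUTTON_PINS:
            GPIO.setup(pin, GPIO.IN)
            # Debounce so one physical press sends exactly one message.
            GPIO.add_event_detect(pin, GPIO.FALLING, callback=on_press,
                                  bouncetime=300)
        rospy.spin()
        GPIO.cleanup()

    if __name__ == "__main__":
        main()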
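
Calibration sketch: the standard OpenCV chessboard procedure for the intrinsic parameters, along the lines of what we did; the board dimensions and image paths are placeholders for our actual calibration set.

    # Intrinsic calibration from chessboard images with OpenCV.
    import glob
    import cv2
    import numpy as np

    PATTERN = (9, 6)  # inner corners per row/column (placeholder board size)

    # 3D corner positions in the board frame (z = 0 plane, unit squares).
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

    obj_points, img_points = [], []
    for path in glob.glob("calib_images/*.png"):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # K is the 3x3 intrinsic matrix; dist holds lens distortion coefficients.
    _, K, dist, _, _ = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print("intrinsics:\n", K, "\ndistortion:", dist.ravel())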
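
Takeoff sketch: a minimal version of the climb-to-height test, written against a MAVROS-style interface as an assumption; the topic names and the 1.5 m target are illustrative, not a record of our exact script.

    #!/usr/bin/env python
    # Stream position setpoints at a steady rate while listening to pose
    # updates from the flight controller. Topics assume a MAVROS-style setup.
    import rospy
    from geometry_msgs.msg import PoseStamped

    latest_pose = None

    def on_pose(msg):
        global latest_pose
        latest_pose = msg  # pose feedback from the flight controller

    def main():
        rospy.init_node("takeoff_test")
        rospy.Subscriber("/mavros/local_position/pose", PoseStamped, on_pose)
        pub = rospy.Publisher("/mavros/setpoint_position/local", PoseStamped,
                              queue_size=1)
        target = PoseStamped()
        target.pose.position.z = 1.5  # climb to 1.5 m and hold
        rate = rospy.Rate(20)  # offboard-style control needs a steady stream
        while not rospy.is_shutdown():
            target.header.stamp = rospy.Time.now()
            pub.publish(target)
            rate.sleep()

    if __name__ == "__main__":
        main()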

TODOs:

  • We need more buttons and a longer cable connecting the TX1 to our user interface board. We then need to integrate the board with sending ROS messages.
  • Integrate motion planning with the drone.
  • Test and verify 3D target estimation on the real system, not just in simulation.

Team’s Status Report 4/10/21

This week we focused on integrating all the remaining parts for our test flight. We completed calibration and firmware setup for all the sensors, and we got video streaming working. This allowed us to conduct our first test flight, which will serve as our interim demo.

We are on target. Next week, we plan to address some of the issues we saw during the test flight, the biggest being latency while streaming. We also have to improve our communication with the drone via the buttons, whose messages are having some issues streaming over ROS.

Team’s Status Report 3/27/21

This week, our tasks were more integration-focused but still independent. Vedant and Sid integrated target detection and state estimation and demonstrated impressive smoothing of the target location in 2D pixel space. Vedant also focused heavily on debugging software issues with our new camera, while Sid developed CAD parts to attach this camera and other sensors to the drone. Alvin combined Vedant’s and Sid’s target detection and state estimation pipelines into a functional 2D-to-3D target state estimator.
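
A minimal sketch of the 2D-to-3D lifting step under a pinhole camera model: back-project the smoothed pixel through the intrinsics and intersect the ray with a ground plane. The flat-ground depth assumption and the variable names are illustrative, not a record of the exact pipeline.

    import numpy as np

    def pixel_to_world(u, v, K, R, t, ground_z=0.0):
        """Lift pixel (u, v) to a 3D world point.

        K: 3x3 camera intrinsics; R, t: camera-to-world rotation and camera
        center in world coordinates; ground_z: assumed ground-plane height.
        """
        # Back-project the pixel to a unit-depth ray in the camera frame.
        ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
        # Rotate the ray into the world frame; the camera center sits at t.
        ray_world = R @ ray_cam
        # Scale so the ray hits the plane z = ground_z, then walk along it.
        s = (ground_z - t[2]) / ray_world[2]
        return t + s * ray_world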

Our stretch goal was to begin physical testing this week, but we will push this to next week since we have faced unexpectedly long lead times for the 3D printing and connection issues with the camera. Once we can integrate all this hardware with the drone, we will begin physical testing.

Team’s Status Report 3/13/21

We continued to work on our individual tasks this week. We have a completed computer vision algorithm that has been tested (though we still need to run it on the TX1) and a completed state estimation algorithm. We also have the drone simulator working, which lets us test the state estimation algorithm and see how the drone reacts before we physically fly it.

Next week, we plan to test the state estimation algorithm on the test video used for the computer vision. We also plan to focus on the external interfaces, such as designing the camera mount and getting the drone to fly an arbitrary path both in simulation and physically. All these targets will bring us closer to integrating the drone with the computer vision and state estimation algorithms. We also plan to build the button circuitry for the wearable.

Team’s Status Report 2/27/21

At the start of this week (Sunday), we finalized our Gantt chart and schedule. On Monday, we presented our proposal. We got feedback from Professor Savvides that we should use a computer with more cores than the Jetson Nano. Since we already owned a Jetson TX1, which has a 256-core GPU versus the Nano’s 128-core GPU, we decided to switch to the TX1.

After this, we broke up the components of our design and assigned each team member certain parts and components to research for ordering. This breakdown was recorded in the “Design Research” document in the shared folder. After each member had researched the desired components, we compiled all the components we need, along with their providers and costs, in the “Bill of Materials” sheet within the shared folder. We also tested the drone simulation and the flight controller API, made sure the Raspberry Pi could send ROS messages over WiFi, and set up SolidWorks for CAD design.

For next week, we will first order all the components labeled “ASAP” by Tuesday so that we can hopefully receive them from Quinn by the end of the week. We will then set up the TX1 and start programming it, working on color filtering and blob detection using the TX1’s built-in camera and researching methods for target state estimation. We will also create scripts that convey simple motion commands to the drone and test whether they work in simulation, and we will test the bandwidth limitations of streaming video from a Raspberry Pi to a laptop over WiFi. Finally, we will begin the CAD process for the housings for our various parts.
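
As a sketch of the planned color filtering and blob detection, a standard OpenCV approach is below; the HSV thresholds are placeholders we would tune for our target, and the function name is illustrative.

    import cv2
    import numpy as np

    def detect_target(frame_bgr):
        """Return the pixel centroid (u, v) of the largest color blob, or None."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array([100, 120, 70]),   # lower HSV bound
                                np.array([130, 255, 255]))  # upper HSV bound
        # OpenCV 4 returns (contours, hierarchy).
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        blob = max(contours, key=cv2.contourArea)  # largest blob wins
        M = cv2.moments(blob)
        if M["m00"] == 0:
            return None
        return (M["m10"] / M["m00"], M["m01"] / M["m00"])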

Team’s Status Report 2/20/21

This week, we spent a significant amount of time deciding between our final drone idea and a Bluetooth network-based localization system. During our first meeting with Professor Savvides, we asked him for guidance on how to choose between these projects and how he felt about them. We decided to write full abstracts for both ideas to fully consider the available resources and our areas of expertise. After fleshing out both ideas, we were excited to choose the drone idea. From there, we worked on the presentation and started brainstorming about the parts involved and how our choice of parts would impact our budget.