Alvin – 2/27/21

Group:

I worked with my teammates to build a proposal presentation as well as a Gantt chart describing our semester schedule and general task division. We also worked together to decide which materials to buy and which materials we already own.

Personal:

On my own specific task, I finished setting up the AirSim quadrotor simulator and familiarized myself with its API. The API provides both C++ and Python interfaces, but I will stick with Python since computer vision and various trajectory optimization packages use Python. I chose AirSim because it runs accurate nonlinear dynamics under the hood. AirSim can also simulate windy weather, which is important for testing the robustness of our motion planning and controls. Most importantly, AirSim interfaces with the PX4 flight controller API, which is exactly what runs on our real drone. This means that all my motion planning and controls can be developed and tested in simulation, and should run on the drone with little or no modification.
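
The basic client workflow is short. Below is a minimal sketch of flying a square with the AirSim Python API, assuming the `airsim` pip package and a running simulator instance; the wind call (`simSetWind`) is only available on recent AirSim builds, and the waypoints and speeds are arbitrary example values:

```python
# Minimal AirSim multirotor sketch. Assumes the `airsim` Python package and a
# running AirSim instance; `simSetWind` requires a reasonably recent AirSim build.

def fly_square(side_m=5.0, alt_m=3.0, speed=2.0):
    import airsim  # imported lazily so this file loads without AirSim installed

    client = airsim.MultirotorClient()
    client.confirmConnection()
    client.enableApiControl(True)
    client.armDisarm(True)

    # Optional: a steady 5 m/s wind along +x to stress-test the controller
    client.simSetWind(airsim.Vector3r(5.0, 0.0, 0.0))

    client.takeoffAsync().join()
    # AirSim uses NED coordinates: z is negative when above the ground
    for x, y in [(side_m, 0), (side_m, side_m), (0, side_m), (0, 0)]:
        client.moveToPositionAsync(x, y, -alt_m, speed).join()
    client.landAsync().join()
    client.armDisarm(False)
    client.enableApiControl(False)
```

The `.join()` calls block until each motion completes; dropping them lets commands run asynchronously, which is closer to how a planner would issue setpoints.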

Next, I verified that one of our Wi-Fi modules functions properly by connecting it to a Raspberry Pi and accessing the internet. I then double-checked that Wi-Fi can support communication between the ground compute and the drone through ROS. I set up my laptop as the “Master” that maintains ROS’s publisher–subscriber system, and the Raspberry Pi as one node. Because both devices were connected to my house’s Wi-Fi, the nodes can communicate seamlessly as long as they have access to the Master’s local IP address. To demonstrate functionality, I had the Raspberry Pi publish a “Hello World” string and verified that the laptop could receive these messages over the network. This was just a proof of concept; Sid will handle the specifics of the communications code and software, as well as compare bandwidth with other platforms.
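
The publisher side of that proof of concept is only a few lines. A sketch of what ran on the Pi, assuming ROS 1 with `rospy`, `ROS_MASTER_URI` pointed at the laptop's local IP, and an assumed topic name `chatter`:

```python
# Proof-of-concept ROS 1 publisher, run on the Raspberry Pi. Assumes rospy is
# installed, roscore runs on the laptop (the Master), and ROS_MASTER_URI on the
# Pi points at the laptop's local IP. Verify with `rostopic echo /chatter`.

def publish_hello(rate_hz=1):
    import rospy                      # imported lazily; only available with ROS
    from std_msgs.msg import String

    rospy.init_node("pi_talker")
    pub = rospy.Publisher("chatter", String, queue_size=10)
    rate = rospy.Rate(rate_hz)
    while not rospy.is_shutdown():
        pub.publish(String(data="Hello World"))
        rate.sleep()
```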

Next Steps:

Overall, our team’s and my individual progress is on schedule. I’ve already shipped the drone from my house and will receive it within the next few days, so the others can begin prototyping hardware on the drone. Since I can do all my initial testing in AirSim, this upcoming week I will begin building our software pipeline for motion planning by assuming that I am given some predicted future trajectory of 3D positions of the human target. I will develop an initial implementation of trajectory generation and verify that it works in simulation. In the meantime, we will prepare for our design presentation.
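
As a starting point for that pipeline, the simplest mapping from a predicted target trajectory to drone waypoints is a fixed offset: stay some height above the target and some standoff distance behind its direction of motion. A numpy sketch of this placeholder (the offsets are arbitrary assumed values; real trajectory optimization would replace this):

```python
import numpy as np

def follow_waypoints(target_traj, alt_offset=6.0, standoff=3.0):
    """Given a predicted target trajectory (N x 3 array of x, y, z positions),
    return drone waypoints `alt_offset` m above and `standoff` m behind the
    target's direction of motion. A placeholder for real trajectory
    optimization; offsets are assumed example values."""
    target_traj = np.asarray(target_traj, dtype=float)
    # Direction of motion at each step (repeat the last heading for the final point)
    deltas = np.diff(target_traj[:, :2], axis=0)
    deltas = np.vstack([deltas, deltas[-1:]])
    norms = np.linalg.norm(deltas, axis=1, keepdims=True)
    headings = np.divide(deltas, norms, out=np.zeros_like(deltas), where=norms > 0)
    waypoints = target_traj.copy()
    waypoints[:, :2] -= standoff * headings   # stay behind the target
    waypoints[:, 2] += alt_offset             # stay above it
    return waypoints
```

For a target walking along +x, this yields waypoints trailing it by the standoff distance at the offset altitude; each waypoint can then be fed to the simulator's position-command API.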

Vedant’s Status Report 2/27/21

Last Sunday we confirmed our Gantt chart and schedule. Then we divided up the components we planned on using so each of us could research them and we could place an order. I researched the display we would use and how we can integrate the start/stop button with the TX1. For the display, I concluded we should buy a small 5-inch HDMI display from Adafruit.

The display can be powered over micro USB. The TX1 has an HDMI port and a micro USB port that can supply the 500 mA current required by the display. This display is ideal because it is small enough to wear on the arm and requires no custom power management circuitry, since it can be powered through USB. Since we currently plan to place the TX1 in a backpack with the display on the arm, it is important to minimize the wiring between the two components.

Next, I created a simple circuit to connect our start/stop button to the TX1.

The J21 header pin 13 on the TX1 has a pull-down network that completes the circuit when the button is pressed. The 3.3 V is provided by TX1 GPIO pin 1. I initially plan to have this circuitry on a small breadboard next to the display for testing, later transitioning to a through-hole assembly. I plan to get all the components for the button circuitry from the ECE lab.
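
On the software side, reading the button reduces to polling that pin. A sketch assuming NVIDIA's Jetson.GPIO library (the polling interval and function name are my own placeholders):

```python
# Polling the start/stop button on TX1 J21 pin 13. Assumes NVIDIA's Jetson.GPIO
# library; the external pull-down network described above keeps the pin low
# until the button connects it to 3.3 V.

def wait_for_press(pin=13):
    import time
    import Jetson.GPIO as GPIO   # imported lazily; only available on the Jetson

    GPIO.setmode(GPIO.BOARD)     # use the J21 header's physical pin numbering
    GPIO.setup(pin, GPIO.IN)
    try:
        while GPIO.input(pin) == GPIO.LOW:
            time.sleep(0.01)     # crude 10 ms poll interval / debounce
        return True
    finally:
        GPIO.cleanup(pin)
```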

I also researched a JTAG debugger we can use for the TX1, since the kit we have did not come with one.

Next week I hope to start researching color/blob filtering. I want to get a basic filter implemented on the TX1, using the built-in camera to detect a bright orange sticky note. Since we do not yet have the camera we will use on the drone, I am using the TX1’s built-in camera; the processing will be done by the TX1 in any case. I also plan to help my team place the order for some of our critical components, like the camera and display.
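
The core of color/blob filtering is thresholding pixels against a color range and locating the resulting blob. The real pipeline would likely use OpenCV (`cv2.inRange` on an HSV image plus `cv2.findContours`), but the idea can be sketched with numpy alone; the orange RGB bounds below are assumptions to be tuned against the actual sticky note:

```python
import numpy as np

def find_orange_blob(img, lo=(200, 80, 0), hi=(255, 180, 90)):
    """Return the centroid (row, col) of pixels inside an assumed orange RGB
    range, or None if nothing matches. The real pipeline would use OpenCV's
    cv2.inRange and cv2.findContours on an HSV image instead."""
    img = np.asarray(img)
    mask = np.all((img >= lo) & (img <= hi), axis=-1)   # per-pixel color test
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return ys.mean(), xs.mean()                          # blob centroid
```

HSV thresholding is usually preferred over RGB in practice because it is less sensitive to lighting changes, which will matter outdoors under the drone.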

Team’s Status Report 2/27/21

At the start of this week (Sunday), we finalized our Gantt chart and schedule. On Monday, we presented our proposal. We got feedback from Professor Savvides that we should use a computer with more cores than the Jetson Nano. Since we already own a Jetson TX1, which has a 256-core GPU versus the 128-core GPU on the Nano, we decided to switch to the TX1.

After this, we broke up the components of our design and assigned each team member certain parts and components to research for ordering. This breakdown was recorded in the “Design Research” document in the shared folder. Finally, after each member had researched the desired components, we compiled all the components we need, along with their provider and cost, in the “Bill of Materials” sheet within the shared folder. In addition, we tested the drone simulation and flight controller API and made sure the Raspberry Pi could send messages through ROS via Wi-Fi. We also set up SolidWorks for CAD design.

For next week, we will first order all the components labelled “ASAP” by Tuesday, so that we can hopefully receive them from Quinn by the end of the week. In addition, we will set up the TX1 and start programming it: working on color filtering and blob detection using the TX1’s built-in camera, and researching methods for target state estimation. We will also create scripts that convey simple motion commands to the drone and test whether they work in simulation, and test the bandwidth limitations of streaming video from a Raspberry Pi to a laptop over Wi-Fi. Finally, we will begin the CAD process for the housing for our various parts.
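
For the bandwidth test, a back-of-the-envelope estimate sets expectations before measuring. Raw 720p30 video is far beyond typical Wi-Fi throughput, so on-Pi compression will be needed; the compression ratio below is a loose assumption, not a measured figure:

```python
def video_bandwidth_mbps(width=1280, height=720, fps=30,
                         bytes_per_pixel=3, compression_ratio=1):
    """Rough video bandwidth estimate in megabits per second.
    compression_ratio=1 is raw video; something on the order of 50 is a loose
    assumption for H.264 at this resolution, not a measured number."""
    bits_per_sec = width * height * bytes_per_pixel * 8 * fps / compression_ratio
    return bits_per_sec / 1e6

# Raw 720p30 works out to ~663.5 Mbps, which is well beyond typical Wi-Fi
# throughput, so some on-Pi compression will be required.
```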

Siddesh’s Status Report- 2/27/21

Last Sunday, our team finalized the Gantt chart and schedule for the project. On Monday, I presented the proposal. After that, we worked together to divide up which components each of us would focus on researching for ordering. I researched batteries for powering the TX1 and concluded we should buy a 3S LiPo battery along with a charger, a charging bag and an XT-60 male to 2.5 mm barrel connector for connecting the battery to the TX1. In addition, I researched the type of camera we would need, using this calculator to figure out the width and height of the frame in feet given the camera’s focal length and distance to target. I found that a camera with a focal length around 4 mm would give about a 20′ by 30′ frame when 20′ above the target. At a resolution of at least 720p, a back-of-the-envelope calculation shows that this should provide enough pixels on the target for CV detection. In addition, the smaller focal length means that the camera itself is a lot more sturdy. Under these specifications, I found this camera, which is meant to be connected to a Raspberry Pi, has the desired focal length and can provide 720p 60 fps video (we only require 720p 30 fps). Finally, in addition to researching these parts, I studied the datasheet of the TX1. I also downloaded SolidWorks onto my computer and started learning how to use it for CAD design.
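
The calculation behind those numbers is the pinhole camera model: frame size scales as distance × sensor dimension / focal length. A sketch, where the sensor dimensions (a 1/2.5-inch sensor, ≈5.76 × 4.29 mm) and the 1.5 ft target width are my assumptions for illustration:

```python
def frame_size_ft(distance_ft, focal_mm, sensor_w_mm=5.76, sensor_h_mm=4.29):
    """Ground-frame (width, height) in feet via the pinhole model:
    frame = distance * sensor_dim / focal_length. Sensor dimensions assume a
    1/2.5-inch sensor, which is an assumption, not the ordered part's spec."""
    return (distance_ft * sensor_w_mm / focal_mm,
            distance_ft * sensor_h_mm / focal_mm)

def pixels_on_target(frame_w_ft, target_w_ft=1.5, image_w_px=1280):
    """Approximate horizontal pixels covering a target of the given width."""
    return image_w_px * target_w_ft / frame_w_ft
```

With these assumed numbers, a 4 mm lens 20′ above the target gives roughly a 29′ × 21′ frame, consistent with the ~20′ by 30′ figure above, and a 1.5 ft-wide target spans on the order of 60–70 horizontal pixels at 720p.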

I wasn’t able to start testing the TX1 this week since I needed a monitor, keyboard and mouse (which I am acquiring today). Thus, for next week, I aim to fully set up the TX1 and create some sample programs (such as testing out Wi-Fi communication capability). I will also download and configure ROS on the TX1. In addition to this, I will start researching methods for performing target state estimation and begin the CAD process for the housing for the TX1, display and the camera / Raspberry Pi.
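
On the target state estimation research, one standard baseline is a constant-velocity Kalman filter run per axis on the detected target position. A minimal numpy sketch; the frame rate and the noise parameters `q` and `r` are placeholder assumptions, not tuned values, and this is only one candidate method:

```python
import numpy as np

def make_cv_kalman(dt=1/30, q=1.0, r=0.5):
    """Constant-velocity Kalman filter for one axis of the target's position.
    dt: frame period; q: process-noise scale; r: measurement-noise variance.
    All values here are placeholder assumptions, not tuned numbers."""
    F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition for [pos, vel]
    H = np.array([[1.0, 0.0]])               # we only measure position
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    R = np.array([[r]])
    x = np.zeros((2, 1))                     # initial state estimate
    P = np.eye(2) * 10.0                     # initial covariance (uncertain)

    def step(z):
        nonlocal x, P
        # Predict forward one frame
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the position measurement z
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        return x[0, 0], x[1, 0]              # filtered position and velocity

    return step
```

The velocity estimate it produces is exactly the kind of "predicted future trajectory" input that the motion planning side assumes, so the two pieces should connect naturally.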

Siddesh’s Status Report- 2/20/21

I helped to write a new abstract for both our Bluetooth triangulation idea and our newly refined drone idea and looked up what specific technologies we could use for implementation. For the Bluetooth triangulation idea, I helped to identify SOCs that could support direction finding and helped email a PhD student for more information on this. For the drone idea, I helped solidify the requirements for our MVP and decide what the exact deliverables for our project were. After we decided on our final idea, I worked with the team to figure out how to divide the tasks and planned when we wanted to get each task done. I also worked on the presentation with the team. For next week, I will begin familiarizing myself with computer vision techniques for identifying people, and help create the schedule. I will also figure out what parts to order with the rest of the team.

Team’s Status Report 2/20/21

This week, we spent a significant amount of time debating between our final drone idea and a Bluetooth network-based localization system. During our first meeting with Professor Savvides, we asked him for guidance on how to choose between these projects and how he felt about them. We decided to write two entire abstracts to fully consider available resources and areas of expertise. After fleshing out both these ideas, we were excited to choose our drone idea. From here, we worked on the presentation. We also started brainstorming the parts involved and how our choice of parts would impact our budget.

Vedant’s Status Report for 2/20/21

I helped write and discuss the abstracts for both of our ideas (Bluetooth triangulation and drone tracking system) to aid us in mapping out the feasibility of each. For both, I worked with the team to identify the challenges involved and how to solve them (like finding whether there is an existing API for Bluetooth angle of arrival for BLE direction finding). This information helped us understand whether the drone or the Bluetooth localization problem was more feasible. After we picked our final project, I helped brainstorm the goals we can achieve for the MVP and what we can aim for as stretch goals. Finally, I am also working on creating the proposal presentation. Next week, I will start researching the microcontroller and on-ground compute that we can use, work on the schedule, and work with the team to determine how we can mount our extra hardware on the drone.

Alvin’s Status – 02/20/21

I helped contribute to project idea discussions and abstracts for each of our ideas. I helped brainstorm which goals on the robotics/AI side would be feasible for an MVP and which would be high risk but high reward. I also helped build the presentation slide deck. Next week, I will begin looking at open source drone simulators and revisit previously written code for drone trajectory optimization. I’ll work with the team to decide what components to purchase.