Cole’s Status Report for Apr. 12

This week I started making the box that will hold all the components for our robot. I could not find a free CAD model with shelves, so I had to CAD those myself. The idea is that we can organize the components so that separate systems don't interfere with each other.

I also helped start decoupling the machine-soldered pins on the motor driver so that we can make the connections permanent as we finalize the robot.

This coming week I will start to solder the boards that connect the LiPos that will power the electronics and the robot itself. I will also be adding heat shielding where we can to minimize damage to the LiPos.


Kushaan's reflection for 3/31

This week I was focused on setting up the Pi for integration. This involved porting the model to it, setting up all the libraries, getting an idea of mounting, and getting some benchmarks.

I wanted to get some performance benchmarks of our CV model on the Pi. Previously, I found that the latency was about 4 seconds on the initial pass, which was too slow for our uses. However, when running it in a loop, it settles at around 0.8-1 s per frame, which is exactly what we were looking for. Confirming that the slowdown is only a cold-start anomaly is good news, because it means we don't have to switch to a Jetson.
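
For reference, a minimal sketch of how a warm-vs-cold benchmark like this can be structured (the `benchmark` helper and the sleeping stand-in model are hypothetical, not our actual test harness):

```python
import time

def benchmark(run_inference, frame, n_warm=10):
    """Time one cold-start pass, then average n_warm warm passes."""
    start = time.perf_counter()
    run_inference(frame)
    cold = time.perf_counter() - start

    start = time.perf_counter()
    for _ in range(n_warm):
        run_inference(frame)
    warm = (time.perf_counter() - start) / n_warm
    return cold, warm

# Stand-in "model" that just sleeps; on the Pi this would be the real
# inference call on a captured frame.
cold, warm = benchmark(lambda f: time.sleep(0.01), frame=None)
print(f"cold: {cold:.3f}s  warm avg: {warm:.3f}s")
```

Averaging over several warm passes is what separates the one-time setup cost (model load, cache warm-up) from the steady-state per-frame latency we actually care about.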


The libraries were being weird, and I ended up needing to reflash the Pi OS, after which all the image-capture software worked well. I also hooked up some decent debugging tools so we can inspect the (mis)predictions properly.

For mounting, we discovered some issues with the Pi and batteries. In particular, I found that the camera needs to be mounted fairly high up, and the ribbon connector means the Pi will probably sit near the top of the housing.

Team Status Report for Mar 29

This week we started working on integration and testing which angle puts out the fire best.

This upcoming week we have our Interim Demo, where we plan to show each individual aspect of our project working on its own. We are planning to first integrate the Raspberry Pi with our ESP32 signal generator to make sure we can toggle the speaker output.
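
As a rough sketch of what toggling the ESP32 from the Pi could look like (the one-byte serial protocol, baud rate, and port path are all illustrative assumptions, not our finalized interface):

```python
# Hypothetical sketch: toggling the ESP32 signal generator from the Pi
# over a USB serial link. '1' = tone on, '0' = tone off is an assumed
# protocol for illustration only.

def toggle_command(on: bool) -> bytes:
    """Encode the assumed one-byte on/off command."""
    return b"1" if on else b"0"

def set_speaker(port: str, on: bool) -> None:
    # Requires pyserial; imported lazily so the encoder stays dependency-free.
    import serial
    with serial.Serial(port, 115200, timeout=1) as esp:
        esp.write(toggle_command(on))

# On the robot this might be: set_speaker("/dev/ttyUSB0", True)
```

Keeping the command encoding separate from the serial I/O makes the protocol easy to test without hardware attached.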

We are also going to build a mount for the speaker with a locked hinge and test which angle works most effectively.

We will also be taking in any suggestions we get from instructors during the Interim Demo.

Cole’s Status Report for Mar 29

This week I helped set up for the demo. For me personally, this meant making sure the robot works with the Raspberry Pi and that the collimator works with the speaker to put out the largest fire possible.

I also made a CAD model for the chassis of our robot to make it easier to visualize how we will mount the speaker and how we will add the box of other components to the opposite side of the chassis to counterbalance the speaker's weight.

Going forward I will start laser cutting a box to store the components and adding heat shielding to it to make sure the LiPos are safe while we conduct the fire tests.

Steph’s Status Report for March 22nd

This week I helped construct the preliminary collimator. With the collimator, we were able to put out the candle fires successfully. The issue with using this collimator going forward, however, is that it was made of paper and duct tape. This prevented us from moving the apparatus too close to the fire, since the paper could easily catch fire itself.

Moving forward, the idea is to cut the collimator down to around a foot or less. It will be made out of poster board, a flexible form of cardboard. This way it maintains its shape more easily, prevents acoustic power from leaking, and is less susceptible to fire. It will also be wrapped in malleable steel sheets, which I hope to add as soon as they arrive.

Along with the collimator, I will be helping CAD parts of the chassis as well as mounting the speaker onto the robot.

Team Status Report for Mar 22

This week we had a working fire test!!!

Using the preliminary collimator we made out of duct tape and cardstock, we were able to extinguish 3 candles at once. We were a little concerned about this test, as the collimator was made of a flammable material, so it was conducted at a greater distance than we originally planned. On top of that, the collimator itself was longer than we had planned for our final design. This indicates that with our final design being shorter and more fire-safe, we can likely put out significantly larger fires than 3 candles at once.

This coming week we plan to start integrating the basics of our project to prepare for our interim demo. We hope the demo will be us extinguishing fire from the speaker mounted in some non-final form on the robot, which we drive with a keyboard.


Cole’s Status Report for Mar 22

This week I helped complete the robot construction. Using the L298N motor controllers, we used the GPIO pins on the Raspberry Pi to drive the four motors. Our end goal is to have the computer vision model talking to the functions we wrote to go forward, backward, left, and right, but for now we use keyboard inputs to decide which way the robot drives.


This week I also helped construct the preliminary collimator used in our first successful fire test. This was a great success and worked better than we could have hoped.


This coming week I will be measuring the chassis we ordered to get a 3D model so we can start building a housing unit and a mount for the speaker.

Kushaan’s status report for 3/22

This week I was focused on the controls for the robot. We first had to hook up the motor drivers for the chassis kit before we could do any testing. Essentially, the motor drivers give us an abstraction between driving the motors via digital writes and PWM, letting us ignore some of the nasty details. For the software, I created a mapping of GPIO pins to easy-to-understand wheel inputs: 8 in total, forward and backward for each of the four wheels. I experimented with the chassis until I had the mappings and controls down for the basic inputs (F/B/R/L).
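
A simplified sketch of the mapping idea (the pin numbers, wheel names, and one-letter commands here are placeholders for illustration, not our actual wiring):

```python
# Each wheel gets a (forward_pin, backward_pin) pair on the L298N inputs;
# a one-letter command maps to the set of pins driven HIGH.
# Pin numbers below are made up, not our real assignments.

WHEEL_PINS = {
    "front_left":  (17, 18),
    "front_right": (22, 23),
    "rear_left":   (24, 25),
    "rear_right":  (5, 6),
}

COMMANDS = {
    "F": {w: "fwd" for w in WHEEL_PINS},
    "B": {w: "back" for w in WHEEL_PINS},
    "L": {"front_left": "back", "rear_left": "back",
          "front_right": "fwd", "rear_right": "fwd"},
    "R": {"front_left": "fwd", "rear_left": "fwd",
          "front_right": "back", "rear_right": "back"},
}

def pins_high(command: str) -> set[int]:
    """Return the GPIO pins to drive HIGH for a one-letter command."""
    high = set()
    for wheel, direction in COMMANDS[command].items():
        fwd, back = WHEEL_PINS[wheel]
        high.add(fwd if direction == "fwd" else back)
    return high

# On the robot, each set would feed GPIO.output(pin, HIGH) calls via RPi.GPIO.
```

Separating the command-to-pin translation from the actual GPIO writes keeps the wiring details in one place and lets the mapping be tested off the robot.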


The testing software was essentially an RC car: it took keyboard inputs and outputted the high/low signals. I also started writing the logic for centering the bounding boxes and deciding the appropriate thresholds. Next week, I want to focus on creating a unified robot chassis framework for the compute, so I can start running tests using the actual/experimental setup. This should be a big step toward finishing the driving automation.
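
The centering idea can be sketched roughly like this (frame width, deadband threshold, and the `steer` helper are illustrative placeholders, with the threshold still to be tuned experimentally):

```python
# Given a detection bounding box in a frame, decide whether to turn
# left, turn right, or hold course. A "deadband" around the frame
# center avoids oscillating on tiny offsets.

FRAME_WIDTH = 640
DEADBAND_PX = 40  # half-width of the "centered enough" zone (placeholder)

def steer(box: tuple[int, int, int, int]) -> str:
    """box = (x_min, y_min, x_max, y_max); returns 'L', 'R', or 'C'."""
    x_min, _, x_max, _ = box
    center = (x_min + x_max) / 2
    offset = center - FRAME_WIDTH / 2
    if offset < -DEADBAND_PX:
        return "L"   # fire is left of center: turn left
    if offset > DEADBAND_PX:
        return "R"   # fire is right of center: turn right
    return "C"       # within the deadband: hold / drive straight

print(steer((0, 0, 100, 100)))   # box far left -> 'L'
```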

Team Status Report for March 15th

This week was a less hands-on week where we decided to take a step back from closely working on the robot and strategize. Instead of building further on what we have, we wanted to plan and design tests. Since we were unable to blow out a candle with the amplifier and speaker apparatus alone, the plan is to attach the collimator and see how the project goes from there. The collimator will be tested in various forms: first just cardboard (in a multitude of dimensions), then steel mixed with different construction materials besides cardboard.


WE LOVE YOU PROFESSOR KIM

Kushaan’s status report for 3/15

This week, I was primarily focused on the ethics assignment. I came up with some ethical concerns for our project that I hadn't thought of before. Mainly, I realized that the most dangerous parts of our project are the collimator and the circuitry: the circuitry is high power and could shock someone, and the collimator focuses the sound waves in a potentially dangerous manner if you held your head up to it. I looked at methods to indicate this danger and came up with a few ideas, such as the color of the cone (e.g. red for danger) and warning stickers. Visual markers can help keep the most at-risk group (children) from using it improperly.


I also did some research into a controls loop, particularly Python multithreading and thread communication. I made some playground tasks to understand how it works. I did some testing of inference and found it to be a little slower than I had hoped. This led me to believe that I would need to either:

a) multithread (so we don't pin the CPU on inference and can run controls in the meantime),

b) offload inference via network sockets to a laptop (or similar device) with stronger dedicated hardware (this can cut inference time to <80 ms, vs. the 2-3 seconds of the RPi),

c) look into model quantization/parallel frameworks to accelerate on-device, or

d) upgrade the hardware to a Jetson.
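
As a rough sketch of option (a), here is a background inference thread communicating with the main loop through queues (the "model" is a stand-in `time.sleep`, not our actual CV model, and the timings are illustrative):

```python
# Run inference on a background thread so the control loop keeps ticking
# between slow predictions. Queues handle the thread communication.
import queue
import threading
import time

def inference_worker(frames: queue.Queue, results: queue.Queue) -> None:
    while True:
        frame = frames.get()
        if frame is None:            # sentinel: shut down cleanly
            break
        time.sleep(0.05)             # stand-in for the ~1 s model pass
        results.put(f"prediction for {frame}")

frames: queue.Queue = queue.Queue(maxsize=1)
results: queue.Queue = queue.Queue()
worker = threading.Thread(target=inference_worker, args=(frames, results))
worker.start()

frames.put("frame-0")
ticks = 0
while results.empty():               # the control loop runs in the meantime
    ticks += 1                       # e.g. poll sensors, adjust motors
    time.sleep(0.005)

print(results.get())
frames.put(None)                     # sentinel stops the worker
worker.join()
```

Since `time.sleep` (and most real inference backends) release the GIL, the control loop genuinely makes progress while the model runs.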

This week, I will be assisting the rest of the team with integration (powering the electronics and robot via LiPos) and then looking further into sorting out the inference slowdown.