Caroline’s Status Report for 4/29

This week, I helped with the unit testing for our robot and worked on materials for our final presentation and final poster. The testing plan is as described in the Team Status Report; we are mainly observing performance while noting failure cases for possible adjustment. I also laser cut small cylinders to hold the scents for our demo. These cylinders prop up the scented object so that it can be detected by the ultrasonic sensors.

Caroline’s Status Report for 4/22

Over the past weeks, I worked on various improvements to the robot, including modifying and reprinting the design of the top and back panels, integrating two new ultrasonic sensors on the sides of the robot to reduce obstacle collisions, and making various adjustments to the scent detection and confirmation logic to improve detection accuracy.

With regards to the robot exterior, I redesigned the top of the robot to include cutouts for the LCD screen and the newly added NeoPixel ring so that all of the wiring is hidden inside the robot and it looks more finished overall. I also added vents to the back of the robot to help with airflow through the chamber. Lastly, I drilled two holes in the sides of the robot to install the two new side ultrasonic sensors.

The side ultrasonic sensors were added to keep the robot from running into the scented object, which happened often while it was scanning. I initially implemented logic that made the robot reverse slightly if it detected a side object during scanning and then continue its scans from the same position. Getting the robot to reverse while preserving the scan angle required significant effort, and my team helped me debug it. While testing the obstacle detection, we also decided to change the obstacle avoidance logic during random exploration, since the robot sometimes got stuck when facing an obstacle. Now, if the front ultrasonic sensor is triggered, the robot reverses and then continues to a new random coordinate, and if a side ultrasonic sensor is triggered, it rotates 90 degrees in the opposite direction.

Lastly, I changed the scent detection and confirmation logic to take into account readings from the ENS160 TVOC channel in addition to the Grove ethanol channel. The Grove sensor has sensitivity issues where the ethanol value does not change at all unless the scent is very close. The ENS160 TVOC channel is more sensitive to scents at a distance, but is also more prone to random spikes, which risks triggering more false detections. We tested different thresholds and logic for combining the two sensors and will continue to tune them if necessary.
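The combined two-channel detection check can be sketched roughly as follows. This is an illustrative host-side sketch, not our actual Arduino code; the threshold values, function names, and the "require several consecutive TVOC samples" debounce rule are placeholders standing in for the tuned logic.

```python
# Hypothetical sketch of the dual-channel detection logic.
# Threshold values are placeholders, not our tuned constants.
GROVE_ETHANOL_THRESHOLD = 150   # Grove channel: insensitive but stable
ENS160_TVOC_THRESHOLD = 400     # ENS160 TVOC channel: sensitive but spiky

def scent_detected(ethanol_ppm, tvoc_ppb, tvoc_history):
    """Flag a detection when either channel crosses its threshold.

    To guard against the ENS160's random spikes, the TVOC channel only
    counts if the last few samples all exceed the threshold.
    """
    if ethanol_ppm > GROVE_ETHANOL_THRESHOLD:
        return True
    recent = tvoc_history[-3:] + [tvoc_ppb]
    return len(recent) == 4 and all(v > ENS160_TVOC_THRESHOLD for v in recent)
```

The debounce on the TVOC channel trades a little latency for a much lower false-positive rate from single-sample spikes.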

In the upcoming week, my main focus will be assisting with testing the robot and making adjustments according to our observations. We don't anticipate making any more major system changes; the main goal is to improve the reliability and accuracy of the scent detection for our demo.

Team Status Report for 4/22

Coming into the final stretch of our project, we dedicated this week to achieving better obstacle detection and integrating our multiclass classification algorithm. We also introduced a NeoPixel LED ring light on top of the robot with various color patterns for the different modes and scents that ScentBot can identify.

We added two additional ultrasonic sensors, one on each side of the robot, and implemented new obstacle avoidance logic for when the robot detects objects to its side. This has greatly reduced how often the robot runs into the scented object while exploring and scanning: the robot is now able to back up and continue its scan and/or random exploration. ScentBot also now combines multiple sensor values to determine whether a scent has been detected or confirmed, which has improved detection and confirmation performance.
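The exploration-mode avoidance rules described above can be sketched as a simple mapping from ultrasonic readings to motion commands. This is an illustrative sketch under assumed names and units; the distance threshold is a placeholder, not our tuned value.

```python
# Hypothetical sketch of the exploration-mode avoidance rules.
# Distances in cm; the threshold is a placeholder value.
OBSTACLE_CM = 15

def avoidance_action(front_cm, left_cm, right_cm):
    """Map the three ultrasonic readings to one motion command."""
    if front_cm < OBSTACLE_CM:
        # Front blocked: back up, then head to a fresh random waypoint.
        return ("reverse_then_new_waypoint", None)
    if left_cm < OBSTACLE_CM:
        return ("rotate", 90)    # turn away from the left obstacle
    if right_cm < OBSTACLE_CM:
        return ("rotate", -90)   # turn away from the right obstacle
    return ("continue", None)
```

Checking the front sensor first gives head-on obstacles priority over side clearances when several sensors trigger at once.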

We also explored using propane and isobutane sprays for our third scent, as we hypothesized that substances with hydrocarbons would trigger the sensors. Upon testing with our sensor array, we discovered that the concentrations of TVOCs, ethanol, and other hydrocarbons were not high enough to trigger our sensors. We have therefore decided to have our embedded Support Vector Classification (SVC) model distinguish only the following: alcohol, paint thinner, and ambient air. We also integrated the SVC model on the Arduino Mega so that it only classifies a scent after a global threshold has been reached. This decision weighed the tradeoff between false positive rate and sensor sensitivity: it ensures that ScentBot is confident enough over a sampling period that there is a scented object.
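The threshold-gated classification can be sketched like this. The weights, biases, feature layout, and threshold below are hypothetical stand-ins for the exported model coefficients, and the one-vs-rest argmax is just the standard way a linear SVC decides between classes; it is not a copy of our actual embedded code.

```python
# Sketch of threshold-gated SVC classification with made-up coefficients.
CLASSES = ["alcohol", "paint_thinner", "ambient"]
W = [[0.9, 0.1],    # one weight row per class
     [0.2, 0.8],
     [-0.5, -0.5]]
B = [0.0, 0.0, 1.0]
GLOBAL_THRESHOLD = 300  # summed-channel trigger, placeholder value

def classify(features, trigger_value):
    """Run the linear SVC only once the global threshold is exceeded."""
    if trigger_value < GLOBAL_THRESHOLD:
        return "ambient"  # not confident enough to call a scent yet
    scores = [sum(w * x for w, x in zip(row, features)) + b
              for row, b in zip(W, B)]
    return CLASSES[scores.index(max(scores))]
```

Gating the classifier this way means a borderline sensor reading never reaches the model at all, which is how the false-positive/sensitivity tradeoff is enforced.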

We have come up with a test plan and run 20 initial trials with paint thinner and alcohol, introducing 0, 1, or 2 unscented objects per trial to observe ScentBot's performance. Across these trials, the classification was always correct, the average convergence time was around 183 s, and the average first-detection scan distance was 39 cm. We expect these numbers to become more representative of ScentBot's performance as we run more trials, which is our goal before the final demo. We are also fine-tuning the front ultrasonic sensor to prevent ScentBot from running into walls.

Linked is a successful test run, similar to the ones we plan to showcase at our final demonstration.

Caroline’s Status Report for 4/8

During the first half of this week, I helped test and refine our robot in preparation for the interim demo. My team tested many different locations to hold the demo and tuned hyperparameters to improve the robot's ability to track down a scent. After the demo, I switched out the Arduino Uno in our robot for an Arduino Mega. This solved the memory issues we were facing, and we are now able to read from multiple sensors at a time, so we can simultaneously measure ethanol values in addition to CO2 and CO, and get temperature and humidity readings. We tested the new sensor array with both incense smoke and isopropyl alcohol and found that these two scents cause all of the sensor channels to rise. This is problematic because it means that we are unable to differentiate scents just by thresholding individual sensor channels. Because of this, we are switching to an ML model to distinguish between scents. Moving forwards, I will help with improving this model and testing the robot. Once the classification part is integrated, our project focus will shift to improving the existing setup.

In terms of verification and validation, my team has done lots of testing to optimize the robot's hyperparameters and testing setup. We have placed the robot in multiple different environments and observed whether or not it detected and tracked down a scented cotton ball placed in the environment. We discovered that the robot is not able to move on certain uneven surfaces or in rooms with strong air conditioning; in these cases, the robot struggled to move or did not move in the correct direction towards the scent. Moving forwards, we will continue adjusting the robot and testing setup to ensure that it can consistently meet the design requirement of tracking down and correctly identifying a scent. We will keep track of metrics such as the time the robot takes to identify a scent and the scent classification accuracy.

Caroline’s Status Report for 4/1

This week I worked on implementing and testing the “targeted search” algorithm for our robot, where it tries to track down the direction of a scent upon detection. Initially, I used a simple thresholded sensor value to trigger the “targeted search” mode. Once it enters this mode, the robot stops in place, samples the sensor values at different angles of rotation, and then proceeds in the direction at which the maximum sensor value was taken. It confirms the location of a scent once the sensor value exceeds another, higher threshold. With perfect sensors, this algorithm should work in theory, but we have had to strategize and adjust due to noise and inconsistencies in the sensor readings. We tested different ranges of rotation and tried using the slope of a best-fit line to more reliably trigger the search mode and identify the direction of the scented object, but the results are still not very consistent. We will try using a fan to amplify the distribution of the scent in preparation for the interim demo. Additionally, I integrated the ultrasonic sensor into the code, so the robot now stops and turns around if it gets too close to an obstacle. Before the interim demo, we need to build a barrier around the arena to prevent the robot from going out of bounds. We will also start testing the sensor sensitivity to paint thinner so that we can demo the robot detecting multiple scents.
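The two pieces of the targeted search, picking the strongest heading and the best-fit-line trigger, can be sketched as below. This is an illustrative sketch, not the actual sketch code; the angle set and readings are made up, and the slope function is plain least-squares fitting over the recent samples.

```python
# Illustrative sketch of the targeted-search step.
def best_heading(samples):
    """samples: list of (angle_deg, sensor_value) pairs taken while
    rotating in place; return the angle with the strongest reading."""
    return max(samples, key=lambda s: s[1])[0]

def rising_slope(values):
    """Least-squares slope of a window of recent readings. A sustained
    positive slope is a more noise-tolerant trigger than one sample
    crossing a threshold."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den
```

With noisy sensors, `best_heading` can still pick a spurious maximum, which matches the inconsistency we observed; averaging several samples per angle is one possible mitigation.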

Team Status Report for 3/25

This week, we worked on robot construction and integration of our sensor and motion subsystems. In terms of robot assembly, we began soldering connections onto a new protoboard and organizing all of the internal connections. We also glued the robot frame together and were finally successful in securing the motors to the chassis using wood glue, which greatly improved the stability of the wheels and the accuracy of the robot's motion. In terms of integration, we wired together all of our subsystems so that the sensors and motors are connected to and controlled by the same MCU. When we combined everything, we realized that a single 9V battery was not enough to power every component, so we connected an additional 9V battery to directly power the Arduino. We also worked on software integration, combining the motion logic and sensor logic: we can now take sensor readings at a specified sampling frequency while also issuing motor commands. Finally, we added random path planning to determine the course of the robot instead of using predetermined coordinates.
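The random path planning amounts to drawing the next target coordinate inside the arena. A minimal sketch, assuming uniform sampling and placeholder arena dimensions (the real bounds and margin differ):

```python
import random

# Hypothetical sketch of random waypoint selection; arena dimensions
# and wall margin are placeholder values.
ARENA_W, ARENA_H = 200, 200  # cm
MARGIN = 20                  # keep waypoints away from the walls

def next_waypoint(rng=random):
    """Pick the next exploration target uniformly inside the arena."""
    x = rng.uniform(MARGIN, ARENA_W - MARGIN)
    y = rng.uniform(MARGIN, ARENA_H - MARGIN)
    return (x, y)
```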

However, we also identified several issues while integrating our subsystems. The biggest issue is that our sketch takes up too much flash, which causes stability issues and does not leave enough memory for local variables. Because of this, we are unable to establish the wireless connection required to send the sensor data to a local machine, so we will need to upgrade to an Arduino Mega for more flash memory. Until we receive the new parts, we will not be able to continue with system integration, which might set us back by a few days.

The Wi-Fi subsystem is also presenting multiple issues with establishing TCP/IP connections to our web server because of the stability and memory issues. The chip sends raw data strings, so the program needs to add a response status and header metadata for the output to be parsed as a proper response that the machine learning model can retrieve. Doing this across several sensors with multiple data streams is a big risk for the team to mitigate moving forward, and working on integrating these systems, as described above, is going to set us back a few days.

Moving forwards, we will need to define our expectations for the interim demo and work on refining individual subsystems.

Caroline’s Status Report for 3/25

This week, I worked more on robot assembly and assisted with the integration of our different subsystems. I glued the components of the robot together, and the motors are finally secure, which has improved the accuracy of the robot's movement. While working on integration of the sensor and motion systems, I helped test and debug when we ran into problems. We ran a few informal tests with the ENS160 sensor by thresholding the ethanol reading and were successful in getting the robot to stop when alcohol was placed around 6 inches away from the sensor. Initially, our goal with integration was to see if we could let the robot run while sampling and collecting data, but we ran into unexpected issues due to Arduino hardware constraints: our program was too memory intensive to run on the Arduino, so we were unable to transmit the sensor data wirelessly. I ordered an Arduino Mega and another Grove shield, and we will replace our old Arduino Uno as soon as the part arrives. In the meantime, we can still collect data through a wired connection, but our priority is integration and preparing for the interim demo. We may use a simpler thresholding method for scent detection instead of a classification algorithm for the interim demo, so robust dataset collection is not the top priority at this stage. We will need to define as soon as possible what we aim to accomplish before the demo and form our plan for next week accordingly.

Caroline’s Status Report for 3/18

This week, I tested a new method with glue to secure the motors to the robot. This method worked decently during our initial testing, but came undone with more usage. As a more permanent solution, I am looking at options for tube or U-shaped brackets and will place an order this weekend. I also helped test the motor control code that Aditti set up this week. We ran simple unit tests which instruct the robot to navigate to a series of points such as (0, 0) -> (0, 1) -> (1, 1). Due to slight differences between the motors and other external factors, the robot is not always able to reach the target points, so I wrote additional code that helps the robot correct its course if it strays too far off the intended path. I will continue helping to test the robot motion next week. Dataset generation is also my priority for next week. Now that we have the data pipeline set up and all of the necessary materials, I will start on this as soon as possible.
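The course-correction check can be sketched as a heading-error computation: compare the robot's current heading against the bearing to the target and rotate back only when the error exceeds a tolerance. This is an illustrative sketch; the tolerance and function names are assumptions, not the actual code.

```python
import math

# Sketch of the course-correction check; tolerance is a placeholder.
HEADING_TOLERANCE_DEG = 10

def correction(x, y, heading_deg, tx, ty):
    """Return the signed rotation (deg) needed to face (tx, ty),
    or 0 if the robot is already close enough to on-course."""
    desired = math.degrees(math.atan2(ty - y, tx - x))
    # Wrap the error into [-180, 180) so the robot takes the short way round.
    error = (desired - heading_deg + 180) % 360 - 180
    return error if abs(error) > HEADING_TOLERANCE_DEG else 0
```

The wrap-around step matters: without it, a robot at heading 350° aiming at 10° would spin 340° the wrong way instead of turning 20°.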

Team Status Report for 3/18

This week we focused on motor control and data communication between the sensors and the local machine. While working on the code to fine-tune the rotation and translation motion of the robot, we identified several problems and potential risks. Firstly, error from odometry and encoder readings accumulates very quickly, which sometimes causes the robot to overshoot and fail to converge to the target position. To account for this, we plan to reset the encoder readings periodically once the robot reaches a predetermined target coordinate on the global map, and we added a self-correction mechanism in case of overshoot. Another issue is that most of our monitoring and evaluation of the robot coordinates and encoder readings happens over the serial port, which requires a tethered connection that affects the robot's motion. We've also identified a hardware issue: our design requires the motors to be more secure than they currently are, as compensating for hardware limitations in the codebase is not enough.
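The drift-and-reset idea can be illustrated with a standard differential-drive odometry update plus a snap-to-waypoint reset. The wheel geometry constants below are placeholders, not our robot's measurements, and this is a sketch of the technique rather than the firmware itself.

```python
import math

# Illustrative differential-drive odometry with periodic reset.
# Wheel geometry values are placeholders.
TICKS_PER_REV = 360
WHEEL_RADIUS = 3.0   # cm
WHEEL_BASE = 12.0    # cm, distance between the wheels

def update_pose(pose, left_ticks, right_ticks):
    """Integrate one encoder delta into the (x, y, theta) estimate."""
    x, y, theta = pose
    dl = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d = (dl + dr) / 2
    theta += (dr - dl) / WHEEL_BASE
    return (x + d * math.cos(theta), y + d * math.sin(theta), theta)

def reset_at_waypoint(pose, waypoint):
    """Snap the estimate to the known waypoint so integration error
    does not keep accumulating across the whole run."""
    _, _, theta = pose
    return (waypoint[0], waypoint[1], theta)
```

Because each `update_pose` call compounds encoder noise, resetting at known waypoints bounds the error to roughly one leg of the path instead of the whole trajectory.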

We decided to move away from using the cloud and instead process the sensor data on a local server. This is primarily due to the incompatibility of our Wi-Fi chip with the Azure cloud. In our experiments, we found that serving the data from localhost does not add significant overhead.

This week we also set up the pipeline for recording sensor data over serial communication to a CSV. We also acquired 99% ethyl alcohol and spray bottles, so we now have all of the setup we need to start dataset generation, and we will begin collecting data as soon as possible. At minimum, we need baseline readings for one scent, such as alcohol, so that we can start testing the robot's ability to actually track down a scent.
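The recording pipeline can be sketched as below. The line format, field names, and the assumption that the Arduino prints one comma-separated sample per line are placeholders for illustration; `record` expects an already-open pyserial port object.

```python
import csv

# Sketch of the serial-to-CSV recording pipeline.
# Field names and line format are hypothetical.
FIELDS = ["ethanol", "tvoc", "eco2", "temp", "humidity"]

def parse_reading(line):
    """Parse one serial line like '120,340,450,22.5,40.1' into a dict,
    or return None for malformed/partial lines."""
    parts = line.strip().split(",")
    if len(parts) != len(FIELDS):
        return None
    try:
        return dict(zip(FIELDS, (float(p) for p in parts)))
    except ValueError:
        return None

def record(serial_port, out_path, n_samples):
    """Stream n_samples readings from an open pyserial port to a CSV."""
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        written = 0
        while written < n_samples:
            raw = serial_port.readline().decode("ascii", "ignore")
            reading = parse_reading(raw)
            if reading is not None:
                writer.writerow(reading)
                written += 1
```

Dropping malformed lines instead of crashing matters here, since serial reads often start mid-line when the logger attaches to an already-running sketch.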

Caroline’s Status Report for 3/11

Last week, I worked on testing the motors when assembled in the robot chassis. After installing new fasteners for the motors, the range of motion of the motors was reduced and the wheels were able to support the robot's weight. However, there was still wiggle room, which allowed the wheels to become a bit unstable; because of this, the robot moved somewhat unpredictably when directed to move in a straight line. When testing, the robot often curved slightly to the left, which we hypothesize is due to a small miscalibration of the motors. Next week, I will try securing the motors with a stronger fastener and test the motion again to determine if further tuning is necessary. I will also help work on the sensor system assembly. One key concern is how sensitive the sensors will be when installed inside our robot, and determining the maximum distance at which scents can be detected with our testing setup. After more experimentation with sensor sensitivity next week, I will determine if any modifications need to be made to the robot to help increase the detection range.