Team’s Status Report for 4/1

This week, our focus was preparing for our interim demos next week. Based on our meetings, we implemented a gradient estimate by fitting a least-squares line to 10 samples taken each second. We then hardcoded a threshold on this slope that triggers the robot's scanning mode to detect scents. Trying this with the ENS160 sensor gave unpredictable results because its readings are very sensitive and inconsistent. We have instead switched to the Grove multichannel sensor's TVOC value to detect ethanol. This proved more consistent, and we also lengthened the scan taken once a scent is detected to account for the Grove sensor's weaker response. Our experiments show that a lag still exists between encountering a scent and the sensor detecting it. Because the robot picks up the scent several seconds after passing it, the angles it calculates while scanning can lead it to turn the wrong way. We have several strategies to mitigate this risk and work around the sensors' inconsistent nature.
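The slope-and-threshold logic above can be sketched as plain C++ (the window size and threshold value here are placeholders for illustration, not our tuned numbers):

```cpp
#include <cstddef>

// Least-squares slope of the last n sensor readings taken at a fixed rate.
// With sample indices x = 0, 1, ..., n-1 the fitted slope reduces to
//   m = (n * sum(x*y) - sum(x) * sum(y)) / (n * sum(x^2) - sum(x)^2).
float slopeOfWindow(const float *y, size_t n) {
    float sx = 0, sy = 0, sxy = 0, sxx = 0;
    for (size_t i = 0; i < n; ++i) {
        sx  += i;
        sy  += y[i];
        sxy += i * y[i];
        sxx += static_cast<float>(i) * i;
    }
    return (n * sxy - sx * sy) / (n * sxx - sx * sx);
}

// Placeholder threshold: a rising TVOC slope above this triggers scan mode.
const float SCENT_SLOPE_THRESHOLD = 5.0f;

bool shouldEnterScanMode(const float *window, size_t n) {
    return slopeOfWindow(window, n) > SCENT_SLOPE_THRESHOLD;
}
```

On the Arduino, the window would be a circular buffer refilled in `loop()`; it is shown here as a plain array so the math is easy to check.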

Having a consistent airflow behind the source helped the robot find the object. We also tried using air pumps to push air from the object directly over the sensor, but this showed no improvement in performance. We also discovered that the wheels getting stuck was due more to wheel speed than to surface friction. Increasing the speed has fixed the issue for now, but we are also monitoring the overall power draw of the robot and its motors during random exploration.

We integrated our code with the ultrasonic sensor, and the robot now comes to a hard stop and re-orients itself so it does not run into obstacles or the walls of our test arena. We are also meeting to refine the exact pitch and scenario we want to present during our interim demo. Currently, the entire system runs locally off a single Arduino sketch to detect ethanol-based scents. In most cases, given the correct airflow, it can begin scan mode near the object's location. As mentioned earlier, the direction in which it localizes the scent depends on the airflow and the timing of the sensors. With this and more fine-tuning of our scent-confirmed threshold, we hope to demonstrate this functionality during the interim demo.
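The stop-and-re-orient behavior is roughly the following decision (a sketch assuming a centimeter-range ultrasonic reading; the 15 cm threshold and action names are illustrative, not our actual values):

```cpp
// Hypothetical obstacle-avoidance decision. The real sketch runs this in
// loop() against the ultrasonic reading; here it is a pure function so the
// logic is easy to test in isolation.
enum class DriveAction { Explore, HardStop, TurnAway };

const float STOP_DISTANCE_CM = 15.0f;  // placeholder threshold

DriveAction nextAction(float distanceCm, bool currentlyStopped) {
    if (distanceCm < STOP_DISTANCE_CM) {
        // Too close to a wall or obstacle: stop first, then re-orient.
        return currentlyStopped ? DriveAction::TurnAway : DriveAction::HardStop;
    }
    return DriveAction::Explore;
}
```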

Communication with our classification model has proved challenging. We decided to add a NodeMCU: the Arduino sends sensor data to the NodeMCU, which would then pass it to the classification model and return the result. I2C has proved impossible to implement for this, as the NodeMCU cannot receive data from the slave Arduino and update over Wi-Fi at the same time. An alternative we considered was hosting the classification model on the NodeMCU itself and having the two boards communicate over I2C or serial, since they are physically tethered. However, I2C is too slow for the high-speed control flow of the robot. Serial communication is the other alternative we explored; although it is faster, we are facing issues sending an array of float data and receiving all the updated values on the NodeMCU. Looking past the interim demo, this is the biggest risk in our project, and we are actively working to mitigate it and devise alternatives.
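One direction we may try for the float-array problem is framing each reading as a single newline-terminated text line so the receiver can tell a complete frame from a partial one. This is a sketch of that idea in plain C++ (the frame layout is an assumption we are evaluating, not our final protocol):

```cpp
#include <cstdio>
#include <cstdlib>
#include <string>
#include <vector>

// Frame layout (hypothetical): "<count>,<v0>,<v1>,...,<vcount-1>\n".
// The leading count lets the NodeMCU verify it received every value
// before handing the array to the classification model.

std::string encodeFrame(const std::vector<float> &values) {
    std::string out = std::to_string(values.size());
    char buf[32];
    for (float v : values) {
        std::snprintf(buf, sizeof(buf), ",%.3f", v);
        out += buf;
    }
    out += '\n';
    return out;
}

// Returns true and fills `values` only for a complete, consistent frame;
// a truncated line is rejected rather than yielding a short or stale array.
bool decodeFrame(const std::string &line, std::vector<float> &values) {
    if (line.empty() || line.back() != '\n') return false;
    const char *p = line.c_str();
    char *end = nullptr;
    long n = std::strtol(p, &end, 10);
    if (end == p || n < 0) return false;
    values.clear();
    for (long i = 0; i < n; ++i) {
        if (*end != ',') return false;
        p = end + 1;
        float v = std::strtof(p, &end);
        if (end == p) return false;
        values.push_back(v);
    }
    return *end == '\n';
}
```

On the boards themselves, the Arduino would print the frame with `Serial.print` and the NodeMCU would buffer bytes until it sees `'\n'`; the text format trades some speed for being easy to debug over a serial monitor.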
