Aditti’s Status Report for 4/29

This week focused on testing and making final tuning adjustments to constants. I also worked on the project’s documentation, including the final report and poster, before the demo. We now have a concrete testing plan thanks to Eshita and will stick to it for the rest of the week. Apart from some testing in the earlier part of the week, I had a slow week as I was sick.

Aditti’s Status Report for 4/22

Aditti’s Status Report for 4/08

This week focused on testing and verification with one scent for the interim demo in the first half, and on making progress toward accommodating multiple scents in the second half. During the first half of the week, I worked on testing our setup on multiple surfaces to ensure that the motors could carry the load at the desired speeds without locking up, and in rooms with different ventilation to make sure that airflow did not interfere too much with our sensor readings. Once we found a suitable place, we focused on tuning the constants to detect alcohol reasonably well. Unfortunately, we later realized that the sensitivity of our sensors decreases as the ambient temperature increases, which caused some issues with testing. After the demo, I worked on setting up the field for our robot to operate in: a 2m x 2m space walled with cardboard. I also set up the code to read from all of the sensors after transitioning to the Arduino Mega board. Caroline and I also did some preliminary testing with smoke from incense and realized that most scents cause all sensor readings to go up, making it difficult to differentiate between scents with simple data processing. We therefore worked on dataset generation to train and deploy a neural network on the Mega using TinyML. I helped collect data for paint thinner and isopropyl alcohol, and will be working on training the model over the weekend. Using machine learning should also help us account for the differences due to temperature variations.
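As a rough illustration of the dataset-generation step, below is a minimal Arduino-style sketch of how labelled samples could be logged over serial from the Mega at our 10Hz sampling rate. The sensor-read helpers, pins, and label string are placeholders for whatever driver calls and values our actual code uses, not the code in our repository.

// Hypothetical data-collection sketch for building a labelled scent dataset on the Mega.
// The read helpers below are stand-ins for the real ENS160 / Grove VOC driver calls.
const char SCENT_LABEL[] = "isopropyl_alcohol";  // placeholder label for the substance being sampled
const unsigned long SAMPLE_PERIOD_MS = 100;      // 10 Hz, matching the sensor update rate
unsigned long lastSample = 0;

// Placeholder reads: substitute the real driver calls for each sensor here.
int readSensorA() { return analogRead(A0); }
int readSensorB() { return analogRead(A1); }
int readSensorC() { return analogRead(A2); }

void setup() {
  Serial.begin(115200);
  Serial.println("sensor_a,sensor_b,sensor_c,label");  // CSV header
}

void loop() {
  unsigned long now = millis();
  if (now - lastSample >= SAMPLE_PERIOD_MS) {
    lastSample = now;
    // One CSV row per sample; rows are captured over serial on a laptop to build the training set.
    Serial.print(readSensorA()); Serial.print(',');
    Serial.print(readSensorB()); Serial.print(',');
    Serial.print(readSensorC()); Serial.print(',');
    Serial.println(SCENT_LABEL);
  }
}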

So far, we have done comprehensive testing for one scent (alcohol) using our slope-based detection approach in different environments and noted the factors that influence our readings: airflow due to ventilation, surface texture, airflow due to movement around the test field, temperature, and sensor warm-up time. We can also do basic obstacle avoidance based on ultrasonic sensor measurements. Moving forward, we will integrate machine-learning-based approaches to see if we can expand the project to meet our use-case requirement of differentiating between multiple scents with good accuracy. We will stick to our new field setup and run tests in both TechSpark and the UC gym to prepare for the final demo. We are currently meeting our budget and accessibility requirements. In the coming week, I will work on collecting more data for training and preprocessing it to generate quality feature vectors. I will also work on deploying the model on the Arduino and integrating it with our motor control code.
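For reference, here is a minimal sketch of the kind of ultrasonic distance check behind the basic obstacle avoidance mentioned above. The trig/echo wiring, pin numbers, and stop threshold are assumptions rather than our exact setup.

const int TRIG_PIN = 7;                      // assumed wiring: trigger pin of the ultrasonic module
const int ECHO_PIN = 8;                      // assumed wiring: echo pin of the ultrasonic module
const float OBSTACLE_THRESHOLD_CM = 20.0f;   // assumed stop distance

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

// Measure distance in cm with a standard trig/echo pulse.
float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  // pulseIn returns the echo pulse width in microseconds (0 on timeout).
  unsigned long duration = pulseIn(ECHO_PIN, HIGH, 30000UL);
  return duration * 0.0343f / 2.0f;           // speed of sound ~343 m/s, round trip
}

void loop() {
  float d = readDistanceCm();
  if (d > 0 && d < OBSTACLE_THRESHOLD_CM) {
    // In the actual robot this would trigger a stop or turn in the motion state machine.
    Serial.println("Obstacle detected");
  }
  delay(50);
}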

Aditti’s Status Report for 4/01

This week was focused on integration and testing, and on implementing the code for navigation toward a scent source. Caroline set up the main logic for scanning in place upon detection of a scent (decided by a threshold), and I helped debug the code. I later set up the code so that detection is based on gradients, using an increasing slope of the best-fit line over a one-second window. We tested the code using the ENS160 TVOC readings for the alcohol scent, then switched over to the Grove VOC reading instead as it was giving us more reliable results. We also concluded that including a pump to suction air into the car was not a feasible fix for our sensors’ limited sensitivity and pick-up latency; we will need a fan behind the source to blow the evaporating particles toward the sensors so that the scent reaches them in time. As we still don’t have the Arduino Mega, we cannot read from multiple sensors at once due to memory constraints, and so we cannot combine sensor readings to predict our results as we originally intended. We fixed the robot motor issues with some speed control and tuning, which now causes the robot to go faster than we wanted but with less skidding. I also changed the random exploration to compute the next set of coordinates in polar rather than Cartesian form. Progress is slower than I wanted, but given the constraints of not having all the parts and the interim demo coming up, we are trying to prepare as best we can for the upcoming week. Next week, I hope to switch over to the Arduino Mega, read from all three sensors, and test the code with multiple different substances to ensure that we can differentiate between them.
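A minimal sketch of the slope-based detection idea described above: keep the last second of VOC samples and fit a least-squares line, flagging a detection when the slope is sufficiently positive. The window size, threshold, and readVoc() stand-in are illustrative, not the exact values in our code.

const int WINDOW = 10;                 // one second of samples at 10 Hz
float window[WINDOW];
int windowCount = 0;
int windowIndex = 0;
const float SLOPE_THRESHOLD = 5.0f;    // placeholder detection threshold

int readVoc() { return analogRead(A0); }   // stand-in for the Grove VOC driver call

void addSample(float v) {
  window[windowIndex] = v;
  windowIndex = (windowIndex + 1) % WINDOW;
  if (windowCount < WINDOW) windowCount++;
}

// Least-squares slope of the readings vs. sample index (reading change per sample).
float windowSlope() {
  if (windowCount < WINDOW) return 0.0f;
  float sumX = 0, sumY = 0, sumXY = 0, sumXX = 0;
  for (int i = 0; i < WINDOW; i++) {
    // Walk the circular buffer from oldest to newest so x increases with time.
    float y = window[(windowIndex + i) % WINDOW];
    sumX += i; sumY += y; sumXY += i * y; sumXX += (float)i * i;
  }
  float denom = WINDOW * sumXX - sumX * sumX;
  return (WINDOW * sumXY - sumX * sumY) / denom;
}

void setup() { Serial.begin(9600); }

void loop() {
  addSample(readVoc());
  if (windowSlope() > SLOPE_THRESHOLD) {
    Serial.println("Rising VOC trend - possible scent");   // robot would stop and scan here
  }
  delay(100);                          // 10 Hz sampling
}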

Aditti’s Status Report for 3/25

This week I worked on the random exploration code for the robot and integrated the sensor reading logic with the motion code. The code is now set up as two state machines that update and sample at different frequencies without interfering with each other’s timing and logic: the sensor updates run at 10Hz while the encoder updates run every 50ms. The robot now randomly computes a coordinate within 15cm of the current position and travels to it until it detects a TVOC reading above a certain threshold, after which it stops. The robot resumes traversing the map randomly once the readings fall below the threshold. During this process, I identified several issues. First, there was a bizarre issue with our encoders: every now and then, the count increment on the encoder would jump from a simple double-digit value to a value in the thousands, which should never occur. We identified and ruled out several potential causes, such as issues with the update logic and integer overflows. We still don’t know the exact cause, but it is likely a hardware issue, and we have a band-aid fix that stops the motors and ignores the previous iteration when this happens. We lose 50ms worth of data, but it seems to work. Another major issue is that our program with Wi-Fi communication takes up too much memory to run smoothly. We will need to switch to an Arduino Mega with the Grove shield to have enough memory. Finally, our robot seems to struggle with rotations every now and then on the boards that we got, but works well on smoother surfaces; we are looking into additional ball casters that may help with this. Due to these setbacks, I will be slightly behind schedule, but once we have the new parts it should help get us back on track. Next week, I will prepare for the interim demo and continue to work on navigation toward a scent source.
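As an illustration of the random-waypoint step, here is a small sketch that picks a point within 15cm of the current position by sampling a heading and a radius (the polar form adopted later, as described in the 4/01 report). The pose representation and variable names are assumptions.

const float MAX_STEP_CM = 15.0f;

struct Pose { float x; float y; };   // robot position in cm, field frame

Pose currentPose = {0.0f, 0.0f};
Pose targetPose  = {0.0f, 0.0f};

void pickRandomWaypoint() {
  // random(0, 360) gives an integer angle in degrees; convert to radians.
  float theta = random(0, 360) * PI / 180.0f;
  // Radius between 5 and 15 cm so the robot always makes some progress.
  float r = random(50, 151) / 10.0f;
  targetPose.x = currentPose.x + r * cos(theta);
  targetPose.y = currentPose.y + r * sin(theta);
}

void setup() {
  randomSeed(analogRead(A5));          // seed from a floating analog pin
  Serial.begin(9600);
}

void loop() {
  pickRandomWaypoint();                // the motion state machine would then drive to targetPose
  Serial.print(targetPose.x); Serial.print(", "); Serial.println(targetPose.y);
  delay(1000);
}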

Aditti’s Status Report for 3/18

This week I worked on motor tests for the robot. I worked on getting the robot to travel a fixed distance straight ahead, as well as to rotate to a specific angle. I was able to get the robot to repeatedly travel to within 2cm of the required coordinate and within 3 degrees of the required angle. I then tested the robot motion for a series of target coordinates to see how the error accumulates. During this process, I realized that the speeds for translation and rotation need to be tuned independently to ensure that there is no overshoot in either of them; a higher rotation speed and a lower translation speed were required. I also noticed that mid-range PWM signals do not generate enough power for the robot to move, which makes it difficult to implement PID control since the actual robot speed does not scale proportionally with the PWM signal. After tuning the error margins, I was able to get the robot to automatically compute rotation angles from a series of coordinate inputs and follow them; however, the accumulated error is so high by the end of the fourth coordinate that the robot cannot converge to the target position. Having a tethered connection while testing to monitor the coordinates also skews the robot’s performance due to the additional forces on the robot. The robot is also completely unable to move on rough and textured surfaces such as carpet floors. I will continue to work on this next week and make the coordinate generation code randomized.
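For context, here is a small sketch of the geometry used to turn a target coordinate into a rotate-then-drive command; the pose bookkeeping and the actual drive primitives are assumed to live elsewhere in the motor control code, and the example target is arbitrary.

struct Pose { float x; float y; float thetaRad; };  // position in cm, heading in radians

// Wrap an angle into (-PI, PI] so the robot always takes the shorter rotation.
float wrapAngle(float a) {
  while (a > PI)   a -= 2.0f * PI;
  while (a <= -PI) a += 2.0f * PI;
  return a;
}

// Given the current pose and a target point, compute how far to rotate (radians)
// and how far to drive (cm).
void planMove(const Pose &pose, float tx, float ty, float &rotate, float &drive) {
  float dx = tx - pose.x;
  float dy = ty - pose.y;
  float targetHeading = atan2(dy, dx);
  rotate = wrapAngle(targetHeading - pose.thetaRad);
  drive = sqrt(dx * dx + dy * dy);
}

void setup() {
  Serial.begin(9600);
  Pose pose = {0.0f, 0.0f, 0.0f};
  float rotate, drive;
  planMove(pose, 10.0f, 10.0f, rotate, drive);   // example target at (10, 10) cm
  Serial.print("rotate (rad): "); Serial.println(rotate);
  Serial.print("drive (cm): ");   Serial.println(drive);
}

void loop() {}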

Aditti’s Status Report for 3/11

For the last week, I worked on getting the motors mounted on the robot and getting the robot to drive a fixed distance in a straight line. Currently, for each motor we interrupt on a hardware interrupt pin and read from a digital pin to get the hall sensor readings, and use these to update the encoder ticks. The distance travelled by the left and right wheels is then computed in cm using the gear ratios and robot geometry, and the robot continues forward until it reaches the target distance, without PID position or speed control. From initial testing, we found that our wheels and motors were not secured well enough to ensure repeatable motion, although the traversed arc length was always within 2-4cm of the target distance. We tried to secure the motors better using laser-cut brackets to hold them in place. This helped, but we will still need a better way to keep them fixed; we will glue down or screw in the motors so that there is no shakiness. Once this is done, we will assess whether we need to implement PID control or whether the robot’s performance is good enough for our purposes. I also helped get the sensors set up for reading and calibration, and spent a considerable amount of time working on the report last week. Next week, I will focus on securing the motors, working on the tuning code, and implementing PID control if needed.
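A simplified sketch of the encoder bookkeeping for one motor, roughly following the description above: interrupt on one hall sensor channel, read the other channel to get direction, and convert ticks to distance using the gear ratio and wheel size. Pin numbers and the tick, gear, and wheel constants are placeholders rather than our measured values.

const int ENC_A_PIN = 2;                  // hardware interrupt pin (assumed)
const int ENC_B_PIN = 4;                  // plain digital pin (assumed)
const float TICKS_PER_MOTOR_REV = 12.0f;  // encoder counts per motor shaft revolution (placeholder)
const float GEAR_RATIO = 30.0f;           // motor revolutions per wheel revolution (placeholder)
const float WHEEL_DIAMETER_CM = 6.5f;     // placeholder

volatile long encoderTicks = 0;

void onEncoderTick() {
  // Channel B level at the moment of the A edge tells us the direction of rotation.
  if (digitalRead(ENC_B_PIN) == HIGH) encoderTicks++;
  else encoderTicks--;
}

float distanceTravelledCm() {
  noInterrupts();                          // copy the volatile counter atomically
  long ticks = encoderTicks;
  interrupts();
  float wheelRevs = ticks / (TICKS_PER_MOTOR_REV * GEAR_RATIO);
  return wheelRevs * PI * WHEEL_DIAMETER_CM;
}

void setup() {
  pinMode(ENC_A_PIN, INPUT_PULLUP);
  pinMode(ENC_B_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(ENC_A_PIN), onEncoderTick, RISING);
  Serial.begin(9600);
}

void loop() {
  Serial.println(distanceTravelledCm());
  delay(200);
}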

Aditti’s Status Report for 2/25

This week I worked on assembling the robot and on motor control. Caroline and I laser cut all the pieces of the robot chassis, which took a few iterations. The first iteration of the top piece was cut from 6mm wood, which proved too thick to achieve the bending effect we were going for, so we recut the top piece from 3mm wood and adjusted the other pieces accordingly. We also realized our ball caster is too big, causing the robot to tilt backward, so we will have to get a new one of the appropriate size. I worked on interfacing the L298N with the DC motors and controlling them through the Arduino. I initially struggled with getting the encoders to read correctly using hardware interrupts but eventually figured out the timing and update rules. I also implemented PID position control for one motor and will work on extending this to the robot’s position next week. We still need to test the current code with the motors mounted on the robot to ensure that we can achieve straight-line motion, which I plan to do next week. Additionally, I worked on preparing for the design review presentation and planned out the connections and interfaces for the design report.
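As a sketch of the single-motor position control, below is roughly how a PID loop can drive one channel of an L298N from the Arduino, with the sign of the output choosing direction via IN1/IN2 and the magnitude setting the PWM duty cycle on the enable pin. Pin assignments and gains are placeholders, and the encoder interrupt routine is omitted here.

const int EN_PIN  = 9;    // PWM-capable enable pin on the L298N (assumed wiring)
const int IN1_PIN = 7;    // direction pins (assumed wiring)
const int IN2_PIN = 8;

volatile long encoderTicks = 0;          // updated by the encoder ISR (not shown)
long targetTicks = 500;                  // example position setpoint in encoder ticks

// Placeholder gains; the real values come out of tuning.
float kp = 0.8f, ki = 0.0f, kd = 0.1f;
float integral = 0.0f;
long lastError = 0;
unsigned long lastUpdate = 0;

void setup() {
  pinMode(EN_PIN, OUTPUT);
  pinMode(IN1_PIN, OUTPUT);
  pinMode(IN2_PIN, OUTPUT);
}

void loop() {
  unsigned long now = millis();
  if (now - lastUpdate < 50) return;     // 50 ms control period
  float dt = (now - lastUpdate) / 1000.0f;
  lastUpdate = now;

  noInterrupts();                        // copy the volatile counter atomically
  long ticks = encoderTicks;
  interrupts();

  long error = targetTicks - ticks;
  integral += error * dt;
  float derivative = (error - lastError) / dt;
  lastError = error;

  float output = kp * error + ki * integral + kd * derivative;

  // Sign of the output picks the direction; clamped magnitude picks the duty cycle.
  digitalWrite(IN1_PIN, output >= 0 ? HIGH : LOW);
  digitalWrite(IN2_PIN, output >= 0 ? LOW : HIGH);
  int pwm = constrain((int)fabs(output), 0, 255);
  analogWrite(EN_PIN, pwm);
}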

Next week I will continue to work on motor control and odometry and plan to get the logic working for random exploration on the field. My progress is currently on track with the schedule. I will also be working on the design report over the next few days. 

The updated test code and CAD files can be found in the GitHub repository: https://github.com/aditti-ramsisaria/ece-capstone

Aditti’s Status Report for 2/18

This week I focused on finalizing the robot CAD. I worked with Caroline to develop an effective layout for our parts on the robot and did the SolidWorks model for the base plate. I also researched controller strategies such as pure pursuit and PID control for robot motion and finalized some algorithms: down-sampling and wavefront segmentation for vision, A* with visibility graphs for global path planning, and Runge-Kutta and PID control for motion. Some of the courses that cover these topics include 18-370 (Fundamentals of Controls), 16-311 (Intro to Robotics), 16-385 (Computer Vision), and 18-794 (Pattern Recognition Theory).

I also started looking into alternatives to multi-threading on the Arduino, since we discovered that the robot will need to sense and move at the same time to make good use of time. So far, I have found approaches that use millis()-based timing instead of delay(), along with OOP, to build a state machine on the Arduino. Additionally, I worked on preparing the slides for the design review and preparing to present next week. We plan on laser cutting the robot chassis tomorrow (Sunday 02/19) and assembling the robot as the parts come in. Next week I plan to start testing the motors and calibrating them so that the robot can move along a fixed path. This will involve implementing the PID controller and odometry as well. Since we will be testing the alternative approach of random motion before path planning, I’ve decided to prioritize odometry over path planning for the moment.
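A rough sketch of the millis()-based idea, assuming a simple task class that tracks its own update period so that sensing and motion can interleave without delay() blocking the loop. The class name and the periods shown are illustrative.

class PeriodicTask {
  public:
    PeriodicTask(unsigned long periodMs) : period(periodMs), lastRun(0) {}
    // Returns true when the task's period has elapsed and it should run again.
    bool due(unsigned long now) {
      if (now - lastRun >= period) { lastRun = now; return true; }
      return false;
    }
  private:
    unsigned long period;
    unsigned long lastRun;
};

PeriodicTask sensorTask(100);   // e.g. 10 Hz sensor sampling
PeriodicTask motionTask(50);    // e.g. 50 ms motion/encoder updates

void setup() { Serial.begin(9600); }

void loop() {
  unsigned long now = millis();
  if (sensorTask.due(now)) {
    // read gas sensors, update detection state
  }
  if (motionTask.due(now)) {
    // update encoders, run the motor control state machine
  }
}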

Aditti’s Status Report for 2/11

This past week I focused on researching different sensor modules and robot chassis for our use-case scents and shortlisted parts for ordering. I also worked on the CV pipeline for finding centroids of various objects on a map by thresholding and segmenting. The model takes in a raw camera image, maps it to a 4×4 grid, and finds the centroid coordinates of each object on the map. The code still needs to be tested with the images from the actual camera that we will end up using. Since we are behind on ordering parts, I think robot assembly + CAD will have to be completed within the next week, which is why I decided to work on the segmentation section this week. By next week I aim to finalize the robot CAD and assemble the robot and also start thinking about path planning logic.
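As a hypothetical desktop OpenCV (C++) sketch of the thresholding-and-centroid step described above: binarize the image, find contours, take each contour's centroid from image moments, and map it into a 4×4 grid cell. The file name, threshold value, and color handling are illustrative and will change once we test with the actual camera.

#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    cv::Mat image = cv::imread("map.jpg");          // placeholder input image
    if (image.empty()) return 1;

    cv::Mat gray, binary;
    cv::cvtColor(image, gray, cv::COLOR_BGR2GRAY);
    cv::threshold(gray, binary, 128, 255, cv::THRESH_BINARY_INV);   // assumes objects darker than background

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(binary, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    const int GRID = 4;                              // 4x4 grid over the field image
    for (const auto &c : contours) {
        cv::Moments m = cv::moments(c);
        if (m.m00 == 0) continue;                    // skip degenerate contours
        double cx = m.m10 / m.m00;                   // centroid in pixel coordinates
        double cy = m.m01 / m.m00;
        int col = static_cast<int>(cx * GRID / image.cols);
        int row = static_cast<int>(cy * GRID / image.rows);
        std::cout << "centroid (" << cx << ", " << cy << ") -> cell ("
                  << row << ", " << col << ")\n";
    }
    return 0;
}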