Team Status Report 08/05/21

This week we mainly worked out the kinks that surfaced during integration. We all worked together to iron out the path planning issues we were seeing when testing the lead vehicle’s navigation system. Part of the inconsistency came from the update frequencies of the path planning node and the localisation node; after decreasing the update frequency we observed much better performance.
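As a rough illustration of the frequency change, here is a minimal sketch assuming a rospy-based node; the topic name, message type, and rate values are placeholders rather than our exact configuration:

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Pose2D  # placeholder message type

def planner_loop():
    rospy.init_node("path_planner")
    pub = rospy.Publisher("/planned_waypoint", Pose2D, queue_size=1)

    # Dropping the loop rate (e.g. from 20 Hz to 5 Hz) gives the localisation
    # node time to settle between planning updates; values are illustrative.
    rate = rospy.Rate(5)

    while not rospy.is_shutdown():
        waypoint = Pose2D(x=0.0, y=0.0, theta=0.0)  # stand-in for real planner output
        pub.publish(waypoint)
        rate.sleep()

if __name__ == "__main__":
    planner_loop()
```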

We also tweaked the weight distribution of the cars to give them smoother motion and less drift overall. Differences in materials and other irregularities in the optical encoders and wheel dynamics made the vehicles behave slightly differently, and it took a bit of calibration to get them moving in a roughly similar manner. With more time we would have liked to achieve identical movement by implementing better feedback and perhaps passing more specific data between the cars; due to time constraints, however, we settled for small hacks that make the motion reasonably similar.

Joel also resolved some issues in the communication code, which made communication between the vehicles more consistent.

The rest of the week was dedicated to testing the system as a whole and ironing out any remaining issues. The vehicles are a little finicky and their behaviour can be unpredictable due to subtle shifts in the motor axes, dirt accumulating on the wheels and terrain, and small perturbations in the vehicles themselves. Even so, we were able to have our two cars successfully navigate a simple track with 5 obstacles. Since the drift between the two vehicles became too significant after 5 m, we ended up shortening our demo video to a track 5 m in length.

Jeffrey’s Status Report 08/05/21

This week I focused on getting the project ready for the demo. We fixed an issue with the following car not driving straight by adding weights to counterbalance the weight distribution. This reduced drift and led to more accurate localisation and odometry data, which helped with the movement of the following vehicle. We also ironed out some kinks in the lead vehicle’s path planning. To do this we had to fix some issues with the object detection: mainly, we keep a history of depth values for each detected object, and if the object detection reports a depth of 0 we simply reuse the previously detected depth as an approximation. This makes the object detection output much smoother and gives the path planner more consistent obstacles. We also worked on integrating the lead and following vehicles together, but ran into an issue with the following vehicle’s odometry that turned out to be a faulty optocoupler. In the next few days we have to finish filming the demo of the lead and following vehicles working together.
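A minimal sketch of the depth fallback described above (names are hypothetical; our actual node tracks the history per detected object):

```python
from collections import deque

class DepthFilter:
    """Fall back to the last valid depth when the detector reports 0."""

    def __init__(self, history_len=5):
        self.history = deque(maxlen=history_len)

    def filter(self, raw_depth):
        if raw_depth > 0:
            self.history.append(raw_depth)
            return raw_depth
        # Detector returned 0 (no depth): reuse the most recent valid reading.
        if self.history:
            return self.history[-1]
        return None  # no valid depth seen yet; caller can skip this detection

# Example usage (detection.depth_m is a placeholder field):
# f = DepthFilter()
# depth = f.filter(detection.depth_m)
```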

Joel’s Status Report for 05/08/2021

For this week, we focused on getting things ready for the final demo and videos. I worked with Fausto on an outline for the video, and we have started pulling footage together to begin the editing process.

To get our cars ready for this week, we had to fix some issues with drift that came from the mechanics of the vehicles. After spending time in the lab we were able to address most of the drift concerns and now have two vehicles ready for the convoy task.

This week we also had to tweak the object detection code to make its obstacle updates more consistent. This included a timeout period so that obstacle locations were not spammed to the path planner, as well as some extra code to mitigate cases where an object’s depth was not reported.
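The timeout works out to a simple cooldown on repeated reports of the same obstacle; a rough sketch under assumed structure (the interval and use of wall-clock time are illustrative):

```python
import time

class ObstacleThrottle:
    """Suppress repeated obstacle reports within a cooldown window."""

    def __init__(self, cooldown_s=1.0):
        self.cooldown_s = cooldown_s
        self.last_sent = {}  # (grid_x, grid_y) -> last publish time

    def should_publish(self, cell):
        now = time.monotonic()
        last = self.last_sent.get(cell, 0.0)
        if now - last < self.cooldown_s:
            return False  # same obstacle reported too recently; drop it
        self.last_sent[cell] = now
        return True
```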

Lastly, I made some small modifications to the communications portion of the code to make sure the vehicles can communicate seamlessly. With this component polished, we are ready to demo a fully functional project.

The rest of the time over the next few days will be spent finalizing the last few deliverables for the project, namely the poster, the final video, and the final report.

Fausto’s Status Report [05/08/2021]

This week we worked on the final capstone deliverables and on wrapping up some loose ends in our project. Fortunately, we were able to fix the steering issues we were experiencing on the second Autovot. We essentially disassembled it and carefully put it back together, tweaking parts of the motor components and swapping a few parts, and afterwards the drift was negligible. We have come a long way with our project and will work on demos and presentation details for the rest of the week.

Fausto’s Status Report [05/01/2021]

This week I worked closely with Joel and Jeffrey on integration. At the beginning of the week the vehicle was drifting slightly to the left even when there was no obstacle in the way. We solved this by fine-tuning the Arduino code responsible for controlling the Autovot wheels, which included assuming the vehicle’s battery sits at around 8V. Additionally, we worked hard on the path planning algorithm and added a bubble of error around each obstacle so that we are less likely to hit it. I also made small changes to the Bluetooth communications to support messages that send obstacle locations to the trailing car. Next week will include lengthening the course and adding more obstacles. We also worked on the final presentation and have some good videos to showcase.
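The “bubble of error” amounts to inflating each obstacle by a few grid cells before planning. A minimal sketch, with the inflation radius and grid representation as placeholders:

```python
def mark_obstacle(grid, cx, cy, bubble=2):
    """Mark cell (cx, cy) and a square bubble of surrounding cells as blocked.

    grid is a 2D list of 0 (free) / 1 (blocked); bubble is the inflation
    radius in cells, chosen to cover localisation and detection error.
    """
    rows, cols = len(grid), len(grid[0])
    for y in range(cy - bubble, cy + bubble + 1):
        for x in range(cx - bubble, cx + bubble + 1):
            if 0 <= y < rows and 0 <= x < cols:
                grid[y][x] = 1
    return grid
```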

Joel’s Status Update for 05/01/2021

This week the team worked on integration and on getting the system to work completely end to end. This mostly involved refining the path planning algorithm to be more consistent and more resistant to drift in the data.

In particular, the vehicle mechanics are built in such a way that the car tends to pull to one side rather than driving straight. This was due to the wheels of the vehicles not being completely perpendicular. To mitigate the physical drift we experimented with PID control to hold the wheel velocities constant. On top of this we added an acceleration period so the vehicle builds up speed rather than instantly jumping from one speed to another. This produced much more consistent results and helped reduce drift, though we still saw a small amount of sideways drift. We also tried to map voltage to our motor velocity scale so that we could get better odometry accuracy at a given battery voltage (e.g. 8V).
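For reference, the wheel-speed feedback we experimented with looks roughly like the following. This is only a sketch: the gains are placeholders and the real loop runs against the encoder feedback on the vehicle itself.

```python
class WheelPID:
    """Simple PID loop to hold a wheel at a target velocity."""

    def __init__(self, kp=0.8, ki=0.1, kd=0.05):  # gains are illustrative
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_vel, measured_vel, dt):
        error = target_vel - measured_vel
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        # Output is a correction applied to the motor command (e.g. PWM duty).
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```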

We also integrated the object detection algorithm into the flow of the system. When we detect an object we report the object’s location to the path planner, which adds the obstacle to the map and reruns the A* algorithm to get a new path for the vehicle to take. The camera lens is also not quite at the center of the vehicle, so we account for this by adding a slight offset in the Y direction. For now the system gives reasonable results; however, we would still like to see more consistency.
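A compact sketch of the replanning search, assuming a 4-connected occupancy grid (helper names and the grid convention are hypothetical):

```python
import heapq

def a_star(grid, start, goal):
    """Return a list of (x, y) cells from start to goal, or None if no path.

    grid[y][x] == 1 marks an obstacle cell (including its error bubble).
    """
    def h(a, b):  # Manhattan heuristic for a 4-connected grid
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    open_set = [(h(start, goal), start)]
    came_from = {start: None}
    g_cost = {start: 0}
    while open_set:
        _, cell = heapq.heappop(open_set)
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nbr in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nbr
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and grid[ny][nx] == 0:
                ng = g_cost[cell] + 1
                if ng < g_cost.get(nbr, float("inf")):
                    g_cost[nbr] = ng
                    came_from[nbr] = cell
                    heapq.heappush(open_set, (ng + h(nbr, goal), nbr))
    return None
```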

Next week we would like to add additional obstacles, tighten up our system to be more robust and consistent, and add the second vehicle to fully test the communication and coordination capabilities.

Jeffrey’s Status Report 1/5/21

This week I mostly worked with Joel and Fausto on integration. We amended our path planning algorithm to rely on A*, which is rerun every time an obstacle is found. Navigation therefore depends on localisation data, with the planner sending discrete (x, y) tile locations that the car must reach; these locations are dynamically updated each time A* runs on a newly found obstacle. After making these changes we gave the car a straight path, but found significant drift in the car’s odometry data. We therefore implemented a PD controller so the wheels get feedback and the wheel velocities stay fairly consistent. We also added ramp-up and ramp-down periods to help with odometry, so the wheel velocities reach the target over a number of steps instead of being set to the target immediately. With these changes the odometry data improved and the car was able to go straight. We then started integrating object detection with path planning and had some difficulty moving around the object; the tricky part is figuring out the next location to drive to when changing direction. This still needs tinkering, as the vehicle sometimes fails to plan around the object, but overall it is performing well.
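The ramp-up/ramp-down behaviour is essentially a step limiter on the commanded velocity; a minimal sketch (the step size and function name are illustrative):

```python
def ramp_velocity(current, target, max_step=0.05):
    """Move the commanded wheel velocity toward the target by at most max_step.

    Called once per control tick, so the wheels take several steps to reach
    the target instead of jumping to it, which keeps the odometry cleaner.
    """
    delta = target - current
    if abs(delta) <= max_step:
        return target
    return current + max_step if delta > 0 else current - max_step
```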

Next week we plan to continue fine-tuning the path planning and object detection of the lead car, as well as begin testing the following car. Our current ideas are to either send the object locations and have the follow car run A* as well, or to have the lead car send the follow car the same landmarks it follows for its own path. We expect the latter to have difficulties because of the difference in drift between the two cars, but further experimentation is needed.

Team Status Report [05/01/2021]

This week our team worked together in Hammerschlag Hall to integrate our subcomponents. We first had to fix a drift in the vehicle mechanics when driving. We spent the first half of the week working on solutions, and after fine-tuning the Arduino code for controlling the wheel motors we were able to reduce the car’s drift to a negligible amount. This also included tuning the car for a battery voltage of around 8V.

Afterwards, we focused on path planning. This consisted of ensuring that our map can easily store obstacles so that the vehicle can evade them as it approaches. When an object is detected, we store its coordinates in a grid and use an A* path planning algorithm to route around the obstacles.
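Storing a detected obstacle just means converting its world-frame position into a grid cell before marking it; a small sketch under assumed units (the cell size is a placeholder, not our actual resolution):

```python
CELL_SIZE_M = 0.25  # assumed grid resolution in metres

def world_to_cell(x_m, y_m, cell_size=CELL_SIZE_M):
    """Convert a world-frame obstacle position (metres) to grid indices."""
    return int(round(x_m / cell_size)), int(round(y_m / cell_size))

# Example: an obstacle detected 1.3 m ahead and 0.5 m to the side
# world_to_cell(1.3, 0.5) -> (5, 2) with a 0.25 m grid
```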

Here is a video of our Autovot evading bottles in a 5 meter track.

https://photos.google.com/share/AF1QipM4vXyGAEwav4UiZpMWxbm0fC-az3GwzySdDtf6znFCiBmR_U6mEHh3kuwSQ96Cbw/photo/AF1QipNYxix8M9KpOIC2CBKNjz5KXW2yVcWYmDC4Es_f?key=aWlpZkhjcEhRMjR4VUEzbGlaQ0hhVXpJY0paTmR3

Additionally, we spent some time at the end of the week working on the final presentation. We are almost done with it but prioritized pushing the project forward, and we are in a very good place. Next week will include increasing the track length, adding more obstacles, and testing how the Autovot reacts. Finally, we will have to test how well the following Autovot evades the obstacles using the lead car’s messages.

Jeffrey’s Status Report 24/4/21

This week we focused on finalising the object detection algorithm and porting it into ROS for integration purposes. We originally ran into an issue where the network we were using was too big and consumed too much power and memory on the Jetson, pushing the frame rate too low to be useful. Upon further research, we realised that neural networks on the Jetson are meant to be deployed with TensorRT rather than vanilla TensorFlow. Armed with this information, we went back to the Jetson development pages to look at examples of TensorRT-based networks, and converted the object detection module to use TensorRT following the Jetson documentation. From there we ran it on the Jetson Nano with the camera mounted on the car to check that the object detection was working. We found that Gatorade bottles worked consistently, in that the algorithm could routinely detect and plan around them. I also worked on wrapping the object detection algorithm in a ROS node so it can integrate and communicate with the other nodes.
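The TensorRT path roughly follows the standard jetson-inference examples; a sketch under that assumption (the model name, camera URI, and threshold are placeholders, not necessarily what we deployed):

```python
import jetson.inference
import jetson.utils

# Load a TensorRT-optimised detection model (model choice is illustrative).
net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)
camera = jetson.utils.videoSource("csi://0")  # CSI camera on the Nano

while True:
    img = camera.Capture()
    detections = net.Detect(img)
    for det in detections:
        # det.Center gives the bounding-box centre in image coordinates;
        # the ROS node would pair this with a depth estimate and publish it.
        print(net.GetClassDesc(det.ClassID), det.Confidence, det.Center)
```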

In the next few days, we will begin real-time testing of the car’s autonomous navigation capabilities, primarily its ability to navigate around obstacles, and we will also begin testing the following capabilities of the following car. The following car will rely on a map sent by the lead car to trace the lead car’s path through the course.