https://drive.google.com/file/d/16BUmcdUff1Xxrdo6RqtjcvHanefeLMUc/view?usp=sharing
(PDF exceeded the maximum upload size).
ECE Capstone Project, Spring 2022: Jai Madisetty, Keshav Sangam, Raymond Xiao
This week, I worked on setting up the testing environment. Due to size constraints in HH and a lack of materials, we decided to use a 10 ft × 16 ft test area. We also modified the course design slightly, since the robot's large turning radius makes it difficult to maneuver through a narrow obstacle course. In addition, I finished the DWA path planning algorithm and modified some of the logic in the frontier search portion of our code. All that remains now is integrating and tuning the overall system. Because the robot operates autonomously, we need to test the system headlessly, which is difficult due to network latency: our SLAM and path planning algorithms do not run exactly synchronously in real time. As a result, we plan to slow down the robot's movement so testing is easier.
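For context on what the DWA planner is doing while we tune it, the following is a minimal, simplified sketch of a dynamic window approach control step: sample (v, ω) commands reachable within one control period, forward-simulate each candidate, discard colliding trajectories, and pick the lowest-cost one. The velocity limits, cost weights, robot-radius margin, and point-obstacle representation below are illustrative assumptions for this sketch, not the values or structure of our actual planner.

```python
# Simplified DWA step (illustration only; limits and weights are placeholders).
import numpy as np

MAX_V, MAX_W = 0.5, 1.0          # assumed velocity caps (m/s, rad/s)
MAX_ACC_V, MAX_ACC_W = 0.2, 0.5  # assumed acceleration caps (m/s^2, rad/s^2)
DT, HORIZON = 0.1, 2.0           # control period and rollout time (s)

def rollout(x, y, th, v, w, horizon=HORIZON, dt=DT):
    """Forward-simulate a constant (v, w) command and return the trajectory."""
    traj = []
    for _ in range(int(horizon / dt)):
        th += w * dt
        x += v * np.cos(th) * dt
        y += v * np.sin(th) * dt
        traj.append((x, y, th))
    return np.array(traj)

def dwa_step(state, cur_v, cur_w, goal, obstacles):
    """Pick the (v, w) pair inside the dynamic window with the lowest cost."""
    x, y, th = state
    # Dynamic window: velocities reachable from (cur_v, cur_w) in one period.
    v_lo, v_hi = max(0.0, cur_v - MAX_ACC_V * DT), min(MAX_V, cur_v + MAX_ACC_V * DT)
    w_lo, w_hi = max(-MAX_W, cur_w - MAX_ACC_W * DT), min(MAX_W, cur_w + MAX_ACC_W * DT)

    best, best_cost = (0.0, 0.0), float("inf")
    for v in np.linspace(v_lo, v_hi, 7):
        for w in np.linspace(w_lo, w_hi, 15):
            traj = rollout(x, y, th, v, w)
            # Hypothetical cost terms: distance to goal, clearance, speed.
            goal_cost = np.linalg.norm(traj[-1, :2] - goal)
            clearance = min(np.linalg.norm(traj[:, :2] - o, axis=1).min()
                            for o in obstacles) if obstacles else np.inf
            if clearance < 0.2:   # assumed robot radius + safety margin
                continue          # trajectory collides; discard it
            cost = 1.0 * goal_cost + 0.5 / clearance - 0.2 * v
            if cost < best_cost:
                best, best_cost = (v, w), cost
    return best

if __name__ == "__main__":
    v, w = dwa_step((0.0, 0.0, 0.0), 0.0, 0.0,
                    goal=np.array([2.0, 1.0]),
                    obstacles=[np.array([1.0, 0.5])])
    print("chosen command:", v, w)
```

Slowing the robot for headless testing amounts to shrinking MAX_V and MAX_W in this kind of loop, which also gives SLAM and the planner more slack when they fall out of sync over the network.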
Throughout the week, we finished programming our DWA path planning algorithm, though some challenges remain. Although the path planning works correctly in simulation, moving from simulation to reality introduces integration issues. In particular, the SLAM algorithm matches the lidar point clouds poorly during sudden large accelerations and at high speeds. Unfortunately, autonomous movement involves large accelerations, both translational and rotational. This was not much of a problem in simulation, where plenty of CPU resources were available because SLAM was not running in real time. Running SLAM in real time severely limits the computational resources left for the rest of our algorithms (including path planning and ArUco detection). The translational acceleration is less of a problem, since the SLAM algorithm can keep up with it across a decent range of speeds. Tomorrow, we are focusing on fine-tuning the maximum rotational velocity and the acceleration curves to ensure the map we generate of the environment is as accurate as possible. In essence, our work leading up to the demo and final report primarily revolves around integration and optimization, which we can do in parallel with our subsystem verification and system validation.
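To make the rotational tuning concrete, below is a small sketch of the kind of angular velocity and acceleration limiter that can be applied to the planner's commands before they reach the motors, so that turns stay slow and smooth enough for scan matching to keep up. The specific caps and control period are placeholder assumptions for illustration, not our final tuned values.

```python
# Sketch of an angular velocity/acceleration limiter (placeholder caps, not
# our final tuned values).
from dataclasses import dataclass

@dataclass
class AngularLimiter:
    max_w: float = 0.6      # assumed cap on |angular velocity| (rad/s)
    max_w_acc: float = 0.8  # assumed cap on angular acceleration (rad/s^2)
    dt: float = 0.1         # control period (s)
    last_w: float = 0.0     # previous commanded angular velocity

    def limit(self, w_cmd: float) -> float:
        """Clamp the commanded angular velocity and slew-rate-limit its change."""
        # Clamp to the absolute velocity cap.
        w = max(-self.max_w, min(self.max_w, w_cmd))
        # Limit how much the command may change in one control step.
        max_step = self.max_w_acc * self.dt
        w = max(self.last_w - max_step, min(self.last_w + max_step, w))
        self.last_w = w
        return w

if __name__ == "__main__":
    lim = AngularLimiter()
    for raw in (1.5, 1.5, -1.5, 0.0):
        print(raw, "->", round(lim.limit(raw), 3))
```

Tuning then reduces to adjusting max_w and max_w_acc until the generated map stops smearing during turns, while keeping the robot fast enough to cover the course.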