This week, I worked on helping to integrate all the subsystems. During the first half of the week, I helped get the cart to follow the user based on UWB distance by testing the connection between the phones and verifying that the distance measurements were accurate. Once the measurements matched what we expected, I moved on to integrating the UWB coordinates with the LiDAR and pathfinding algorithm.
The code on our RPi 5 receives the UWB data via UDP, runs a C++ executable as a subprocess to get the nearby LiDAR data, and uses both to calculate the repulsive and attractive forces that determine the cart's general direction. Serial commands are then sent to the Arduino to drive the physical motors. We pair the incoming UWB and LiDAR data by timestamp to ensure that the cart is not making decisions based on stale data. Originally, we planned to use only the virtual force algorithm, but after testing, we added “states” to handle when the cart gets stuck. In the driving state, no obstacles are detected and the cart simply follows the user. In the blocked state, the cart spins in place to find a clear space. In the escape state, a clear space has been found and the cart moves forward to free itself. Transitions between these states are based on the forces calculated by the virtual force algorithm.
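To illustrate the idea, here is a minimal sketch of the virtual force step and the state transitions described above. This is not our actual RPi code: the gains, thresholds, and function names are all illustrative assumptions, and the real system reads UWB over UDP and LiDAR from the C++ subprocess rather than taking plain coordinate lists.

```python
import math

# Illustrative constants (assumed, not our tuned values).
K_ATT = 1.0        # attractive gain toward the user
K_REP = 0.5        # repulsive gain away from obstacles
REP_RANGE = 1.5    # metres; LiDAR points beyond this exert no force

def net_force(user_xy, lidar_points):
    """Combine one attractive force (toward the user's UWB position)
    with repulsive forces from nearby LiDAR points, both in the cart frame."""
    ux, uy = user_xy
    dist = math.hypot(ux, uy)
    # Attractive term: unit vector toward the user, scaled by K_ATT.
    fx, fy = K_ATT * ux / dist, K_ATT * uy / dist
    # Repulsive terms: closer obstacles push harder.
    for ox, oy in lidar_points:
        d = math.hypot(ox, oy)
        if 0 < d < REP_RANGE:
            mag = K_REP * (1.0 / d - 1.0 / REP_RANGE) / d**2
            fx -= mag * ox / d
            fy -= mag * oy / d
    return fx, fy

def next_state(state, force, block_thresh=0.2, clear_thresh=0.6):
    """Toy three-state machine keyed off the net force magnitude:
    driving -> blocked when the net force collapses (obstacles cancel
    the pull toward the user), blocked -> escaping once spinning finds
    a direction with a strong clear force, escaping -> driving (or back
    to blocked) depending on whether the path stays clear."""
    mag = math.hypot(*force)
    if state == "driving":
        return "blocked" if mag < block_thresh else "driving"
    if state == "blocked":
        return "escaping" if mag > clear_thresh else "blocked"
    # escaping: resume driving if the way ahead is clear, otherwise spin again
    return "driving" if mag > block_thresh else "blocked"
```

With no obstacles, the force points straight at the user; a close obstacle between the cart and the user can cancel or reverse it, which is exactly the condition that drops the cart into the blocked state.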
As for remaining tasks, we are still tuning and testing the obstacle avoidance, since there are a few cases where the cart does not behave as expected. For example, when trying to escape from being stuck, the robot spins in place to find a clear area, but sometimes the clear area it finds is in the opposite direction of the user, causing the cart to move away from the user.
