Final Video
Team Status Report for 12/6
Overall Progress
- Improved UWB direction
- LiDAR and obstacle avoidance integration
Risks & Management
We are still fine-tuning the obstacle avoidance algorithm, which has put us behind on our testing. We plan to improve obstacle avoidance as much as possible within a set timeframe and spend the remaining time on testing.
Design Changes & Justification
A few weeks ago, we switched to GPS since the UWB direction data was not accurate. However, while testing GPS we realized it was highly inaccurate, and after consulting the professors, we reverted to UWB. We now use a two-anchor system instead of a one-anchor system to get the distance. This has been much more reliable in measuring the distance between the user and the cart.
Testing
- Load capacity: Can carry up to 30 lbs and still maintain control
- Speed: Moves at an average of 3.58 mph
- Runtime: Calculated that the battery packs supplying power will last more than 45 minutes, with an estimated runtime of roughly 2 hours
- UWB distance accuracy: Not fully tested, but we verified that the UWB data provides an accurate estimate of distance
- Following distance: Cart maintains a following distance of 2 feet; it stops if the user and cart are too close and starts moving when the distance exceeds 2 feet (see the sketch after this list)
- LiDAR obstacle detection: LiDAR detects obstacles with an average error of 3 cm from the actual distance
- Obstacle avoidance: Cart is able to stop within 1 foot of obstacles
- The cart struggles to navigate away from obstacles and out of corners and walls, so we are still tuning the obstacle avoidance algorithm
- Data latency: From receiving the user's distance data to executing the corresponding motor commands takes under 200 ms
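For reference, the following-distance behavior above amounts to a simple threshold on the UWB distance. Here is a minimal sketch; the hysteresis band is an assumption we use for illustration to avoid stop/start chatter near the 2-foot boundary, not a measured value:

```python
FOLLOW_DISTANCE_FT = 2.0   # target following distance from the testing above
HYSTERESIS_FT = 0.25       # illustrative band to avoid stop/start chatter

def should_drive(distance_ft: float, currently_moving: bool) -> bool:
    """Stop when the shopper is too close; resume once the distance
    exceeds 2 feet. The hysteresis band is an assumption, not measured."""
    if currently_moving:
        # Keep driving until the shopper is clearly inside the stop zone.
        return distance_ft > FOLLOW_DISTANCE_FT - HYSTERESIS_FT
    # Stopped: wait until the shopper is more than 2 feet away.
    return distance_ft > FOLLOW_DISTANCE_FT
```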
Elly’s Status Report for 12/6
This week, I worked on integrating all the subsystems together. During the first half of the week, I helped get the cart to follow the user based on the UWB distance by testing the connection between the phones and ensuring that the distance measurements were accurate. Once we confirmed the distance measurements were what we expected, I moved on to integrating the UWB coordinates with the LiDAR and pathfinding algorithm.
The code on our RPi 5 receives the UWB data via UDP, runs a C++ executable as a subprocess to get the nearby LiDAR data, and uses both to calculate the repulsive and attractive forces that determine the cart's general direction. Serial commands are then sent to the Arduino to control the physical motors. We pair the incoming UWB and LiDAR data by timestamp to ensure that the cart is not making decisions based on stale data. Originally, we planned to use only the virtual force algorithm, but after testing, we added "states" to handle when the cart gets stuck: a driving state, where the cart simply follows the user and no obstacles are detected; a blocked state, where the cart spins in place to find a clear space; and an escaping state, where a clear space has been found and the cart moves forward to get unstuck. Transitions between these states are based on the forces calculated by the virtual force algorithm.
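To make the control flow concrete, here is a minimal Python sketch of that loop. The gain values, thresholds, the "forces cancel out" stuck test, and the helper names (`net_force`, `clear_ahead`) are illustrative assumptions, not our actual implementation:

```python
import math
from enum import Enum, auto

class State(Enum):
    DRIVING = auto()   # following the user, no obstacles in the way
    BLOCKED = auto()   # stuck; spin in place to look for a clear space
    ESCAPING = auto()  # clear space found; drive forward out of the trap

# Illustrative gains and thresholds -- the real values come from tuning.
ATTRACT_GAIN = 1.0
REPULSE_GAIN = 0.5
OBSTACLE_RANGE_M = 1.0   # ignore LiDAR points farther away than this
STUCK_THRESHOLD = 0.2    # net-force magnitude below which we call it stuck

def net_force(user_bearing, user_dist, lidar_points):
    """Attractive force toward the user plus repulsive forces from
    nearby LiDAR points; each point is a (bearing_rad, dist_m) pair."""
    fx = ATTRACT_GAIN * user_dist * math.cos(user_bearing)
    fy = ATTRACT_GAIN * user_dist * math.sin(user_bearing)
    for bearing, dist in lidar_points:
        if 0.0 < dist < OBSTACLE_RANGE_M:
            # Repulsion grows as the obstacle gets closer, pointing away from it.
            mag = REPULSE_GAIN * (1.0 / dist - 1.0 / OBSTACLE_RANGE_M)
            fx -= mag * math.cos(bearing)
            fy -= mag * math.sin(bearing)
    return fx, fy

def clear_ahead(lidar_points, cone_rad=0.5):
    """True if no obstacle sits inside the forward cone."""
    return all(not (abs(b) < cone_rad and d < OBSTACLE_RANGE_M)
               for b, d in lidar_points)

def step(state, user_bearing, user_dist, lidar_points):
    """One control-loop iteration; returns (next_state, motor_command)."""
    fx, fy = net_force(user_bearing, user_dist, lidar_points)
    magnitude = math.hypot(fx, fy)
    if state is State.DRIVING:
        if magnitude < STUCK_THRESHOLD:
            return State.BLOCKED, "SPIN"               # forces cancel: stuck
        return State.DRIVING, ("DRIVE", math.atan2(fy, fx))
    if state is State.BLOCKED:
        if clear_ahead(lidar_points):
            return State.ESCAPING, "FORWARD"           # opening found
        return State.BLOCKED, "SPIN"
    # ESCAPING: drive forward until the net force looks normal again.
    if magnitude >= STUCK_THRESHOLD:
        return State.DRIVING, ("DRIVE", math.atan2(fy, fx))
    return State.ESCAPING, "FORWARD"
```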
For remaining tasks, we are still tuning and testing the obstacle avoidance, as there are a few cases where the cart does not behave as expected. For example, when trying to escape from being stuck, the robot spins in place to find a clear area, but sometimes the clear area it finds is in the opposite direction of the user, causing it to move away from the user.
Rose’s Status Report for 12/6
We have made substantial progress since the last status report. Previously, I attempted to determine the shopper's direction relative to the cart using:
- A single UWB anchor, which did not allow for direction measurements due to Apple's privacy restrictions
- GPS, which proved too inaccurate during motion tests
Now, based on a suggestion from Tamal, I have switched to a two-anchor approach that triangulates the direction of the shopper (the tag) using two independent UWB NI sessions. With one anchor mounted on the left side of the cart and one on the right, we are able to obtain much more accurate and reliable data. Each anchor independently ranges to the shopper, and both send their distances to the Raspberry Pi via UDP.
This redesign required:
- Creating two independent NI sessions on the same device (for the shopper)
- Synchronizing and comparing left/right distances to obtain the relative direction (sketched after this list)
- Adjusting packet formats and timestamps for reliability
- Testing mounting positions and spacing to reduce jitter
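The geometry behind the left/right comparison can be sketched as follows: model the anchors at (-b/2, 0) and (+b/2, 0) with the cart facing +y, and subtracting the two range equations eliminates y and gives the shopper's lateral offset directly. The baseline value and names below are illustrative, not our actual code:

```python
import math

BASELINE_M = 0.4  # illustrative anchor spacing: left/right phones 40 cm apart

def tag_bearing(d_left, d_right, baseline=BASELINE_M):
    """Estimate the shopper's bearing from the two anchor distances.

    Anchors are modeled at (-baseline/2, 0) and (+baseline/2, 0) with the
    cart facing +y. Returns (bearing_rad, distance_m); bearing 0 is straight
    ahead, positive is to the cart's right.
    """
    # d_left^2 - d_right^2 = 2 * baseline * x, so x falls out immediately.
    x = (d_left**2 - d_right**2) / (2 * baseline)
    # Recover y from the left anchor's range; clamp against noisy readings
    # that would make the square-root argument slightly negative.
    y = math.sqrt(max(0.0, d_left**2 - (x + baseline / 2) ** 2))
    return math.atan2(x, y), math.hypot(x, y)

# Example: shopper about 2 m away, slightly to the cart's right.
bearing, dist = tag_bearing(d_left=2.05, d_right=1.95)
print(f"bearing {math.degrees(bearing):.1f} deg, distance {dist:.2f} m")
```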
Here is a visual of the two-anchor, one-tag system:

Source: https://hackaday.io/project/25406-wild-thumper-based-ros-robot/log/152597-follow-me-part-4
I also ran into numerous issues and had to carefully isolate and check the role pairs to ensure that the cart anchors do not connect to each other and send their distance to the Raspberry Pi (we only want the distance to the shopper). I rewrote the session startup flow so that each iPhone broadcasts its role (cartLeft, cartRight, or shopper) and then checks that the interaction is a valid pairing (not cartLeft-cartRight) BEFORE creating an NI session, exchanging tokens, and beginning to range. I also limited the cartLeft and cartRight roles so that they only advertise and never browse; only the shopper browses for new connections.
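The actual check lives in the Swift app around NI session setup; the pairing rule itself is simple enough to pin down in a short Python sketch. The role strings match the report, while the function names are hypothetical:

```python
ANCHOR_ROLES = {"cartLeft", "cartRight"}

def is_valid_pair(role_a: str, role_b: str) -> bool:
    """A session is only allowed between the shopper and one anchor;
    anchor-anchor pairs (cartLeft-cartRight) are rejected before any
    NI session is created or tokens are exchanged."""
    roles = {role_a, role_b}
    return "shopper" in roles and bool(roles & ANCHOR_ROLES)

def may_browse(role: str) -> bool:
    """Anchors only advertise; only the shopper browses for peers."""
    return role == "shopper"
```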
I also helped improve the stability of the cart and debug the pathfinding. We pair UWB distance measurements by timestamp to avoid using stale data, and we are actively working on integrating the LiDAR so that the robot can navigate around obstacles.
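Here is a minimal sketch of that timestamp pairing, assuming each anchor's readings arrive as (timestamp, distance) tuples sorted by time, and a skew tolerance of 50 ms (both assumptions for illustration):

```python
def pair_readings(left, right, max_skew_s=0.05):
    """Pair left/right anchor readings whose timestamps fall within
    max_skew_s of each other, discarding anything older (stale data).

    `left` and `right` are lists of (timestamp_s, distance_m),
    assumed sorted by timestamp."""
    pairs = []
    i = j = 0
    while i < len(left) and j < len(right):
        t_l, d_l = left[i]
        t_r, d_r = right[j]
        if abs(t_l - t_r) <= max_skew_s:
            pairs.append((max(t_l, t_r), d_l, d_r))
            i += 1
            j += 1
        elif t_l < t_r:
            i += 1   # left reading too old to match; drop it
        else:
            j += 1   # right reading too old to match; drop it
    return pairs
```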
I would say that the UWB subsystem is essentially complete and on schedule. I want to do more work on the mobile app, including implementing cart controls (i.e., start, stop, connect, reconnect, disconnect) on the shopper side. Additionally, I will be tuning the obstacle avoidance behavior.
Audrey’s Status Report for 12/6
This week, I worked on debugging the UWB two-anchor system along with LiDAR obstacle detection and pathfinding. Since most of this debugging was a group effort, I did not work on one specific piece so much as everything as a whole. I figured out that our mecanum wheels were mounted backwards, which made the robot's movements erratic; for example, when it turned, it stuttered and often didn't move. With the wheels flipped in the correct direction, the robot's movements are much smoother and its turning is better. I also finalized the wiring for the robot, along with the placement of all components in the basket and on the robot.
Overall, I am not behind, but the team is, so I am helping other team members integrate and debug their subsystems so that our robot can reach final demo status. Next week, or really in the next two days, I will be helping debug LiDAR obstacle detection and pathfinding, building a mock aisle obstacle course for the demo, and creating a script and possibly slides for the final demo on Monday.
