We have made substantial progress since the last status report. Previously, I attempted to determine the shopper's direction relative to the cart using:
- A single UWB anchor, which did not allow for direction measurements due to Apple's privacy restrictions
- GPS, which proved too inaccurate during motion tests.
Now, based on a suggestion from Tamal, I have switched to a two-anchor approach that triangulates the direction of the shopper (or tag) using two independent UWB NI sessions. With one anchor mounted on the left side of the cart and one on the right, we obtain much more accurate and reliable data. Each anchor independently ranges to the shopper (the tag), and both send their distances to the Raspberry Pi via UDP.
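The direction estimate from the two distances reduces to simple plane geometry. As a minimal sketch (the function names, coordinate frame, and baseline value are my own illustrative assumptions, not the actual cart code): place the anchors at (-B/2, 0) and (+B/2, 0) in the cart frame, with +y pointing forward, and solve the two circle equations for the tag position.

```python
import math

def tag_position(d_left, d_right, baseline):
    """Estimate the tag's (x, y) in the cart frame from two anchor ranges.

    Assumes anchors sit at (-baseline/2, 0) and (+baseline/2, 0), with y
    as the cart's forward axis. Subtracting the two circle equations
    d_left^2 = (x + B/2)^2 + y^2 and d_right^2 = (x - B/2)^2 + y^2
    eliminates y and gives x directly.
    """
    x = (d_left**2 - d_right**2) / (2 * baseline)
    y_sq = d_left**2 - (x + baseline / 2) ** 2
    y = math.sqrt(max(y_sq, 0.0))  # clamp so range noise cannot go negative
    return x, y

def bearing_deg(d_left, d_right, baseline):
    """Signed bearing of the tag: 0 = straight ahead, positive = to the right."""
    x, y = tag_position(d_left, d_right, baseline)
    return math.degrees(math.atan2(x, y))
```

Note that y is recovered only up to sign, so this assumes the shopper is always behind/in front of the anchor baseline on one known side, which holds for a follow-me cart.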
This redesign required:
- Creating two independent NI sessions on the same device (for the shopper)
- Synchronizing and comparing left/right distances to obtain relative direction
- Adjusting packet formats and timestamps for reliability
- Testing mounting positions and spacing to reduce jitter.
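The packet-format step above can be sketched roughly as follows. This is a hypothetical layout, not the exact wire format we ship: a fixed-size UDP payload carrying an anchor id, a sequence number, a millisecond timestamp, and the ranged distance, packed in network byte order.

```python
import struct
import time

# Hypothetical fixed-size packet: anchor id (1 byte), sequence number
# (4 bytes), millisecond timestamp (8 bytes), distance in metres (float32).
PACKET_FMT = "!BIQf"

def encode(anchor_id, seq, distance_m, ts_ms=None):
    """Pack one UWB reading into a UDP-ready payload."""
    if ts_ms is None:
        ts_ms = int(time.time() * 1000)
    return struct.pack(PACKET_FMT, anchor_id, seq, ts_ms, distance_m)

def decode(payload):
    """Unpack a payload back into a reading dict."""
    anchor_id, seq, ts_ms, distance_m = struct.unpack(PACKET_FMT, payload)
    return {"anchor": anchor_id, "seq": seq, "ts_ms": ts_ms, "dist": distance_m}
```

Fixed-size packing keeps parsing on the Pi trivial and makes a truncated or malformed datagram immediately detectable by length.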
Here is a visual of the two-anchor, one-tag system:

Source: https://hackaday.io/project/25406-wild-thumper-based-ros-robot/log/152597-follow-me-part-4
I also ran into numerous issues and had to carefully isolate and check the role pairs to ensure that the cart anchors do not connect to each other and report their mutual distance to the Raspberry Pi (we only want distances to the shopper). I rewrote the session startup flow so that each iPhone broadcasts its role (cartLeft, cartRight, or shopper) and verifies that the pairing is a valid interaction (not cartLeft-cartRight) BEFORE creating an NI session, exchanging tokens, and beginning to range. I also limited the cartLeft and cartRight roles to advertising only, never browsing; only the shopper browses for new connections.
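The role-gating logic above is small enough to sketch in full. The role strings match the report; the helper names are illustrative, and in the app this check runs before any token exchange or NI session creation.

```python
# Roles broadcast by each iPhone, as described in the startup flow.
ROLES = {"cartLeft", "cartRight", "shopper"}

def is_valid_pair(local_role, peer_role):
    """Allow a session only between the shopper and one cart anchor.

    Rejects cart-to-cart pairs (cartLeft-cartRight) and same-role pairs,
    so the two anchors never range against each other.
    """
    if local_role not in ROLES or peer_role not in ROLES:
        return False
    # exactly one side of the pair must be the shopper
    return (local_role == "shopper") != (peer_role == "shopper")

def may_browse(role):
    """Anchors advertise only; the shopper is the sole browser."""
    return role == "shopper"
```

Gating on the advertised role before session creation also means an invalid peer costs nothing: no token exchange ever starts.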
I also helped improve the stability of the cart and debug the path-finding. We paired UWB distance measurements by timestamp to avoid acting on stale data, and we are actively working on integrating the LiDAR so the robot can navigate around obstacles.
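The timestamp pairing can be sketched like this. The thresholds here are assumptions for illustration, not the tuned values on the cart: a left/right pair is used only if both readings are recent and close enough in time to represent the same shopper position.

```python
MAX_SKEW_MS = 100  # assumed: max timestamp gap between a left/right pair
MAX_AGE_MS = 500   # assumed: readings older than this are considered stale

def pair_readings(left, right, now_ms,
                  max_skew_ms=MAX_SKEW_MS, max_age_ms=MAX_AGE_MS):
    """Combine the latest left/right readings into one usable pair.

    left and right are (ts_ms, distance_m) tuples or None. Returns
    (d_left, d_right) when both readings are fresh and near-simultaneous,
    otherwise None so the controller skips this cycle.
    """
    if left is None or right is None:
        return None
    (t_left, d_left), (t_right, d_right) = left, right
    if now_ms - t_left > max_age_ms or now_ms - t_right > max_age_ms:
        return None  # at least one reading is stale
    if abs(t_left - t_right) > max_skew_ms:
        return None  # too far apart in time to compare as one position
    return d_left, d_right
```

Returning None instead of the last good pair keeps the direction estimate honest: the cart briefly coasts rather than steering on outdated geometry.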
I would say the UWB subsystem is essentially complete and on schedule. Next, I want to do more work on the mobile app, including implementing cart controls (i.e., start, stop, connect, reconnect, disconnect) on the shopper side. I will also be tuning the obstacle-avoidance behavior.
