Rose’s Status Report for 12/6
We have made substantial progress since the last status report. Previously, I had attempted to determine the shopper's direction relative to the cart using:
- A single UWB anchor, which did not allow for direction measurements due to Apple’s privacy restrictions
- GPS, which proved too inaccurate during motion tests.
Now, based on a suggestion from Tamal, I have switched to using a two-anchor approach for triangulating the direction of the shopper (or tag) using two independent UWB NI sessions. Using two anchors, one mounted on the left side and one mounted on the right side of the cart, we are able to obtain much more accurate and reliable data. Each anchor independently ranges to the shopper (the tag), and both send their distances to the Raspberry Pi via UDP.
This redesign required:
- Creating two independent NI sessions on the same device (for the shopper)
- Synchronizing and comparing left/right distances to obtain relative direction
- Adjusting packet formats and timestamps for reliability
- Testing mounting positions and spacing to reduce jitter.
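The geometry behind the two-anchor approach can be sketched as follows. This is a simplified model, not our actual code: it assumes the anchors sit on a known baseline centered on the cart and the shopper is in front of the cart, and the function and parameter names are illustrative.

```python
import math

def shopper_position(d_left: float, d_right: float, baseline: float):
    """Estimate the shopper's (x, y) relative to the cart center from the two
    anchor distances. Anchors are modeled at (-baseline/2, 0) and
    (+baseline/2, 0), with the shopper assumed in front of the cart (y >= 0)."""
    # From d_left^2 - d_right^2 = 2 * baseline * x
    x = (d_left**2 - d_right**2) / (2 * baseline)
    y_sq = d_left**2 - (x + baseline / 2) ** 2
    y = math.sqrt(max(y_sq, 0.0))  # clamp small negatives caused by ranging noise
    return x, y

def relative_angle_deg(d_left: float, d_right: float, baseline: float) -> float:
    """Relative direction in degrees: 0 = straight ahead, positive = to the right."""
    x, y = shopper_position(d_left, d_right, baseline)
    return math.degrees(math.atan2(x, y))
```

Equal left/right distances put the shopper straight ahead; a larger left distance means the shopper is off to the right, so the sign of the angle falls out directly from the distance difference.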
Here is a visual of the two anchor one tag system:

Source: https://hackaday.io/project/25406-wild-thumper-based-ros-robot/log/152597-follow-me-part-4
I also ran into numerous issues and had to do some careful isolation and role-pair checking to ensure that the cart anchors do not connect to each other and send their mutual distance to the Raspberry Pi (we only want distances to the shopper). I rewrote the session startup flow so that each iPhone broadcasts its role (cartLeft, cartRight, or shopper) and verifies that the pairing is valid (not cartLeft-cartRight) BEFORE creating an NI session, exchanging tokens, and beginning to range. I also limited the cartLeft and cartRight roles to advertising only, never browsing. Only the shopper browses for new connections.
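The actual gating lives in the Swift session-startup code, but the role check itself can be summarized with this small Python sketch (the role names come from the report; the helper name is mine):

```python
# Only anchor <-> shopper sessions are allowed; the two cart anchors must
# never range against each other.
VALID_PAIRS = {
    frozenset({"cartLeft", "shopper"}),
    frozenset({"cartRight", "shopper"}),
}

def is_valid_pair(my_role: str, peer_role: str) -> bool:
    """Return True only if this role pairing should open an NI session."""
    return frozenset({my_role, peer_role}) in VALID_PAIRS
```

Checking this before token exchange means an invalid peer never gets a session at all, rather than being filtered after the fact.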
I also helped with improving the stability of the cart and debugging the path-finding. We paired UWB distance measurements based on timestamps to reduce using stale data, and we are also very actively working on integrating the LiDAR such that the robot is able to navigate around obstacles.
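The timestamp-based pairing of left/right distances can be sketched as below, assuming each anchor reading arrives as a (timestamp, distance) tuple; the skew threshold is an illustrative value, not our tuned one.

```python
def pair_measurements(left, right, max_skew_s: float = 0.1):
    """Pair (timestamp, distance) readings from the two anchors, keeping only
    pairs whose timestamps differ by at most max_skew_s seconds; unmatched
    (stale) readings are dropped."""
    pairs, i, j = [], 0, 0
    left, right = sorted(left), sorted(right)
    while i < len(left) and j < len(right):
        tl, dl = left[i]
        tr, dr = right[j]
        if abs(tl - tr) <= max_skew_s:
            pairs.append(((tl + tr) / 2, dl, dr))  # midpoint time, both distances
            i += 1
            j += 1
        elif tl < tr:
            i += 1  # left reading too old to match anything newer
        else:
            j += 1  # right reading too old
    return pairs
```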
I would say that the UWB subsystem is essentially complete and on schedule. I want to do more work on the mobile app, including implementing cart controls (i.e., start, stop, connect, reconnect, disconnect) on the shopper side. I will also be tuning the obstacle avoidance behavior.
Rose’s Status Report for 11/22
This week, I made a lot of progress on the navigation functionality and determining the direction/relative angle from one iPhone to the other (required for enabling the robot to follow the person). After several attempts to use UWB direction vectors (described in previous status reports), I implemented a GPS-based solution. Each iPhone gathers its own GPS coordinates and compass heading while also receiving the peer device’s GPS data. Using these values, I compute the bearing from my phone to the peer phone and then subtract my own heading to obtain an angle between -180° and 180°. This relative angle is packaged into a JSON message and sent through a UDP pipeline to the Raspberry Pi (currently my laptop during testing). On the Python side, I updated the listener to parse these new messages, pass the values to the robot controller, and also filter incoming packets to ensure that only commands from the intended iPhone are used. I integrated this with the whole robot system, watched the motors turn left and right based on the iPhones’ positions and read the live terminal logs, which show the relative angle updating in real time. A reason I had not used GPS earlier is that it is inaccurate indoors. Based on the terminal logs, it is, on average, accurate within 5-10 meters. For now, this will need to work.
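The bearing-and-heading math described above can be sketched in Python (a sketch of the same computation, not the Swift code itself):

```python
import math

def bearing_deg(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Initial great-circle bearing from point 1 to point 2, in degrees
    (0 = north, 90 = east)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y))

def relative_angle(my_lat, my_lon, my_heading, peer_lat, peer_lon) -> float:
    """Angle to the peer relative to my compass heading, wrapped into [-180, 180)."""
    angle = bearing_deg(my_lat, my_lon, peer_lat, peer_lon) - my_heading
    return (angle + 180.0) % 360.0 - 180.0
```

For example, a peer due east of me while I face north comes out as +90, and the same peer while I face south comes out as -90, which is the turn-left/turn-right signal the robot controller consumes.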
Throughout this project, the three main things I had to learn were iOS development, UWB, and networking. I learned Swift and how to build and deploy apps to physical iPhones, including Xcode setup, device provisioning, and handling permissions. I also extensively researched how Apple’s UWB and Nearby Interaction frameworks work, how they provide distance and direction, and what limitations they have. On the networking side, I studied Bluetooth, TCP, UDP, and IP to figure out how to connect the iPhones to the Raspberry Pi. I decided on a UDP-based pipeline, which required learning sockets, JSON message formats, and data handling. I also experimented with small Flask servers earlier on. Most of this learning came from Apple documentation, online tutorials, open-source examples, and hands-on debugging on real devices, including reading logs and iterating until everything worked.
I’m currently behind schedule because determining the direction between the two iPhones took much longer than expected due to numerous issues (described in previous status reports). To catch up, I’m focusing on making the GPS-based navigation pipeline more stable instead of trying to debug UWB. My immediate goals are to smooth the relative-angle values, reduce jitter, and clean up noisy GPS input so the robot receives more consistent turning commands.
Next week, I want to add some kind of filtering to the relative angle calculations, either a moving average or a Kalman filter, and work with my teammates to improve the robot's control logic so movements are smoother and less jerky. If time permits, I may revisit UWB to see whether I can troubleshoot it further or whether its data can be combined with GPS for better accuracy, but the primary objective is to finalize a reliable, functional navigation system based on the GPS progress I achieved.
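For the moving-average option, one subtlety is that averaging raw degree values breaks near the ±180° wrap (179° and -179° would average to 0°), so a circular mean is safer. A minimal sketch, with the window size as an assumption:

```python
import math
from collections import deque

class AngleSmoother:
    """Moving average for angles in degrees, computed on the unit circle so
    that samples straddling the +/-180 wrap-around don't corrupt the mean."""

    def __init__(self, window: int = 5):
        self.samples = deque(maxlen=window)

    def update(self, angle_deg: float) -> float:
        self.samples.append(math.radians(angle_deg))
        s = sum(math.sin(a) for a in self.samples)
        c = sum(math.cos(a) for a in self.samples)
        return math.degrees(math.atan2(s, c))
```

With this, smoothing 179° and -179° yields ±180° (pointing backward) instead of a spurious 0° (pointing forward).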
Rose’s Status Report for 11/15
This week, I focused on getting our interim demo integrated and working. Since the Raspberry Pi did not arrive in time for the demos, I used my computer as a substitute. I modified the mobile app to transmit UWB direction and distance data via UDP sockets instead of only printing to the console. I then wrote a Python listener script on my computer that received this UWB data in real time from the iPhones. This script forwarded the received data to another module responsible for communicating with the robot control system running on the Teensy. This pipeline allowed us to successfully demonstrate the system’s end-to-end functionality, showing that UWB data could be transmitted from the iPhones, through a UDP interface, and finally to the robot controller.
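The listener's receive-and-filter step can be sketched like this (the allowed address, port, and function names are placeholders, not our actual configuration):

```python
import json
import socket

ALLOWED_SENDER = "192.168.1.42"  # placeholder for the shopper iPhone's IP

def parse_packet(data: bytes, addr: str, allowed: str = ALLOWED_SENDER):
    """Decode one UDP payload; return the message dict, or None if the sender
    is unexpected or the JSON is malformed."""
    if addr != allowed:
        return None
    try:
        return json.loads(data.decode("utf-8"))
    except (UnicodeDecodeError, ValueError):
        return None

def run_listener(port: int = 5005, handle=print):
    """Receive UWB packets over UDP and hand valid ones to a handler
    (the robot-control module in the real pipeline)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, (addr, _sender_port) = sock.recvfrom(1024)
        msg = parse_packet(data, addr)
        if msg is not None:
            handle(msg)
```

Keeping the parse/filter logic in its own function makes it easy to unit-test without opening a socket.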
Obtaining direction data is an ongoing issue. I did more research this week on it, and I’ve found no real solution. I integrated ARKit for camera assistance. Sources I read stated “On iPhone, it’s possible to receive direction for nearby third-party accessories in sessions that enable isCameraAssistanceEnabled.” Unfortunately, all this did was make our distance measurements more accurate, without actually enabling direction.
At this stage, I am behind schedule because of the unresolved issue of receiving direction information from the UWB connection. This problem is software-related and tied to Apple’s Nearby Interaction framework limitations when using camera-assisted direction. To catch up, I plan to continue debugging with the assistance of the professors and TA, explore potential workarounds, and perhaps find a way to do relative device orientation estimates. Once direction data is available, I will quickly integrate it into the UDP communication pipeline and test with the Raspberry Pi.
For verifying the UWB communication and data relay subsystem, I plan to measure UDP packet latency from the iPhone to the Raspberry Pi or computer to ensure responsive data transfer. I will also validate the accuracy of the distance measurements by comparing UWB distances against physical measurements. Once direction data becomes available, I will conduct angular accuracy tests to quantify directional error. Finally, I will run repeated tests to analyze the overall stability of the system, including packet loss, jitter, and consistency across the full pipeline.
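The latency measurement could look something like the sketch below, assuming a simple UDP echo endpoint runs on the Pi during the test (the endpoint, port, and sample count are assumptions for illustration):

```python
import json
import socket
import statistics
import time

def measure_udp_rtt(host: str, port: int, n: int = 50, timeout: float = 1.0):
    """Send n timestamped probes to a UDP echo endpoint and report round-trip
    statistics; one-way latency can be approximated as rtt / 2."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    rtts, lost = [], 0
    for seq in range(n):
        payload = json.dumps({"seq": seq, "t": time.time()}).encode()
        start = time.perf_counter()
        sock.sendto(payload, (host, port))
        try:
            sock.recvfrom(1024)
            rtts.append((time.perf_counter() - start) * 1000.0)  # ms
        except socket.timeout:
            lost += 1  # unanswered probe counts as packet loss
    return {
        "mean_ms": statistics.mean(rtts) if rtts else None,
        "jitter_ms": statistics.pstdev(rtts) if rtts else None,
        "loss": lost / n,
    }
```

Running this repeatedly would give the mean latency, jitter, and loss figures for the verification report.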
‘direction’ variable documentation: https://developer.apple.com/documentation/nearbyinteraction/ninearbyobject/direction-4qh5w
Rose’s Status Report for 11/8
This week, I focused on debugging the direction measurement functionality for our UWB interaction. Currently, the session outputs only distance values, not direction or position, which are crucial for our demo.
Example output:

After looking into the supportsDirectionMeasurement property, I discovered that it is returning false on both our iPhone 15 Plus and iPhone 16 devices, even though both should theoretically support direction measurements, according to all the documentation I found online. They both have a U2 UWB chip that should support the collection of direction data.
I have spent a lot of time tracing through the entire app to check whether the issue is a result of the iOS version, session configuration, missing permissions, device settings, or a mismatch in how the capabilities are enabled.
Unfortunately, the Raspberry Pi 5 (which I ordered two weeks ago and was expected to arrive on Thursday) still has not arrived, so I have been unable to proceed with the full integration and testing. The missing hardware continues to be a major blocker for testing the iPhone receiver connection and for controlling the robot with the recorded data.
At this stage, I am behind schedule due to the ongoing delay in receiving the Raspberry Pi 5. Without it, I can’t run the communication pipeline between the iPhone and Pi. However, I’ve continued to make as much progress as possible on the software debugging side to minimize further delays once the hardware arrives. To catch up, I plan to put in extra hours next week and begin integration as soon as the Raspberry Pi arrives.
Nearby Interaction supportsDirectionMeasurement documentation: https://developer.apple.com/documentation/nearbyinteraction/nidevicecapability/supportsdirectionmeasurement?language=objc
Rose’s Status Report for 11/1
This week, I worked on connecting the receiver iPhone to the Raspberry Pi to begin transferring UWB position data to the cart’s control system. Unfortunately, I discovered that the Raspberry Pi 5 we borrowed from the ECE inventory has a broken SD card holder, which prevents it from booting or running any code. It had old solder on the pins, which likely means that others have also previously tried to fix it with no success. I immediately ordered a replacement that should arrive next week (Amazon Prime). In the meantime, I decided to make more progress with the software side. I wrote the Flask server code that will run on the Raspberry Pi to handle incoming data from the receiver iPhone. Although the code has not been tested yet due to the hardware issue, it is fully written and ready for deployment (and testing) once the new Pi arrives.
I also spent some time researching the trade-offs between using Bluetooth and HTTP as the communication method between the receiver iPhone and the Raspberry Pi. I found that HTTP over Wi-Fi has higher data throughput and stability for streaming continuous position updates, while Bluetooth has lower power consumption and a simpler pairing method, as it does not rely on a Wi-Fi connection. However, Bluetooth has limited bandwidth and higher latency, which makes it less reliable for real-time data updates. Based on this, I decided that using HTTP over Wi-Fi is ultimately the more practical choice for our project at this stage.
Overall, I remain behind compared to the Gantt chart because the broken Raspberry Pi delayed my progress. Next week, once the new Raspberry Pi arrives, I plan to set up the Flask server, test end-to-end communication between the iPhone and Raspberry Pi, and look into how we can translate those coordinates into motion controls without obstacle avoidance for a preliminary prototype. My goal is to catch up on lost time and have the full UWB-to-cart communication working for the interim demo.
Rose’s Status Report for 10/25
This week, I have officially finished establishing the UWB connection between the two iPhones. After establishing a connection, the two iPhones can now transmit live position data to each other, representing their relative locations. This is currently in the form of 3D coordinates (x, y, z), logged onto my computer’s terminal.
The code for the UWB live location updating can be found here, in a new branch: https://github.com/rosel26/basket-buddy/tree/uwb
I’ve also started working on the server that connects the receiver iPhone and the Raspberry Pi. I plan to use Bonjour and TCP. The planned process is as follows:
- Pi runs a TCP server, accepts the TCP connection, and receives the JSON position data
- iPhone receiver discovers the Pi via Bonjour, sends a new payload every update
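A minimal sketch of the Pi side of this plan, assuming newline-delimited JSON payloads over a single TCP connection (the port and framing are assumptions; Bonjour discovery happens on the iPhone side and is omitted here):

```python
import json
import socket

def serve_positions(port: int = 5006, handle=print):
    """Accept one TCP connection from the receiver iPhone and process
    newline-delimited JSON position payloads until the peer disconnects."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", port))
    srv.listen(1)
    conn, _addr = srv.accept()
    buf = b""
    while True:
        chunk = conn.recv(1024)
        if not chunk:
            break  # peer closed the connection
        buf += chunk
        # TCP is a byte stream, so reassemble complete lines before parsing.
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            if line.strip():
                handle(json.loads(line))
    conn.close()
    srv.close()
```

The explicit line buffering matters because a single `recv` may return a partial payload or several payloads at once.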
According to our Gantt chart, my progress is still behind schedule, as I should have the preliminary person following functionality working by this point. A challenge I faced was establishing a stable connection between the two iPhones, and it took more time than I expected to research the frameworks I had to use.
For this week, I will focus on integrating the UWB output with the Raspberry Pi to start inputting position data into the cart control system. This involves establishing a communication link between the receiver iPhone and the Raspberry Pi as specified above. Once that connection is established, the next step will be to translate the location data into motion commands. If the robot control system and Teensy interface are ready, I will look into and work on converting position coordinates into velocity and directional commands to enable the Teensy to control the cart’s movement to start following the person (and their iPhone).
Team Status Report for 10/18
Overall Progress
- Finished design report
- Finalized implementation details for project
- Ordered robot car kit
Risks & Management
The design report took longer than expected, causing all of us to be behind in our work as we dedicated last week to the report. Therefore, we will have to make up for last week’s work this week.
Due to a communication error, we realized that our robot car kit, Teensy 4.1 microcontroller, and H-Bridges were never ordered. We found this out on Wednesday and promptly ordered the parts the same day. Unfortunately, due to Fall Break, all group members were out of town when the parts arrived. Thus, we have been unable to make progress on the hardware aspect of this project. This communication mistake poses a significant risk to the overall Gantt Chart as it will delay the work by a whole week. However, we scheduled a slack week right before the interim demo, so as long as we continue to stick to our schedule, we should have a working prototype for the interim demo.
Design Changes & Justification
There were no changes on the hardware side, but on the software side, we decided to use a local database for product barcodes, names, and prices instead of an API. This change was necessary to improve the user experience and does not incur any additional costs.
Additionally, all hardware and software connection methods have been researched and are detailed in our design report. We have collaboratively created this block diagram:

Product Solution Meeting Needs
A was written by Rose, B was written by Audrey, and C was written by Elly.
Part A: Global Factors
Around the world, grocery shopping can be physically demanding and time-consuming, especially for people with limited mobility, older adults, or parents managing children while they shop. Basket Buddy addresses this by reducing the effort needed to push and steer a cart, making shopping more convenient and enjoyable for everyone.
In addition to making shopping more accessible, Basket Buddy also fits into the global trend of automation and smart retail technology. As more stores move toward self-checkout and contactless shopping, our project contributes to this shift by introducing a smart, user-friendly cart that improves both convenience and safety. With UWB and LiDAR-based navigation, it should have reliable performance in many types of indoor retail environments, making it adaptable to stores of different layouts and sizes around the world.
Part B: Cultural Factors
Basket Buddy showcases cultural values such as independence, community care, and accessibility. Many cultures view assisting others as a moral good; Basket Buddy supports this viewpoint through assisting people with their shopping. Basket Buddy also reflects cultural beliefs in equality and inclusion, allowing everyone to shop without stigma or dependence on others. Additionally, through the built-in safety measures, Basket Buddy respects both the moral and legal expectations of being safe in most communities.
Part C: Environmental Factors
The environmental impact of Basket Buddy lies mainly in the manufacturing and energy consumption of the cart. Unlike a standard shopping cart, Basket Buddy requires additional electronics, sensors, motors, and batteries. Manufacturing these components is resource-intensive, relying on the sourcing of more materials and carrying a larger carbon footprint than a traditional cart. Furthermore, if a company wanted to incorporate Basket Buddy into its grocery stores by replacing traditional shopping carts, it would need considerable energy to maintain the carts. Basket Buddy would require constant charging, becoming a significant consumer of electricity, and replacement parts when electronic components break down, producing more waste. If Basket Buddy were adopted by stores, they would need a plan for charging and maintenance to prevent the carts from producing excess waste.
However, Basket Buddy also provides the opportunity for people to be more environmentally conscious about their purchases. Being able to view your items in the mobile app can guide shoppers to make more sustainable choices, and a digital checkout process eliminates the need for paper receipts. More features can also be added to Basket Buddy such as expiration dates to prevent food waste or highlighting products with eco-friendly packaging or local sourcing.
Rose’s Status Report for 10/18
This week, I primarily worked on finishing the design report with my teammates and finalizing several of the software-side design choices related to the UWB functionality. I decided that the best way to send information from the iPhone receiver to the Raspberry Pi would be through a local Wi-Fi HTTP bridge. In this setup, the Pi runs a lightweight HTTP server on the local network, and the mobile app on the receiver’s iPhone sends JSON packets containing UWB position data using HTTP requests.
I also conducted additional research into LiDAR drivers for the Raspberry Pi to understand better how sensor data will be processed in our system. Our RPLIDAR will connect to the Raspberry Pi via USB, and its driver allows the Pi to read and process distance and angle data from the sensor. This data is then converted into a 2D map showing nearby obstacles and open spaces. I explored various software options for running the RPLIDAR on the Raspberry Pi, including the official SDK, a lightweight Python library, and a ROS-compatible driver for real-time mapping. This research helped me better understand how the LiDAR subsystem will integrate with the rest of our navigation and obstacle-avoidance software.
According to our Gantt chart, my progress is behind schedule, as the goal for this stage was to finalize our mobile app and have the beginnings of shopper tracking over UWB. I was unable to dedicate as much time to these goals since we were working extensively on the design report.
To get back on track, I plan to finalize the mobile app next week and ensure the UWB session is fully functional between two iPhones. I will also focus on establishing the connection between the receiver iPhone and the Raspberry Pi, which involves setting up the Wi-Fi HTTP server on the Pi and linking the iPhone to it. Once this is in place, we’ll have a clearer understanding of how the data transmission and processing will work within the overall system.
Rose’s Status Report for 10/04
This week, I focused on integrating UWB technologies into our iOS app to enable shopper location tracking between devices. I set up the Nearby Interaction framework in Xcode, adding the required capabilities and permissions (Nearby Interaction, Local Network Usage). Then, I implemented code to initialize an NISession and use MultipeerConnectivity to exchange discovery tokens between iPhones. This allows two devices to establish a UWB session and start getting distance and direction information, acting as the foundation for the indoor location tracking we are aiming for.
I also built out the flow within the app so that tapping the “Connect” button on the Welcome screen initializes the UWB session. After this, I deployed the app onto two iPhones. The app now builds and runs on the two devices, simultaneously advertising and browsing for peers. Terminal and Xcode logs confirm that peer discovery and UWB session initialization attempts are working, meaning I’ve created that initial communication layer.
According to our Gantt chart, my progress is on schedule, as the goal for this stage was to establish a working UWB connection between two devices.
Next week, I plan to finalize the app and get the UWB session fully functional between two iPhones. My immediate goal is to have the devices successfully exchange discovery tokens and run a stable session, then output real-time distance and direction data so that we can use that data to start building code for the Raspberry Pi and the controls on Basket Buddy. Additionally, if I have additional time, I’ll look into how we can integrate LiDAR sensors into our system.
Mobile App Local Network Connection on Two iPhones:

