This week, I made a lot of progress on the navigation functionality and on determining the direction/relative angle from one iPhone to the other (required for enabling the robot to follow the person). After several attempts to use UWB direction vectors (described in previous status reports), I implemented a GPS-based solution. Each iPhone gathers its own GPS coordinates and compass heading while also receiving the peer device’s GPS data. Using these values, I compute the bearing from my phone to the peer phone and then subtract my own heading to obtain an angle between -180° and 180°. This relative angle is packaged into a JSON message and sent through a UDP pipeline to the Raspberry Pi (currently my laptop during testing). On the Python side, I updated the listener to parse these new messages, pass the values to the robot controller, and filter incoming packets so that only commands from the intended iPhone are used. I integrated this with the whole robot system, watched the motors turn left and right based on the iPhones’ positions, and read the live terminal logs, which show the relative angle updating in real time. The reason I had not used GPS earlier is that it is inaccurate indoors; based on the terminal logs, it is accurate to within 5-10 meters on average. For now, that will have to suffice.
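The bearing-and-heading calculation described above can be sketched in Python (the actual computation runs in Swift on the iPhone using CoreLocation values; the function and variable names here are my own for illustration):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def relative_angle(my_lat, my_lon, my_heading, peer_lat, peer_lon):
    """Bearing to the peer minus my compass heading, normalized to [-180, 180)."""
    rel = bearing_deg(my_lat, my_lon, peer_lat, peer_lon) - my_heading
    return (rel + 180.0) % 360.0 - 180.0
```

The final modulo step is what keeps the output in the -180° to 180° range, so "peer is to my left" is always negative and "to my right" is always positive regardless of absolute heading.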
Throughout this project, the three main things I had to learn were iOS development, UWB, and networking. I learned Swift and how to build and deploy apps to physical iPhones, including Xcode setup, device provisioning, and handling permissions. I also extensively researched how Apple’s UWB and Nearby Interaction frameworks work, how they provide distance and direction, and what limitations they have. On the networking side, I studied Bluetooth, TCP, UDP, and IP to figure out how to connect the iPhones to the Raspberry Pi. I decided on a UDP-based pipeline, which required learning sockets, JSON message formats, and data handling. I also experimented with small Flask servers earlier on. Most of this learning came from Apple documentation, online tutorials, open-source examples, and hands-on debugging on real devices, including reading logs and iterating until everything worked.
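The UDP/JSON handling on the Python side can be sketched roughly as below; the port number, the `device` field name, and the allowed-sender ID are placeholders for illustration, not the project's actual values:

```python
import json
import socket

ALLOWED_SENDER = "phone-A"  # hypothetical device ID used for sender filtering

def parse_packet(data, allowed_sender=ALLOWED_SENDER):
    """Return the relative angle from a JSON datagram, or None if the
    packet is malformed or comes from an unexpected sender."""
    try:
        msg = json.loads(data.decode("utf-8"))
    except (UnicodeDecodeError, ValueError):
        return None  # drop malformed packets silently
    if msg.get("device") != allowed_sender:
        return None  # ignore packets from other devices
    angle = msg.get("relative_angle")
    return float(angle) if angle is not None else None

def run_listener(handle_angle, port=5005):
    """Bind a UDP socket and forward valid angles to the robot controller."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        angle = parse_packet(data)
        if angle is not None:
            handle_angle(angle)
```

Separating the parse/filter step from the socket loop makes the filtering logic testable without any network traffic.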
I’m currently behind schedule because determining the direction between the two iPhones took much longer than expected due to numerous issues (described in previous status reports). To catch up, I’m focusing on making the GPS-based navigation pipeline more stable instead of trying to debug UWB. My immediate goals are to smooth the relative-angle values, reduce jitter, and clean up noisy GPS input so the robot receives more consistent turning commands.
Next week, I want to add filtering to the relative-angle calculations, either a moving average or a Kalman filter, and work with my teammates to improve the robot’s control logic so movements are smoother and less jerky. If time permits, I may revisit UWB to see whether I can troubleshoot it further or whether its data can be combined with GPS for better accuracy, but the primary objective is to finalize a reliable, functional navigation system building on this week’s GPS progress.
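As a sketch of the moving-average option (class and parameter names are hypothetical): one wrinkle is that angles wrap at ±180°, so a naive arithmetic mean near the seam (e.g. averaging 179° and -179°) gives a badly wrong answer; averaging unit vectors instead avoids this.

```python
import math
from collections import deque

class AngleSmoother:
    """Moving average over the most recent relative angles, wraparound-safe.

    Averages the (cos, sin) unit vectors of the readings, so values near
    the ±180° seam don't cancel out the way a raw arithmetic mean would.
    """
    def __init__(self, window=5):
        self.readings = deque(maxlen=window)

    def update(self, angle_deg):
        """Add a reading and return the smoothed angle in degrees (-180, 180]."""
        self.readings.append(math.radians(angle_deg))
        s = sum(math.sin(a) for a in self.readings)
        c = sum(math.cos(a) for a in self.readings)
        return math.degrees(math.atan2(s, c))
```

For readings clustered away from the seam this behaves almost exactly like a plain moving average, so it is a safe drop-in either way.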
