Audrey’s Status Report for 11/22

This week, I worked on fixing the encoders. I tested them and found that half of them don’t work. I believe this is because I accidentally fried them during my first round of testing by swapping the power and ground connections. Since the broken encoders are built into the motors, replacing them would mean ordering new ones, which would likely not arrive in time for the final demo or would leave me with little to no time to integrate them, so I decided to scrap the encoders for this project. I also helped other teammates with their work, such as helping test the UWB direction, understanding the LiDAR test code, and integrating LiDAR obstacle detection into the project for initial testing.

From this project, I’ve learned how to use Arduino microcontrollers, expanded my knowledge of embedded systems, and picked up new technical knowledge such as how LiDAR sensors work, Apple’s frameworks, and pathfinding algorithms. These skills came from researching topics online, watching tutorials, and extensive debugging.

 

Overall, I am a bit behind compared to the Gantt chart. Luckily, thanks to the allocated slack week and the Thanksgiving break, I will be able to dedicate my time to moving the project forward so that it is ready for the final demo. In this upcoming week, I plan on doing the final wiring for the robot, attaching the shopping basket to the robot, and helping with final testing.

Rose’s Status Report for 11/22

This week, I made a lot of progress on the navigation functionality and determining the direction/relative angle from one iPhone to the other (required for enabling the robot to follow the person). After several attempts to use UWB direction vectors (described in previous status reports), I implemented a GPS-based solution. Each iPhone gathers its own GPS coordinates and compass heading while also receiving the peer device’s GPS data. Using these values, I compute the bearing from my phone to the peer phone and then subtract my own heading to obtain an angle between -180° and 180°. This relative angle is packaged into a JSON message and sent through a UDP pipeline to the Raspberry Pi (currently my laptop during testing). On the Python side, I updated the listener to parse these new messages, pass the values to the robot controller, and filter incoming packets to ensure that only commands from the intended iPhone are used. I integrated this with the whole robot system, watched the motors turn left and right based on the iPhones’ positions, and read the live terminal logs, which show the relative angle updating in real time. A reason I had not used GPS earlier is that it is inaccurate indoors; based on the terminal logs, it is accurate to within about 5-10 meters on average. For now, this will have to suffice.
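
For reference, here is a minimal Python sketch of the relative-angle math described above (the real computation runs in Swift on the iPhone; the function names and example coordinates are illustrative):

    import math

    def bearing_deg(lat1, lon1, lat2, lon2):
        # Initial bearing from point 1 to point 2, in degrees clockwise from true north.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        x = math.sin(dlon) * math.cos(phi2)
        y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
        return math.degrees(math.atan2(x, y)) % 360.0

    def relative_angle_deg(my_lat, my_lon, my_heading, peer_lat, peer_lon):
        # Bearing to the peer minus my compass heading, wrapped to [-180, 180].
        angle = bearing_deg(my_lat, my_lon, peer_lat, peer_lon) - my_heading
        return (angle + 180.0) % 360.0 - 180.0

    # Peer slightly north-east of me while I face due north -> positive angle (turn right).
    print(relative_angle_deg(40.4433, -79.9436, 0.0, 40.4434, -79.9435))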

Throughout this project, the three main things I had to learn were iOS development, UWB, and networking. I learned Swift and how to build and deploy apps to physical iPhones, including Xcode setup, device provisioning, and handling permissions. I also extensively researched how Apple’s UWB and Nearby Interaction frameworks work, how they provide distance and direction, and what limitations they have. On the networking side, I studied Bluetooth, TCP, UDP, and IP to figure out how to connect the iPhones to the Raspberry Pi. I decided on a UDP-based pipeline, which required learning sockets, JSON message formats, and data handling. I also experimented with small Flask servers earlier on. Most of this learning came from Apple documentation, online tutorials, open-source examples, and hands-on debugging on real devices, including reading logs and iterating until everything worked.

I’m currently behind schedule because determining the direction between the two iPhones took much longer than expected due to numerous issues (described in previous status reports). To catch up, I’m focusing on making the GPS-based navigation pipeline more stable instead of trying to debug UWB. My immediate goals are to smooth the relative-angle values, reduce jitter, and clean up noisy GPS input so the robot receives more consistent turning commands.

Next week, I want to add some kind of filtering to the relative-angle calculations, either a moving average or a Kalman filter, and work with my teammates to improve the robot’s control logic so movements are smoother and less jerky. If time permits, I may revisit UWB to see whether I can troubleshoot it further or whether its data can be combined with GPS for better accuracy, but the primary objective is to finalize a reliable, functional navigation system based on the GPS progress I achieved.
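
As a concrete starting point, the moving-average option is only a few lines of Python (a sketch, not our final filter; the window size and sample values are arbitrary):

    from collections import deque

    class AngleSmoother:
        # Simple moving average over the last `window` relative-angle readings.
        def __init__(self, window=5):
            self.readings = deque(maxlen=window)

        def update(self, angle_deg):
            self.readings.append(angle_deg)
            return sum(self.readings) / len(self.readings)

    smoother = AngleSmoother(window=5)
    for raw in [12.0, 15.0, 90.0, 14.0, 13.0]:   # 90.0 simulates a GPS glitch
        print(round(smoother.update(raw), 1))

One caveat: a plain average misbehaves near the ±180° wrap-around, so we may need to average the sine/cosine components or unwrap the angle first.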

Team Status Report for 11/22

Overall Progress

  • Implemented direction/relative angle between the iPhones (GPS + compass headings, after UWB direction attempts failed)
  • Began integrating LiDAR into robot movement
  • Began integrating UWB into robot movement

Risks & Management 

The most significant risk our project faces is the final demo and final presentation deadline at the end of next week. We spent a lot of time trying to figure out how to obtain a direction value/relative angle, which caused us to fall behind on integration and testing. Thus, we plan to spend as much time as possible this upcoming week integrating all of the parts together.

Design Changes & Justification

Since the direction through UWB was not working, we switched to using GPS and the iPhones’ compass headings to calculate the direction of the phones relative to each other. This design change was not ideal, as GPS is not very accurate indoors, but calculating the direction based on GPS is better than not having a direction at all.

Elly’s Status Report for 11/22

This week, I finished setting up the Raspberry Pi by uploading all the code to it and started integrating the subsystems together. I spent some time refining the obstacle detection algorithm to ensure that it turns in the correct direction when there is an obstacle on the left or right. We decided to focus only on obstacles in front of the cart, so I made sure to filter out any obstacles detected behind the LiDAR. We also hooked up the Arduino and the Raspberry Pi to send commands based on the obstacle detection algorithm; however, we are running into issues with the connection, which we plan to smooth out next week.
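
As a rough illustration, the front-sector filtering and left/right decision look something like the Python sketch below (the angle convention, thresholds, and command strings are placeholders, not our exact code):

    FRONT_HALF_ANGLE = 90.0    # degrees; anything beyond this is behind the LiDAR and ignored
    OBSTACLE_RANGE_M = 1.0     # only react to obstacles closer than this

    def turn_command(scan):
        # scan: list of (angle_deg, distance_m) pairs, angle in [-180, 180] with 0 = straight
        # ahead and negative angles assumed to be on the left.
        front = [(a, d) for a, d in scan
                 if abs(a) <= FRONT_HALF_ANGLE and d <= OBSTACLE_RANGE_M]
        if not front:
            return "FORWARD"
        nearest_angle, _ = min(front, key=lambda p: p[1])
        # Obstacle on the left -> steer right, obstacle on the right -> steer left.
        return "TURN_RIGHT" if nearest_angle < 0 else "TURN_LEFT"

    print(turn_command([(-20.0, 0.6), (150.0, 0.3)]))   # the reading behind the cart is ignored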

I am still behind based on the Gantt chart, but I plan to spend as much time as possible finishing the integration of all the parts. Next week, I plan to have the LiDAR fully integrated with the motors and very basic obstacle detection working.

Throughout this project, I’ve learned a lot about LiDAR, obstacle detection, and pathfinding algorithms. This was my first time using pathfinding algorithms, and I had the chance to learn the differences between global and local navigation and to explore different options like A* and D*, evaluating the tradeoffs to decide what would work best for our project. To learn this, I read a lot of articles on pathfinding algorithms and watched videos of people building similar autonomous-following projects to get an idea of what technologies were available for us to use.
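
For reference, a toy grid-based A* in Python along the lines of what I studied (illustrative only, not our navigation code):

    import heapq

    def a_star(grid, start, goal):
        # 4-connected grid A*; cells equal to 1 are obstacles, 0 are free.
        def h(p):                                   # Manhattan-distance heuristic
            return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
        frontier = [(h(start), 0, start, [start])]  # (estimated total cost, cost so far, node, path)
        visited = set()
        while frontier:
            _, cost, node, path = heapq.heappop(frontier)
            if node == goal:
                return path
            if node in visited:
                continue
            visited.add(node)
            r, c = node
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                    heapq.heappush(frontier, (cost + 1 + h((nr, nc)), cost + 1,
                                              (nr, nc), path + [(nr, nc)]))
        return None   # no path found

    print(a_star([[0, 0, 0], [1, 1, 0], [0, 0, 0]], (0, 0), (2, 0)))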

Audrey’s Status Report for 11/15

This week, I worked on fixing the encoders. I connected the raw motor wires to DuPont-style connectors, allowing them to make a better connection with the breadboard. I wrote a program that uses the encoder interrupts to calculate the speed and direction of the robot. Unfortunately, I found that only one of the AB-phase encoders works, so I need to conduct further testing to determine whether this issue is specific to one motor or affects all of them. I also helped other teammates with their work, such as looking into the UWB direction issue. I also worked on the interim demo, ensuring the robot was ready for its first showcase to the world!
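
The speed/direction math itself is straightforward; here is a Python sketch of the conversion (the actual program runs on the microcontroller inside the interrupt handlers, and the tick count and wheel diameter below are illustrative):

    import math

    TICKS_PER_REV = 360        # encoder counts per wheel revolution (illustrative)
    WHEEL_DIAMETER_M = 0.10    # wheel diameter in meters (illustrative)

    def wheel_speed_mps(delta_ticks, dt_s):
        # Signed linear speed in m/s from the change in encoder count over dt_s seconds.
        # The sign of delta_ticks (set by the A/B phase order in the interrupt handler)
        # gives the direction of rotation.
        revs_per_s = (delta_ticks / TICKS_PER_REV) / dt_s
        return revs_per_s * math.pi * WHEEL_DIAMETER_M

    print(wheel_speed_mps(90, 0.1))   # quarter revolution in 0.1 s on a 10 cm wheel ≈ 0.79 m/s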

In terms of verification for the robot subsystem, I have been extensively testing individual components such as motors, encoders, and communication between microcontrollers or from a microcontroller to a computer. In the upcoming weeks, the robot subsystem will hopefully be fully verified.

 

Overall, I am a week behind compared to the Gantt chart. Due to the encoders not working properly, I will spend the first half of next week working on them. If the issue persists or the encoders don’t work by then, I will turn my focus to other aspects of the project, such as LiDAR integration, to help make faster progress in time for our final demo.

Team Status Report for 11/15

Overall Progress

  • Working on Raspberry Pi 5 setup
  • UWB distance measurement working, but no direction data yet
  • Encoder support on robot

 

Risks & Management 

A huge risk that our project faces is the UWB direction issue. Without the direction of our robot relative to the user, following the user will be extremely difficult. For example, since the user and the robot can be moving at the same time, the robot could end up moving in the opposite direction of the user and drift out of range, leading to frequent failures and inefficient movement. Thus, we are hoping to find a solution to the lack of direction data between the two phones. We have contacted our professors and TA and hope to get some guidance on how we can resolve this issue. We are also looking into using the iPhones’ cameras to calculate direction if needed.

 

Design Changes & Justification

No design changes have been made.

 

Validation

We have not been able to run any validation tests on our project since we do not have all the parts integrated together. However, once we do, we will analyze the data gathered from the testing plan outlined in our design report. For the 30 lb load requirement, we will confirm that the motors can successfully initiate movement, stop safely, and maintain maximum speed at all test weights. For the 4 mph maximum-speed requirement, we will reference encoder logs to monitor Basket Buddy’s speed. For the following distance, we will analyze the logged UWB outputs to make sure the distance falls within our expected 2 ft range with a margin of 6 in. Finally, for our safety requirements, we will analyze the timed results from both the app-stop and out-of-range scenarios to confirm the motor response time is below the threshold.

Elly’s Status Report for 11/15

This week, I focused on helping with integration for the interim demo. The Raspberry Pi 5 arrived on Wednesday, so for the rest of the week I focused on setting it up with WiFi and the microSD card. The Raspberry Pi will connect all the components together, such as the UWB connection and the LiDAR, and relay that information to the Teensy for motor commands. We plan to run the Raspberry Pi in headless mode by SSHing into it, since we only plan to program on it, but we are running into some issues with the WiFi.

In terms of progress, we are still behind on the Gantt chart, but most of the work from here is integration, which I will work on with my teammates. We have a slack week, which will hopefully be enough for us to catch up on all the work before the final demo. Next week, I hope to have the Raspberry Pi completely set up with all the necessary code running on it.

For verification of the subsystems I have been working on, I created a simulation to ensure that the pathfinding algorithm was working as expected and set a 2-foot following distance from the shopper in the code. I plan to do more extensive testing of the obstacle detection when we integrate the LiDAR with the motors, by marking a circle with a 1-foot radius around an obstacle and having the user walk toward the obstacle while the cart avoids the marked area. For the mobile app, we kept the user interface simple, with only two pages, to ensure that the app is intuitive to use. Finally, we tested how long it took for product details to populate in the app once a barcode was scanned, and it was under 500 ms.

Rose’s Status Report for 11/15

This week, I focused on getting our interim demo integrated and working. Since the Raspberry Pi did not arrive in time for the demos, I used my computer as a substitute.  I modified the mobile app to transmit UWB direction and distance data via UDP sockets instead of only printing to the console. I then wrote a Python listener script on my computer that received this UWB data in real time from the iPhones. This script forwarded the received data to another module responsible for communicating with the robot control system running on the Teensy. This pipeline allowed us to successfully demonstrate the system’s end-to-end functionality, showing that UWB data could be transmitted from the iPhones, through a UDP interface, and finally to the robot controller.
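
For context, the listener is essentially a small UDP socket loop like the Python sketch below (the port number and JSON field names are illustrative, and the forwarding call to the robot-control module is only indicated as a comment):

    import json
    import socket

    LISTEN_PORT = 5005   # illustrative port number

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", LISTEN_PORT))

    while True:
        data, addr = sock.recvfrom(1024)
        try:
            msg = json.loads(data.decode("utf-8"))
        except (UnicodeDecodeError, json.JSONDecodeError):
            continue                              # ignore malformed packets
        distance = msg.get("distance")            # meters from the UWB session (field name illustrative)
        direction = msg.get("direction")          # may be missing while direction is unsupported
        print(f"from {addr[0]}: distance={distance} direction={direction}")
        # forward_to_robot(distance, direction)   # hand off to the robot-control module (hypothetical)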

Obtaining direction data is an ongoing issue. I did more research on it this week and found no real solution. I integrated ARKit for camera assistance. Sources I read stated, “On iPhone, it’s possible to receive direction for nearby third-party accessories in sessions that enable isCameraAssistanceEnabled.” Unfortunately, all this did was make our distance measurements more accurate, without actually enabling direction.

At this stage, I am behind schedule because of the unresolved issue of receiving direction information over the UWB connection. The problem is software-related and tied to Apple’s Nearby Interaction framework limitations when using camera-assisted direction. To catch up, I plan to continue debugging with the assistance of the professors and TA, explore potential workarounds, and perhaps find a way to estimate relative device orientation. Once direction data is available, I will quickly integrate it into the UDP communication pipeline and test with the Raspberry Pi.

For verifying the UWB communication and data relay subsystem, I plan to measure UDP packet latency from the iPhone to the Raspberry Pi or computer to ensure responsive data transfer. I will also validate the accuracy of the distance measurements by comparing UWB distances against physical measurements. Once direction data becomes available, I will conduct angular accuracy tests to quantify directional error. Finally, I will run repeated tests to analyze the overall stability of the system, including packet loss, jitter, and consistency across the full pipeline.
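
One way to gather the loss and jitter numbers on the receiving side would be a small tracker like the Python sketch below, assuming we add a sequence number to each JSON packet (not yet part of our message format; class and field names are illustrative):

    import statistics
    import time

    class LinkStats:
        # Tracks packet loss and inter-arrival jitter, assuming each packet carries a
        # monotonically increasing sequence number.
        def __init__(self):
            self.last_seq = None
            self.last_time = None
            self.lost = 0
            self.intervals = []

        def record(self, seq):
            now = time.monotonic()
            if self.last_seq is not None:
                self.lost += max(0, seq - self.last_seq - 1)    # skipped sequence numbers
                self.intervals.append(now - self.last_time)     # time between arrivals
            self.last_seq, self.last_time = seq, now

        def summary(self):
            if not self.intervals:
                return {"lost": self.lost, "mean_interval_s": None, "jitter_s": None}
            return {"lost": self.lost,
                    "mean_interval_s": statistics.mean(self.intervals),
                    "jitter_s": statistics.pstdev(self.intervals)}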

‘direction’ variable documentation: https://developer.apple.com/documentation/nearbyinteraction/ninearbyobject/direction-4qh5w

Rose’s Status Report for 11/8

This week, I focused on debugging the direction measurement functionality for our UWB interaction. Currently, it only outputs distance values, but not direction or position, which is crucial for our demo.

Example output: (console screenshot omitted)

After looking into the supportsDirectionMeasurement property, I discovered that it is returning false on both our iPhone 15 Plus and iPhone 16 devices, even though both should theoretically support direction measurements, according to all the documentation I found online. They both have a U2 UWB chip that should support the collection of direction data.

I have spent a lot of time tracing through the entire app to check whether the issue is a result of the iOS version, session configuration, missing permissions, device settings,  or a mismatch in how the capabilities are enabled. 

Unfortunately, the Raspberry Pi 5 (which I ordered two weeks ago and was expected to arrive on Thursday) still has not arrived, so I have been unable to proceed with the full integration and testing. The missing hardware continues to be a major blocker for testing the iPhone receiver connection and for controlling the robot with the recorded data.

At this stage, I am behind schedule due to the ongoing delay in receiving the Raspberry Pi 5. Without it, I can’t run the communication pipeline between the iPhone and Pi. However, I’ve continued to make as much progress as possible on the software debugging side to minimize further delays once the hardware arrives. To catch up, I plan to put in extra hours next week and begin integration as soon as the Raspberry Pi arrives.

Nearby Interaction supportsDirectionMeasurement documentation: https://developer.apple.com/documentation/nearbyinteraction/nidevicecapability/supportsdirectionmeasurement?language=objc.

Team Status Report for 11/8

Overall Progress

  • Created LiDAR simulation for local navigation
  • Worked on motor movement

Risks & Management 

A huge risk that our project is facing is the Raspberry Pi not being here for our interim demo. We discovered almost two weeks ago that the Raspberry Pi 5 we were using was broken and promptly placed an order for a new one. Unfortunately, we have not received the replacement Raspberry Pi 5, which poses a significant roadblock for our interim demo since it is the component used for communication between our phones and the Teensy microcontroller. Since the Raspberry Pi 5 won’t be here in time for our demo, we are looking into using a Raspberry Pi 4 or using our computers as a stand-in for the demo. This would be a temporary solution, as we still plan on using the Pi 5 for the final demo.

Design Changes & Justification

No design changes have been made.