Audrey’s Status Report for 11/15

This week, I worked on fixing the encoders. I connected the raw motor wires to DuPont-style wire connectors, allowing them to establish a better connection with the breadboard. I wrote a program that uses the encoder interrupts to calculate the speed and direction of the robot. Unfortunately, I found an issue where only one of the AB phase encoders works, so I need to conduct further testing to determine whether the problem is specific to one motor or affects all of them. I also helped my teammates with their work, such as looking into the UWB direction issue, and I worked on the interim demo, making sure the robot was ready for its first showcase to the world!
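At its core, the interrupt routine does quadrature decoding: each rising edge on channel A is counted, the level of channel B at that moment gives the direction, and counting ticks over a fixed interval yields wheel speed. Below is a minimal sketch of that logic, written in Python purely for illustration; the real code runs as interrupt handlers on the Teensy, and the counts-per-revolution and wheel size shown are placeholder values.

```python
# Minimal sketch of AB (quadrature) encoder decoding, in Python for clarity;
# the actual version runs as interrupt handlers on the Teensy.
# COUNTS_PER_REV and WHEEL_CIRCUMFERENCE_M are placeholder values.

COUNTS_PER_REV = 360          # encoder counts per wheel revolution (assumed)
WHEEL_CIRCUMFERENCE_M = 0.40  # wheel circumference in meters (assumed)

count = 0          # signed tick count updated by the "interrupt"
direction = +1     # +1 forward, -1 reverse

def on_channel_a_rising(channel_b_level: int) -> None:
    """Called on each rising edge of channel A (an interrupt on the Teensy)."""
    global count, direction
    # The level of B at A's rising edge tells us which way the wheel turns.
    direction = +1 if channel_b_level == 0 else -1
    count += direction

def speed_mps(prev_count: int, curr_count: int, dt_s: float) -> float:
    """Wheel speed in m/s from tick counts sampled dt_s seconds apart."""
    revs = (curr_count - prev_count) / COUNTS_PER_REV
    return revs * WHEEL_CIRCUMFERENCE_M / dt_s
```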

In terms of verification for the robot subsystem, I have been extensively testing individual components such as motors, encoders, and communication between microcontrollers or from a microcontroller to a computer. In the upcoming weeks, the robot subsystem will hopefully be fully verified.

 

Overall, I am a week behind compared to the Gantt chart. Since the encoders are not working properly, I will spend the first half of next week on them. If the issue persists beyond that, I will shift my focus to other aspects of the project, such as LiDAR integration, to make faster progress in time for our final demo.

Team Status Report for 11/15

Overall Progress

  • Working on Raspberry Pi 5 setup
  • UWB distance measurement working, but no direction data yet
  • Encoder support on robot

 

Risks & Management 

A huge risk that our project faces is the UWB direction issue. Without the direction of our robot relative to the user, following the user will be extremely difficult. For example, since the user and the robot can both be moving at the same time, the robot could end up moving in the opposite direction from the user and drift out of range, leading to frequent following failures. Thus, we are hoping to find a solution to the lack of direction data between the two phones. We have contacted our professors and TA and hope to get some guidance on how to resolve this issue. We are also looking into using the iPhones' cameras to calculate direction if needed.

 

Design Changes & Justification

No design changes have been made.

 

Validation

We have not been able to run any validation tests on our project since we do not have all the parts integrated. However, once we do, we will analyze the data gathered from the testing plan we outlined in our design report. For the 30 lb load requirement, we will confirm that the motors can successfully initiate movement, stop safely, and maintain maximum speed at all test weights. For the maximum 4 mph speed requirement, we will reference encoder logs to monitor Basket Buddy's speed. For the following distance, we will analyze the logged UWB outputs to make sure it falls within our expected 2 ft range with a margin of 6 in. Finally, for our safety requirements, we will analyze the timed results from both the app-stop and out-of-range scenarios to confirm the motor response time is below the threshold.
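As a rough sketch of how that log analysis could look, the snippet below checks the speed cap, the following-distance window, and the stop response time against thresholds. The CSV file names, column names, and the exact stop-time threshold are assumptions for illustration, not our finalized tooling.

```python
# Hypothetical post-processing of test logs against our requirements.
# File names, column names, and MAX_STOP_RESPONSE_S are assumptions.

import csv

MAX_SPEED_MPH = 4.0
TARGET_FOLLOW_FT = 2.0
FOLLOW_MARGIN_FT = 0.5        # 6 in margin
MAX_STOP_RESPONSE_S = 0.5     # placeholder threshold

def column(path, name):
    """Read one numeric column from a CSV log."""
    with open(path, newline="") as f:
        return [float(row[name]) for row in csv.DictReader(f)]

speeds = column("encoder_log.csv", "speed_mph")
dists = column("uwb_log.csv", "distance_ft")
stops = column("stop_tests.csv", "response_s")

print("Speed cap OK:", max(speeds) <= MAX_SPEED_MPH)
print("Follow distance OK:",
      all(abs(d - TARGET_FOLLOW_FT) <= FOLLOW_MARGIN_FT for d in dists))
print("Stop response OK:", max(stops) <= MAX_STOP_RESPONSE_S)
```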

Elly’s Status Report for 11/15

This week, I focused on helping with integration for the interim demo. The Raspberry Pi 5 came on Wednesday, so for the rest of the week, I focused on setting it up with WiFi and the microSD card. The Raspberry Pi will connect all the components together, such as the UWB connection and the LiDAR, and relay that information to the Teensy for motor commands. We plan to set up the Raspberry Pi in headless mode by SSH-ing into it, since we only plan to program on it, but we are running into some issues with the WiFi.

In terms of progress, we are still behind on the Gantt chart, but most of the work from here is integration, which I will work on with my teammates. We have a slack week, which will hopefully be enough for us to catch up on all the work before the final demo. Next week, I hope to have the Raspberry Pi completely set up, with all the necessary code running on it.

For verification of the subsystems that I have been working on, I created a simulation to ensure that the pathfinding algorithm was working as expected and set a 2-foot following distance from the shopper in the code. I plan to do more extensive testing of the obstacle detection once we integrate the LiDAR with the motors, by marking a circle with a 1-foot radius around an obstacle and having the user walk toward the obstacle while the cart avoids the marked area. For the mobile app, we kept the user interface simple, with only 2 different pages, to ensure that the app is intuitive to use. Finally, we tested how long it took for product details to populate in the app once a barcode was scanned, and it was under 500 ms.
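For reference, the following-distance logic in the simulation amounts to a simple gate like the sketch below; the gain and speed cap shown are illustrative placeholders rather than the exact values in our code.

```python
# Sketch of the 2 ft following-distance gate used in the simulation.
# The proportional gain and cap are illustrative placeholder values.

FOLLOW_DISTANCE_M = 0.61   # 2 ft in meters
K_SPEED = 0.8              # proportional gain (assumed)
MAX_SPEED_MPS = 1.79       # roughly the 4 mph cap

def follow_speed(distance_to_shopper_m: float) -> float:
    """Return a forward speed command given the distance to the shopper."""
    error = distance_to_shopper_m - FOLLOW_DISTANCE_M
    if error <= 0:
        return 0.0                      # inside the 2 ft bubble: hold position
    return min(K_SPEED * error, MAX_SPEED_MPS)
```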

Rose’s Status Report for 11/15

This week, I focused on getting our interim demo integrated and working. Since the Raspberry Pi did not arrive in time for the demos, I used my computer as a substitute.  I modified the mobile app to transmit UWB direction and distance data via UDP sockets instead of only printing to the console. I then wrote a Python listener script on my computer that received this UWB data in real time from the iPhones. This script forwarded the received data to another module responsible for communicating with the robot control system running on the Teensy. This pipeline allowed us to successfully demonstrate the system’s end-to-end functionality, showing that UWB data could be transmitted from the iPhones, through a UDP interface, and finally to the robot controller.
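A simplified version of that listener is sketched below: it binds a UDP socket, decodes the distance/direction payload, and hands it off to the module that talks to the Teensy. The port number, JSON payload format, and forwarding function are assumptions for illustration and may differ from the actual script.

```python
# Simplified sketch of the UDP listener that ran on my computer.
# The port, payload format, and forwarding interface are illustrative.

import json
import socket

UDP_PORT = 5005  # assumed port

def forward_to_robot_controller(distance_m, direction):
    """Placeholder for the module that relays data to the Teensy."""
    print(f"distance={distance_m:.2f} m, direction={direction}")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", UDP_PORT))

while True:
    data, addr = sock.recvfrom(1024)
    # Example payload from the iPhone app: {"distance": 1.3, "direction": null}
    msg = json.loads(data.decode())
    forward_to_robot_controller(msg["distance"], msg.get("direction"))
```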

Obtaining direction data is an ongoing issue. I did more research on it this week and have found no real solution. I integrated ARKit for camera assistance. Sources I read stated, "On iPhone, it's possible to receive direction for nearby third-party accessories in sessions that enable isCameraAssistanceEnabled." Unfortunately, all this did was make our distance measurements more accurate, without actually enabling direction.

At this stage, I am behind schedule because of the unresolved issue of receiving direction information over the UWB connection. This problem is software-related and tied to the limitations of Apple's Nearby Interaction framework when using camera-assisted direction. To catch up, I plan to continue debugging with the assistance of the professors and TA, explore potential workarounds, and perhaps find a way to estimate relative device orientation. Once direction data is available, I will quickly integrate it into the UDP communication pipeline and test with the Raspberry Pi.

For verifying the UWB communication and data relay subsystem, I plan to measure UDP packet latency from the iPhone to the Raspberry Pi or computer to ensure responsive data transfer. I will also validate the accuracy of the distance measurements by comparing UWB distances against physical measurements. Once direction data becomes available, I will conduct angular accuracy tests to quantify directional error. Finally, I will run repeated tests to analyze the overall stability of the system, including packet loss, jitter, and consistency across the full pipeline.
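One practical way to collect the loss and jitter numbers is to have the phone tag each packet with a sequence number and have the receiver compute inter-arrival gaps and missing sequence numbers; true one-way latency would also require synchronized clocks, so round-trip or inter-arrival timing is the more realistic measurement. A sketch of the receiver side is below, with the port and the "seq" payload field as assumptions.

```python
# Sketch of receiver-side measurement of packet loss and inter-arrival jitter
# for the UWB-over-UDP pipeline. The port and "seq" field are assumptions.

import json
import socket
import time

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5005))

last_seq = None
last_arrival = None
lost = 0
gaps = []

for _ in range(1000):                       # sample 1000 packets
    data, _ = sock.recvfrom(1024)
    now = time.monotonic()
    seq = json.loads(data.decode())["seq"]
    if last_seq is not None:
        lost += max(0, seq - last_seq - 1)  # missing sequence numbers
        gaps.append(now - last_arrival)     # inter-arrival time
    last_seq, last_arrival = seq, now

mean = sum(gaps) / len(gaps)
jitter = (sum((g - mean) ** 2 for g in gaps) / len(gaps)) ** 0.5
print(f"lost={lost}, mean gap={mean*1000:.1f} ms, jitter={jitter*1000:.1f} ms")
```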

‘direction’ variable documentation: https://developer.apple.com/documentation/nearbyinteraction/ninearbyobject/direction-4qh5w

Rose’s Status Report for 11/8

This week, I focused on debugging the direction measurement functionality for our UWB interaction. Currently, it only outputs distance values, but not direction or position, which is crucial for our demo.

Example output: the app currently logs distance readings only; direction and position come back empty.

After looking into the supportsDirectionMeasurement property, I discovered that it is returning false on both our iPhone 15 Plus and iPhone 16 devices, even though both should theoretically support direction measurements, according to all the documentation I found online. They both have a U2 UWB chip that should support the collection of direction data.

I have spent a lot of time tracing through the entire app to check whether the issue is a result of the iOS version, session configuration, missing permissions, device settings, or a mismatch in how the capabilities are enabled.

Unfortunately, the Raspberry Pi 5 (which I ordered two weeks ago and was expected to arrive on Thursday) still has not arrived, so I have been unable to proceed with the full integration and testing. The missing hardware continues to be a major blocker for testing the iPhone receiver connection and for controlling the robot with the recorded data.

At this stage, I am behind schedule due to the ongoing delay in receiving the Raspberry Pi 5. Without it, I can’t run the communication pipeline between the iPhone and Pi. However, I’ve continued to make as much progress as possible on the software debugging side to minimize further delays once the hardware arrives. To catch up, I plan to put in extra hours next week and begin integration as soon as the Raspberry Pi arrives.

Nearby Interaction supportsDirectionMeasurement documentation: https://developer.apple.com/documentation/nearbyinteraction/nidevicecapability/supportsdirectionmeasurement?language=objc.

Team Status Report for 11/8

Overall Progress

  • Created LiDAR simulation for local navigation
  • Worked on motor movement

Risks & Management 

A huge risk that our project is facing is the Raspberry Pi not being here for our interim demo. Almost two weeks ago, we discovered that the Raspberry Pi 5 we were using was broken and promptly placed an order for a new one. Unfortunately, we have not received the replacement Raspberry Pi 5, which poses a significant roadblock for our interim demo since it is the component used for communication between our phones and the Teensy microcontroller. Since the Raspberry Pi 5 won't be here in time for our demo, we are looking into using a Raspberry Pi 4 or our computers as a stand-in for the demo. This would be a temporary solution, as we still plan on using the Pi 5 for the final demo.

Design Changes & Justification

No design changes have been made.

Elly’s Status Report for 11/8

This week, I worked on creating a Pygame simulation to observe the behavior of the cart when using the virtual force algorithm. The simulation works by grabbing 20 LiDAR scans of the environment and creating a virtual map of it. From there, the shopping cart is added into the environment and the computer's mouse acts as the shopper. As the mouse moves, the shopping cart, which is visualized as a red dot, attempts to follow the mouse and navigate around any potential obstacles. When the mouse is far away from the cart, the cart speeds up to catch up to it. For the most part, the virtual force algorithm does a good job of navigating the obstacles, but there are cases in which the cart gets stuck.
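The skeleton of the simulation loop looks roughly like the sketch below: the mouse position is the shopper, and the cart's step size grows with distance up to a cap. The LiDAR map loading and the repulsive (obstacle) forces are omitted here, and the constants are illustrative rather than the values in my actual simulation.

```python
# Skeleton of the Pygame follow simulation: the mouse is the shopper and the
# red dot is the cart. Obstacle repulsion and map loading are omitted.

import pygame

pygame.init()
screen = pygame.display.set_mode((800, 600))
clock = pygame.time.Clock()
cart = pygame.math.Vector2(400, 300)

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    shopper = pygame.math.Vector2(pygame.mouse.get_pos())
    to_shopper = shopper - cart
    if to_shopper.length() > 40:                  # ~2 ft at this pixel scale (assumed)
        # Step size scales with distance, capped to mimic the speed limit.
        step = min(0.05 * to_shopper.length(), 6.0)
        cart += to_shopper.normalize() * step

    screen.fill((255, 255, 255))
    pygame.draw.circle(screen, (200, 0, 0), (int(cart.x), int(cart.y)), 8)
    pygame.display.flip()
    clock.tick(60)

pygame.quit()
```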

I am still behind on progress since we have not been able to integrate the parts together; the Raspberry Pi we ordered still has not arrived. Instead, I have been focusing on getting the obstacle detection to work on its own so there will be less debugging when integrating with the motors. Hopefully our Raspberry Pi will be delivered next week so we can make progress on integration.


Audrey’s Status Report for 11/8

This week, I worked on the robot's movement. I received new H-Bridges this week to replace the faulty ones I was using last week. I tested the connection points between the Teensy microcontroller and the motors and got all 4 motors to work synchronously. I wrote functions to make the robot move in various directions. I also tested the motors' encoders and found an error with them. I believe it was due to the connection between the Teensy microcontroller and the motors, so I ordered a DuPont connector kit that I hope will solve this issue. Finally, I wrote code that handles interrupts and reads commands from a USB device to control the robot's movements. This mimics the communication between the Raspberry Pi 5 and the Teensy controller, since we don't have a Raspberry Pi 5 yet.
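The computer side of that USB link can be as simple as a small script that writes one-character movement commands over the serial port; the sketch below (using pyserial, with an assumed port name and command set) illustrates the idea rather than reproducing my actual code.

```python
# Sketch of the computer-side stand-in for the Raspberry Pi 5: it sends simple
# movement commands to the Teensy over USB serial. The port name and the
# one-letter command set ('f', 'b', 'l', 'r', 's') are assumptions.

import serial  # pyserial

teensy = serial.Serial("/dev/ttyACM0", 115200, timeout=1)  # assumed port

def send_command(cmd: str) -> None:
    """Send a single-character movement command (e.g. 'f' = forward)."""
    teensy.write(cmd.encode())

send_command("f")   # drive forward
send_command("s")   # stop
```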

 

Overall, I am a week behind compared to the Gantt chart. I realized that encoders and PID control aren't needed for the interim demo, so I am focusing on other aspects of the project in order to be demo-ready. Next week, I will work on presenting our interim demo and, once the Raspberry Pi 5 arrives, regain progress on the communication between the microcontrollers. I also plan on figuring out the issue with the encoders.

Rose’s Status Report for 11/1

This week, I worked on connecting the receiver iPhone to the Raspberry Pi to begin transferring UWB position data to the cart’s control system. Unfortunately, I discovered that the Raspberry Pi 5 we borrowed from the ECE inventory has a broken SD card holder, which prevents it from booting or running any code. It had old solder on the pins, which likely means that others have also previously tried to fix it with no success. I immediately ordered a replacement that should arrive next week (Amazon Prime). In the meantime, I decided to make more progress with the software side. I wrote the Flask server code that will run on the Raspberry Pi to handle incoming data from the receiver iPhone. Although the code has not been tested yet due to the hardware issue, it is fully written and ready for deployment (and testing) once the new Pi arrives. 
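A stripped-down sketch of that server is below: a single POST endpoint that accepts JSON position updates from the receiver iPhone and stores the latest reading for the control loop. The route name and payload fields are assumptions for illustration; the deployed code will differ.

```python
# Minimal sketch of the Flask server intended to run on the Raspberry Pi and
# receive UWB position updates from the receiver iPhone. The route and JSON
# fields are assumptions for illustration.

from flask import Flask, request, jsonify

app = Flask(__name__)
latest = {}  # most recent reading, for the control loop to consume

@app.route("/uwb", methods=["POST"])
def uwb_update():
    # Example payload: {"distance": 1.2, "direction": null}
    data = request.get_json(force=True)
    latest.update(data)
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```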

I also spent some time researching the trade-offs between using Bluetooth and HTTP as the communication method between the receiver iPhone and the Raspberry Pi. I found that HTTP over Wi-Fi has higher data throughput and stability for streaming continuous position updates, while Bluetooth has lower power consumption and a simpler pairing method, as it does not rely on a Wi-Fi connection. However, Bluetooth has limited bandwidth and higher latency, which makes it less reliable for real-time data updates. Based on this, I decided that using HTTP over Wi-Fi is ultimately the more practical choice for our project at this stage.

Overall, I remain behind compared to the Gantt chart because the broken Raspberry Pi delayed my progress. Next week, once the new Raspberry Pi arrives, I plan to set up the Flask server, test end-to-end communication between the iPhone and Raspberry Pi, and look into how we can translate those coordinates into motion controls without obstacle avoidance for a preliminary prototype. My goal is to catch up on lost time and have the full UWB-to-cart communication working for the interim demo.

Team Status Report for 11/1

Overall Progress

  • Tested LiDAR sensor
  • Tested H-Bridges for motor control

Risks & Management 

One of the risks for the project is the issue with the H-Bridges. Since the ones we received were faulty, we were unable to test the motors or make significant progress on the robot's movement. Thus, we bought new H-Bridges to replace the faulty ones and are looking into getting the originals refunded. If the new H-Bridges don't work as intended, we are considering buying a robot car kit online to remove the risk of compatibility issues between components, so that our interim demo can show the code's progress.

Additionally, the Raspberry Pi 5 we borrowed from the ECE inventory had a broken SD card holder, which prevented us from testing the receiver iPhone to Raspberry Pi connection. To manage this, we immediately ordered a new Raspberry Pi 5 and continued to work on the software side of the communication. If the new Raspberry Pi experiences delays or additional hardware issues arise, we can temporarily use a laptop-based server for testing.

Design Changes & Justification

In addition to using D* Lite for pathfinding, we also decided to test the virtual force algorithm for local navigation as a way to avoid obstacles. This algorithm uses an attractive vector pointing from the robot to the next position and a repulsive vector pointing from each obstacle to the robot. These vectors are summed to produce the direction of motion. We decided to use this approach because it provides reactive, real-time obstacle avoidance; D* Lite by itself only provides the long-term goal of getting the cart to the user, rather than the short-term motions.
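Concretely, the combined force can be computed as in the sketch below: an attractive vector toward the next waypoint plus repulsive contributions away from nearby obstacles, with the sum normalized to give the heading. The gains and influence radius are placeholder values, not tuned parameters from our implementation.

```python
# Sketch of the virtual force (potential field) heading computation.
# Gains and the obstacle influence radius are placeholder values.

import math

K_ATTRACT = 1.0         # attractive gain (assumed)
K_REPULSE = 0.5         # repulsive gain (assumed)
INFLUENCE_RADIUS = 1.0  # obstacles farther than this are ignored (meters)

def heading(robot, goal, obstacles):
    """Return a unit (dx, dy) direction of motion, or (0, 0) if balanced."""
    # Attractive vector: robot -> goal.
    fx = K_ATTRACT * (goal[0] - robot[0])
    fy = K_ATTRACT * (goal[1] - robot[1])
    # Repulsive vectors: obstacle -> robot, stronger when closer.
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < INFLUENCE_RADIUS:
            scale = K_REPULSE * (1.0 / d - 1.0 / INFLUENCE_RADIUS) / d**2
            fx += scale * dx
            fy += scale * dy
    norm = math.hypot(fx, fy)
    return (fx / norm, fy / norm) if norm > 1e-9 else (0.0, 0.0)
```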