Team Status Report for 10/18

Overall Progress

  • Finished design report
  • Finalized implementation details for the project
  • Ordered robot car kit

Risks & Management 

The design report took longer than expected, and dedicating last week to it put all of us behind in our other work. We will have to make up for last week’s work this week.

Due to a communication error, we realized that our robot car kit, Teensy 4.1 microcontroller, and H-bridges were never ordered. We discovered this on Wednesday and promptly ordered the parts the same day. Unfortunately, because of Fall Break, all group members were out of town when the parts arrived, so we have been unable to make progress on the hardware side of the project. This mistake poses a significant risk to our Gantt chart, as it delays the work by a whole week. However, we scheduled a slack week right before the interim demo, so as long as we stick to our schedule from here on, we should still have a working prototype for the interim demo.

Design Changes & Justification

There were no changes on the hardware side. On the software side, we decided to use a local database for product barcodes, names, and prices instead of an API; relying on an external API risked missing names and prices, which would hurt the user experience, and the local database does not incur any additional cost.

Additionally, all hardware and software connection methods have been researched and are detailed in our design report. We have collaboratively created this block diagram:

Product Solution Meeting Needs

Part A was written by Rose, Part B by Audrey, and Part C by Elly.

Part A: Global Factors

Around the world, grocery shopping can be physically demanding and time-consuming, especially for people with limited mobility, older adults, or parents managing children while they shop. Basket Buddy addresses this by reducing the effort needed to push and steer a cart, making shopping more convenient and enjoyable for everyone.

In addition to making shopping more accessible, Basket Buddy also fits into the global trend of automation and smart retail technology. As more stores move toward self-checkout and contactless shopping, our project contributes to this shift by introducing a smart, user-friendly cart that improves both convenience and safety. With UWB and LiDAR-based navigation, it should have reliable performance in many types of indoor retail environments, making it adaptable to stores of different layouts and sizes around the world.

Part B: Cultural Factors

Basket Buddy reflects cultural values such as independence, community care, and accessibility. Many cultures view assisting others as a moral good; Basket Buddy supports this by assisting people with their shopping. It also reflects cultural beliefs in equality and inclusion, allowing everyone to shop without stigma or dependence on others. Additionally, through its built-in safety measures, Basket Buddy respects the moral and legal expectations of safety held in most communities.

Part C: Environmental Factors

The environmental impact of Basket Buddy lies mainly in the manufacturing and energy consumption of the cart. Unlike a standard shopping cart, Basket Buddy requires additional electronics, sensors, motors, and batteries. Manufacturing these components is resource-intensive, requiring more raw materials and carrying a larger carbon footprint than a traditional cart. Furthermore, if a company replaced the traditional shopping carts in its grocery stores with Basket Buddy, maintaining the carts would consume considerable energy: the carts would require constant charging, making them a significant consumer of electricity, and replacement parts when electronic components break down, producing more waste. Stores adopting Basket Buddy would therefore need a plan for charging and maintenance to keep the carts from generating excess waste.

However, Basket Buddy also gives shoppers the opportunity to be more environmentally conscious about their purchases. Viewing items in the mobile app can guide shoppers toward more sustainable choices, and a digital checkout process eliminates the need for paper receipts. More features could also be added to Basket Buddy, such as showing expiration dates to prevent food waste or highlighting products with eco-friendly packaging or local sourcing.

Rose’s Status Report for 10/18

This week, I primarily worked on finishing the design report with my teammates and finalizing several of the software-side design choices related to the UWB functionality. I decided that the best way to send information from the iPhone receiver to the Raspberry Pi would be through a local Wi-Fi HTTP bridge. In this setup, the Pi runs a lightweight HTTP server on the local network, and the mobile app on the receiver’s iPhone sends JSON packets containing UWB position data using HTTP requests.
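As a rough sketch of what the Pi side of this bridge could look like (the `/uwb` route and the JSON field names below are placeholders we have not finalized):

```python
# Minimal sketch of the Pi-side HTTP bridge. The iPhone app is assumed to
# POST JSON like {"distance": 1.2, "direction": [0.0, 0.0, -1.0]};
# the route name and field names are placeholders, not our final protocol.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

latest_position = {}  # most recent UWB reading, read by the navigation loop


class UWBHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/uwb":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        latest_position.update(payload)      # hand off to the planner
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, fmt, *args):       # quiet the default request logging
        pass


def run(port=8000):
    HTTPServer(("0.0.0.0", port), UWBHandler).serve_forever()
```

The navigation loop would then poll `latest_position` rather than blocking on the network.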

I also conducted additional research into LiDAR drivers for the Raspberry Pi to understand better how sensor data will be processed in our system. Our RPLIDAR will connect to the Raspberry Pi via USB, and its driver allows the Pi to read and process distance and angle data from the sensor. This data is then converted into a 2D map showing nearby obstacles and open spaces. I explored various software options for running the RPLIDAR on the Raspberry Pi, including the official SDK, a lightweight Python library, and a ROS-compatible driver for real-time mapping. This research helped me better understand how the LiDAR subsystem will integrate with the rest of our navigation and obstacle-avoidance software.
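As a sketch of the polar-to-grid conversion this subsystem will perform (the scan tuple shape matches the Python rplidar library's `(quality, angle_deg, distance_mm)` format; the grid size and resolution here are illustrative, not our final values):

```python
# Convert one RPLIDAR scan of (quality, angle_deg, distance_mm) tuples into a
# small 2D occupancy grid centered on the robot. Grid dimensions and cell
# size are illustrative placeholders.
import math


def scan_to_grid(scan, size=40, cell_m=0.1):
    """Mark cells containing obstacles; the robot sits at the grid center."""
    grid = [[0] * size for _ in range(size)]
    cx = cy = size // 2
    for _quality, angle_deg, dist_mm in scan:
        if dist_mm == 0:                     # 0 means "no return" on RPLIDAR
            continue
        r = dist_mm / 1000.0                 # mm -> meters
        x = r * math.cos(math.radians(angle_deg))
        y = r * math.sin(math.radians(angle_deg))
        gx = cx + int(round(x / cell_m))
        gy = cy + int(round(y / cell_m))
        if 0 <= gx < size and 0 <= gy < size:
            grid[gy][gx] = 1                 # occupied
    return grid
```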

According to our Gantt chart, my progress is behind schedule, as the goal for this stage was to finalize our mobile app and have the beginnings of shopper tracking over UWB. I was unable to dedicate as much time to these goals since we were working extensively on the design report. 

To get back on track, I plan to finalize the mobile app next week and ensure the UWB session is fully functional between two iPhones. I will also focus on establishing the connection between the receiver iPhone and the Raspberry Pi, which involves setting up the Wi-Fi HTTP server on the Pi and linking the iPhone to it. Once this is in place, we’ll have a clearer understanding of how the data transmission and processing will work within the overall system.

Audrey’s Status Report for 10/18

This week, I focused on the design report. I finalized all aspects of communication and physical connections between the hardware components on the robot, and roughly laid out where each component will sit in the robot’s block diagram. I also redid all of the calculations in the design report and rechecked them against the numbers I used in the design presentation.

Because the robot car kit, Teensy 4.1 microcontroller, and H-bridges were only ordered last week and the following week was a break, I am behind. This upcoming week, I plan on working extra hard to finish assembling the robot car kit and the core programming of the robot. This includes the communication between the Teensy 4.1 and Raspberry Pi (UART) and the naive movement of the wheels (PWM module). I plan on pushing the encoder module and PID control back a week, since they aren’t part of the robot car’s core functionality. Pushing these modules back a week ensures that Rose and Elly can work on their parts without falling behind due to my error.
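As a sketch of what the Pi-to-Teensy UART link could carry, here is a tiny framed packet format; the command set, framing bytes, and checksum are placeholders we have not finalized. On the Pi, the encoded bytes would go out over pyserial (e.g. `serial.Serial("/dev/ttyACM0", 115200).write(...)`), with matching parsing on the Teensy.

```python
# Placeholder framed packet format for the Pi -> Teensy UART link:
# start byte, two signed 16-bit PWM duty values, one checksum byte.
import struct

START = 0xAA


def encode_cmd(left_pwm, right_pwm):
    """Pack signed wheel PWM values (-255..255) with a trailing checksum."""
    body = struct.pack("<hh", left_pwm, right_pwm)
    checksum = (START + sum(body)) & 0xFF
    return bytes([START]) + body + bytes([checksum])


def decode_cmd(packet):
    """Return (left_pwm, right_pwm), or None if framing/checksum is bad."""
    if len(packet) != 6 or packet[0] != START:
        return None
    if (START + sum(packet[1:5])) & 0xFF != packet[5]:
        return None
    return struct.unpack("<hh", packet[1:5])
```

The checksum lets the Teensy drop a corrupted frame instead of acting on garbage wheel commands.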

Elly’s Status Report for 10/18

I worked on finishing the design report this week and finalizing design choices and implementation details for the software. We decided to use a local database built with Apple’s Core Data framework instead of the Open Food Facts API, because an open-source database could be missing the proper product name and price for certain items, which would lead to a bad user experience. We will instead create our own database with products from a local grocery store or an online store such as Amazon to simulate the database that a grocery store would have.

Another detail we finalized was the pathfinding. We will be using D* Lite due to its strengths in dynamic environments and its simplicity compared to D*. The LiDAR coordinates will be converted into a cost matrix that is fed into the algorithm as input. In terms of resources, there is published pseudocode and there are a few Python implementations of the algorithm that we will reference for this project.
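As a rough sketch of the planning step over the cost matrix: D* Lite's initial pass is equivalent to a backward Dijkstra search from the goal, so the sketch below shows only that first pass (the incremental replanning that makes D* Lite efficient when the map changes is omitted). Cell values are traversal costs and `None` marks an obstacle; both conventions are illustrative.

```python
# Backward Dijkstra over a cost matrix: the cost-to-go table this produces
# is what D* Lite computes on its first pass (replanning omitted here).
import heapq


def backward_costs(cost, goal):
    """Dijkstra from the goal: cost-to-go for every reachable cell."""
    rows, cols = len(cost), len(cost[0])
    dist = {goal: 0.0}
    pq = [(0.0, goal)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > dist.get((r, c), float("inf")):
            continue                          # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and cost[nr][nc] is not None:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return dist


def next_step(dist, start):
    """Greedy descent on cost-to-go gives the next cell toward the goal."""
    r, c = start
    options = [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    reachable = [p for p in options if p in dist]
    return min(reachable, key=dist.__getitem__, default=None)
```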

I am behind schedule, as I had planned to have the mobile app finalized last week but was unable to work on it while finishing the design report. To get back on track, I plan to spend this week finishing the mobile app by creating the local database and finalizing the UI. Unfortunately, this means I will need to push back the implementation of LiDAR and obstacle detection, but since there is a slack week, I should be able to get it done by the interim demo.

Team Status Report for 10/4

Overall Progress

  • Ordered all parts for project
  • Received Raspberry Pi 5 and LiDAR Sensor
  • Integrated barcode scanner and OpenFoodFacts API into mobile app
  • Started UWB location tracking between two Apple devices
  • Set up Raspberry Pi 5 and VS Code remote SSH

Risks & Management 

One potential risk is the parts not arriving in time. Since much of our project relies on the hardware, if we do not have enough time to ensure its accuracy and reliability, the rest of the project’s timeline can fall behind. We hope the parts will arrive early next week. If they do not show up on time, we will prepare and test the systems that don’t depend on them so that everything is ready when the parts do arrive.

Design Changes & Justification

Earlier this week, we realized that the third-party UWB module we wanted to use would not work unless we signed up for the Apple MFi Program. The UWB module is compatible with Apple devices, but we want to send the iPhone’s location to the Raspberry Pi, and Apple does not allow its devices to send this information to third-party devices unless you are part of the MFi Program. The program is aimed at engineers looking to mass-produce a product, which does not apply to us. After we consulted Professor Spielberg, he suggested we could use another iPhone in place of the UWB module, since sending location information between iPhones should be allowed. He also suggested we talk to other faculty to see whether any of them are part of the MFi Program. Using another iPhone does not increase our costs, but if Basket Buddy were a real product, it would be an issue.

After feedback and some discussion, we decided to remove the macro-tracking aspect of our mobile app to clarify our use case and target user. We realized the nutritional information didn’t add significant value and muddied the focus of the project. Basket Buddy is for shoppers who want a convenient shopping experience, rather than health-conscious shoppers.

Rose’s Status Report for 10/04

This week, I focused on integrating UWB technologies into our iOS app to enable shopper location tracking between devices. I set up the Nearby Interaction framework in Xcode, adding the required capabilities and permissions (Nearby Interaction, Local Network Usage). Then, I implemented code to initialize an NISession and use MultipeerConnectivity to exchange discovery tokens between iPhones. This allows two devices to establish a UWB session and start getting distance and direction information, acting as the foundation for the indoor location tracking we are aiming for.

I also built out the flow within the app so that tapping the “Connect” button on the Welcome screen initializes the UWB session. After this, I deployed the app onto two iPhones. The app now builds and runs on the two devices, simultaneously advertising and browsing for peers. Terminal and Xcode logs confirm that peer discovery and UWB session initialization attempts are working, meaning I’ve created that initial communication layer.

According to our Gantt chart, my progress is on schedule, as the goal for this stage was to establish a working UWB connection between two devices.

Next week, I plan to finalize the app and get the UWB session fully functional between two iPhones. My immediate goal is to have the devices successfully exchange discovery tokens and run a stable session, then output real-time distance and direction data that we can use to start building code for the Raspberry Pi and the controls on Basket Buddy. Additionally, if I have extra time, I’ll look into how we can integrate LiDAR sensors into our system.

Mobile App Local Network Connection on Two iPhones:

Audrey’s Status Report for 10/4

This week, I set up the Raspberry Pi 5. The Raspberry Pi 5 kit we received didn’t include a microSD card, so I ordered one and borrowed a friend’s card in the meantime. Rather than running VS Code directly on the Pi, I set up VS Code remote SSH into the Raspberry Pi 5 to upload code and got that workflow working. Since I am mostly waiting on parts to arrive, I also looked further into the compatibility of the other parts and into the Teensy 4.1 firmware and manual pages. Because the majority of my work is scheduled for next week, I tried to get ahead by investigating potential issues I might face. I also presented the Design Presentation to the class and reviewed the feedback we received in the Q&A.

Since this week is mostly dedicated to early software development and the Design Presentation, the hardware/firmware section I am working on is still on track for the project schedule.

Next week, I hope to build the basics for the robot car kit. I also hope to get the motors to start moving so that the car can move forward and backward, and potentially side to side or turning.

Elly’s Status Report for 10/4

I worked on implementing a barcode scanner for the mobile app and connected its output to the OpenFoodFacts database via their API to get the title of the product. The barcode scanner uses the AVFoundation framework: I created a ViewController that manages an AVCaptureSession, which prompts the user for camera permission and scans the barcode. Scanning outputs a string of digits, which is used to make a request to OpenFoodFacts to look up the product name. The product name is then added to a list that the user can view. One issue that needs to be fixed is that the list of items disappears when the user exits the app, so I need to persist the session state. Another issue is that OpenFoodFacts does not provide the price of a product, since prices vary by store. We therefore need to decide how to set the price for each product and whether OpenFoodFacts is appropriate for our project or whether we should create our own database.
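The app itself is in Swift, but the lookup flow can be mirrored in Python against OpenFoodFacts' v0 product endpoint; the "not found" handling below is our own choice, not part of the API.

```python
# Mirror of the app's barcode -> product-name lookup against the
# OpenFoodFacts v0 product endpoint. Parsing is split out so it can be
# exercised without a network connection.
import json
from urllib.request import urlopen

API = "https://world.openfoodfacts.org/api/v0/product/{}.json"


def product_url(barcode):
    return API.format(barcode)


def parse_product(response):
    """Pull the product name out of an OpenFoodFacts response dict."""
    if response.get("status") != 1:          # status 1 = product found
        return None
    return response.get("product", {}).get("product_name") or None


def lookup(barcode):
    """Network call: fetch and parse the product for a scanned barcode."""
    with urlopen(product_url(barcode)) as r:
        return parse_product(json.load(r))
```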

I also researched pathfinding algorithms for our shopping cart. Last week, I identified a potential obstacle-avoidance algorithm, and for the cart to follow the user, we decided to use D*. D* is designed for changing environments, so if the cart detects a new obstacle via LiDAR, the algorithm can replan the path without recalculating it from scratch.

Next week, I plan to fix the issues I mentioned above with the mobile app and get a fully functional mobile app. If there is additional time, I hope to start doing research on LiDAR as we have the sensor and see what kind of information we can get from it.

Barcode Scanner Resources:

https://medium.com/@jpmtech/how-to-make-a-qr-or-barcode-scanner-in-swiftui-68d8dae8e908 

https://medium.com/@ios_guru/swiftui-and-custom-barcode-scanner-f3daaeabfcea 

Barcode Scanner Demo:

https://vimeo.com/1124535079?share=copy 

D* Resources:

https://engineering.miko.ai/path-planning-algorithms-a-comparative-study-between-a-and-d-lite-01133b28b8b4 

https://www.ri.cmu.edu/pub_files/pub3/stentz_anthony__tony__1994_2/stentz_anthony__tony__1994_2.pdf 

Rose’s Status Report for 9/27

This week, I created the Figma mock-ups for the mobile app, expanding on the initial layout Elly developed last week, and used those designs as the basis for the app framework in Xcode. I now have the app running on my personal device in developer mode. For this, I learned the basics of SwiftUI, and I also explored how Apple’s Nearby Interaction framework can be integrated to support UWB-based tracking. I also began researching how to add a barcode-scanning feature, which will require using the phone’s camera and requesting the appropriate permissions.

My progress is on schedule and slightly ahead of plan. I started experimenting with the barcode-scanning functionality earlier than expected: I have obtained camera permissions and am now working out how to have the app read barcodes.

By next week, we will be done with our design presentation and will have chosen all of our parts. I aim to have the barcode scanner working within the app and to complete a more detailed UI that incorporates all of the elements outlined in the Figma prototypes. Although the development of the Nearby Interaction feature depends on the arrival of the UWB module and Raspberry Pi, there is still plenty of mobile-app work to finish first, so this dependency is not a concern at this stage.

Figma Design:  https://www.figma.com/design/ilDMCTtHt1EF02QuBBTmXb/Basket-Buddy?node-id=0-1&t=qWuDfUQTPLnCH1fg-1

Mobile App: https://vimeo.com/1122556942?share=copy

Mobile App Repository: https://github.com/rosel26/basket-buddy

Elly’s Status Report for 9/27

This week I worked on expanding the design of our project in preparation for the design presentation. Since we decided that Bluetooth and GPS would not be viable for location tracking, I helped research other methods, and our team decided to use UWB. From there, I looked into Apple’s Nearby Interaction framework, which supports UWB, and picked out the UWB module (DWM3001CTR13) that would be placed on the cart. The module and the iPhone allow a UWB connection, which can be used to ensure that the shopping cart stays within range of the shopper.

I also researched how to integrate LiDAR and pathfinding for our cart. We decided to use a 2D 360° LiDAR sensor connected to a Raspberry Pi 5. From there, I looked into how we could combine the information from that LiDAR sensor with the information from the UWB module to direct the movement of the cart. Robot Operating System (ROS), a popular framework for building robot applications, is one option we plan on using, though it may not run well on Macs.

I also read a research paper called “Development of Human Following Mobile Robot System Using Laser Range Scanner,” which details how the authors built a human-following robot with LiDAR. Their collision avoidance function outlines a collision detection area; if an obstacle lies inside it, the obstacle is given its own circular collision avoidance area. The function then calculates a collision avoidance vector that is tangent to the circular avoidance area and points in the same general direction as the shopper’s travel, and the robot moves along this vector. Since this is very similar to our project, we plan on using a similar function for our cart, integrating the shopper coordinates we get from UWB.
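The tangent construction from the paper can be sketched geometrically: from the robot's position, take the tangent to the obstacle's circular avoidance area that best matches the shopper's direction of travel. The radius and tie-breaking rule below are illustrative, not the paper's exact parameters.

```python
# Hedged sketch of the collision-avoidance vector: the tangent direction to
# the obstacle's circular avoidance area that best aligns with the shopper's
# direction of travel. All parameters are illustrative.
import math


def avoidance_vector(robot, obstacle, radius, shopper_dir):
    """Unit vector tangent to the avoidance circle, biased toward shopper_dir."""
    dx, dy = obstacle[0] - robot[0], obstacle[1] - robot[1]
    d = math.hypot(dx, dy)
    if d < 1e-9:
        return shopper_dir                 # degenerate: on top of the obstacle
    if d <= radius:
        return (-dx / d, -dy / d)          # inside the circle: back straight out
    theta = math.atan2(dy, dx)             # bearing to the obstacle center
    alpha = math.asin(radius / d)          # half-angle subtended by the circle
    candidates = [(math.cos(a), math.sin(a))
                  for a in (theta + alpha, theta - alpha)]
    # Of the two tangents, keep the one most aligned with the shopper's travel.
    return max(candidates,
               key=lambda v: v[0] * shopper_dir[0] + v[1] * shopper_dir[1])
```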

We are currently on track for this week, since most of the focus is on the design and the design presentation. Although the function from the research paper is a good option, there are more pathfinding algorithms worth exploring, so for next week, I plan on finalizing a pathfinding algorithm.