Rose’s Status Report for 9/27

This week, I created the Figma mock-ups for the mobile app, expanding on the initial layout Elly developed last week, and used those designs as the basis for the app framework in Xcode. I now have the app running on my personal device in developer mode. For this, I learned the basics of SwiftUI, and I also explored how Apple’s Nearby Interaction framework can be integrated to support UWB-based tracking. I also began researching how to add a barcode-scanning feature, which will require using the phone’s camera and requesting the appropriate permissions.
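
As a rough sketch of how the Nearby Interaction piece might look once the UWB module arrives (the class name is a placeholder, and the accessory configuration data would have to come from the DWM3001 module over a side channel we have not built yet):

```swift
import NearbyInteraction

// Hypothetical sketch of the UWB ranging piece of the app.
// CartRangeMonitor is a placeholder name; the accessory configuration
// data must come from the UWB module once we actually have it.
final class CartRangeMonitor: NSObject, NISessionDelegate {
    private var session: NISession?

    func start(withAccessoryData data: Data) {
        guard NISession.deviceCapabilities.supportsPreciseDistanceMeasurement else {
            print("This device does not support UWB ranging")
            return
        }
        do {
            let config = try NINearbyAccessoryConfiguration(data: data)
            let session = NISession()
            session.delegate = self
            session.run(config)
            self.session = session
        } catch {
            print("Failed to create accessory configuration: \(error)")
        }
    }

    // Called whenever the framework has new distance estimates for the cart.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let distance = nearbyObjects.first?.distance else { return }
        print("Cart is \(distance) m away")
    }
}
```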

My progress is on schedule and slightly ahead of plan: I started experimenting with the barcode-scanning functionality earlier than expected. Camera permissions are already set up, and I am now working out how to have the app read barcodes from the live camera feed.
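
Since the barcode scanner is the next feature, here is a minimal sketch of the AVFoundation approach I am planning to try, assuming the camera-usage description is added to Info.plist; the class name is a placeholder and the preview-layer/UI wiring is omitted:

```swift
import AVFoundation

// Minimal sketch of camera permission + barcode detection via AVFoundation.
// Error handling, preview layer, and threading details are simplified.
final class BarcodeScanner: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    private let session = AVCaptureSession()

    func start() {
        AVCaptureDevice.requestAccess(for: .video) { granted in
            guard granted else { return }   // user denied camera access
            DispatchQueue.main.async { self.configureSession() }
        }
    }

    private func configureSession() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.ean13, .ean8, .upce]   // common grocery barcodes

        session.startRunning()   // in a real app, start this off the main thread
    }

    // Called whenever a barcode is recognized in the camera feed.
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        if let code = (metadataObjects.first as? AVMetadataMachineReadableCodeObject)?.stringValue {
            print("Scanned barcode: \(code)")
        }
    }
}
```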

By next week, we will be done with our design presentation and will have chosen all of our parts. I aim to have the barcode scanner working within the app and to complete a more detailed UI that incorporates all of the elements outlined in the Figma prototypes. Although the development of the Nearby Interaction feature depends on the arrival of the UWB module and Raspberry Pi, there is still plenty of mobile-app work to finish first, so this dependency is not a concern at this stage.

Figma Design:  https://www.figma.com/design/ilDMCTtHt1EF02QuBBTmXb/Basket-Buddy?node-id=0-1&t=qWuDfUQTPLnCH1fg-1

Mobile App: https://vimeo.com/1122556942?share=copy

Mobile App Repository: https://github.com/rosel26/basket-buddy

Elly’s Status Report for 9/27

This week I worked on expanding the design of our project in preparation for the design presentation. Since we decided that Bluetooth and GPS would not be a viable solution for location tracking, I helped research other tracking methods, and our team decided to use UWB. From there, I looked into Apple’s Nearby Interaction framework, which supports UWB, and picked out the UWB module (DWM3001CTR13) that will be placed on the cart. The module establishes a UWB connection with the iPhone, which we can use to ensure that the shopping cart stays within range of the shopper.

I also researched how to integrate LiDAR and pathfinding for our cart. We decided to use a 2D 360° LiDAR sensor connected to a Raspberry Pi 5. From there, I looked into how we could combine the information from the LiDAR sensor with the information from the UWB module to direct the movement of the cart. We plan to use the Robot Operating System (ROS), a popular framework for building robot applications, although it may not run well on Macs, which could cause issues during development. I read a research paper called “Development of Human Following Mobile Robot System Using Laser Range Scanner,” which details how the authors created a human-following robot with LiDAR. They describe a collision avoidance function that defines a collision detection area in front of the robot; any obstacle inside that area is given its own circular collision avoidance area. The function then computes a collision avoidance vector that is tangent to the obstacle’s avoidance circle and points in roughly the same direction as the shopper, and the robot travels along that vector. Since this is very similar to our project, we plan to use a similar function for our cart, integrating the shopper coordinates we get from UWB; a rough sketch of the tangent-vector geometry is included below.
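
To make the geometry concrete, here is a rough sketch of the tangent-vector computation, assuming 2D coordinates for the robot, the obstacle, and the shopper’s direction; the function name, the avoidance radius, and the choice of Swift are placeholders rather than the paper’s or our final implementation:

```swift
import Foundation
import simd

// Rough sketch of the tangent-based avoidance vector described in the paper,
// adapted to 2D coordinates we would get from UWB + LiDAR. Names are placeholders.
func avoidanceDirection(robot: SIMD2<Double>,
                        obstacle: SIMD2<Double>,
                        avoidanceRadius: Double,
                        shopperDirection: SIMD2<Double>) -> SIMD2<Double> {
    let toObstacle = obstacle - robot
    let d = simd_length(toObstacle)

    guard d > avoidanceRadius else {
        // Already inside the avoidance circle: back straight away from the obstacle.
        return simd_normalize(-toObstacle)
    }

    // Angle toward the obstacle center and the half-angle subtended by its circle.
    let theta = atan2(toObstacle.y, toObstacle.x)
    let alpha = asin(avoidanceRadius / d)

    // The two directions tangent to the avoidance circle.
    let left  = SIMD2(cos(theta + alpha), sin(theta + alpha))
    let right = SIMD2(cos(theta - alpha), sin(theta - alpha))

    // Choose the tangent that points most nearly toward the shopper.
    let goal = simd_normalize(shopperDirection)
    return simd_dot(left, goal) >= simd_dot(right, goal) ? left : right
}
```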

We are currently on track for this week, since most of the focus has been on the design and the design presentation. Although the function from the research paper is a good option, there are more pathfinding algorithms that can be explored, so for next week I plan to finalize our pathfinding algorithm.

Team Status Report for 9/27

Overall Progress

  • Created our Project Design Slides
  • Created the initial mobile app framework with a basic UI in place
  • Picked out parts for TA approval and review

Risks & Management 

A major risk that came up this week is that GPS does not work accurately indoors. After verifying this feedback, we chose a different shopper-tracking mechanism: Ultra-Wideband (UWB). This is expanded on further in our design changes.

When searching for parts to purchase for our project, we realized that a lot of the robot chassis and other necessary components would arrive, at the earliest, after Fall break. Thus, we had to work around this and choose parts that would specifically arrive on time, primarily from Amazon, and integrate well with the rest of our systems.

A hardware risk that has emerged is that the motors in the robot kit might not have enough torque to let the robot cart maintain its maximum speed while carrying 30 lbs of groceries. Since these motors are included in the robot kit, we plan to use them for basic testing and upgrade them afterward if needed. We are also looking into other motor options to buy if necessary. We will have to consider how new motors would fit into the robot’s design – for example, how to mount them if the hole patterns are different, and how to attach the motor coupler to the axle if it has a different diameter. We also need to check whether the new motors have different current or voltage requirements, and whether our H-bridge can supply them safely without damaging itself.

Another risk we face is that the robot’s chassis is rated to support a load of only 5-10 kg (11-22 lbs). This could be an issue since one of our use-case requirements is that the cart can hold up to 30 lbs. If needed, we will look into reinforcing the chassis, wheels, and axles to support more weight.

Design Changes & Justification

One major change we made was to use UWB instead of Bluetooth and GPS for shopper location tracking. Since iPhones already include UWB hardware, and we plan to host the mobile app on an iPhone, we decided to use UWB together with Apple’s Nearby Interaction framework for more accurate indoor tracking. This adds another component to our project, as we need to purchase a UWB module for the cart.

Another change we made was integrating the barcode scanner into our mobile app and removing the macro tracking. We wanted to better connect the two halves of our project, the autonomous following and the item tracking, and decided to move barcode scanning onto the phone to add more functionality to the mobile app. We also decided to remove macro tracking since it did not provide a strong use case for our users. Since we no longer need to buy a standalone barcode scanner, this helps offset the cost of the UWB module.

There are no major changes to our schedule.


Product Solution Meeting Needs

A was written by Audrey, B was written by Elly, and C was written by Rose.

Part A: Public Health, Safety, and Welfare 

Our solution meets the needs of users’ health and welfare by relieving stress on joints and back and lessening fatigue from pushing a heavy cart for a long time while grocery shopping. Instead, users can freely roam around the grocery store, and the robot will follow them. This also frees the hands of users, making shopping more accessible to those with disabilities, allowing elderly users with arthritis to avoid overextension, and allowing parents to shop while caring for children. 

Our solution meets the needs of users’ safety by using LiDAR obstacle avoidance to prevent collisions with shelves, people, or other carts. Using the UWB module, the cart will follow the assigned user and not get confused in crowded spaces. Additionally, automatic stopping when the assigned user is outside of range will ensure the safety of bystanders. 

Part B: Social Factors

Basket Buddy connects to social factors because it changes how people interact with grocery stores. By creating a simpler shopping experience in which users no longer have to push the cart and have an easier way of keeping track of their items, Basket Buddy reduces the stress of grocery shopping and broadens access to it. This allows shoppers to focus on social interactions with other customers rather than worrying about keeping track of their shopping cart. The built-in barcode scanner and item-tracking feature support economic decision-making, reflecting the social and cultural factors around budgeting and diet; different groups of people may use this feature to manage their shopping finances and dietary restrictions. Additionally, the mobile app feature that lets users generate a barcode totaling all the items in their cart creates a faster checkout process, with less time spent in lines. Features like obstacle avoidance also help reduce disruptions and conflicts in crowded aisles.

Part C: Economic Factors

Basket Buddy is designed to make use of readily available components. By using parts such as a Raspberry Pi, a Teensy microcontroller, and a UWB module, it avoids the expense of custom fabrication. This approach cuts down on one-time engineering work and helps the system move from prototype to store deployment faster, which ultimately lowers the cost per unit when the cart fleet is rolled out.

Day-to-day operating expenses are also minimal. The cart handles store navigation itself, and item tracking is done on the user’s device, so the cart system does not depend on costly networking or cloud services, reducing recurring data and hosting fees. Its one-hour battery life is sized to match a typical shopping trip without overspending on oversized power systems. These choices mean retailers can introduce the carts without major increases to their operating budgets, while shoppers gain a more convenient and time-saving experience. Over time, that efficiency and the improved customer experience can boost store traffic and sales, making the carts a worthwhile investment.

Audrey’s Status Report for 9/27

I worked on picking out the specific hardware components for this project, including the robot chassis, wheels, motors + encoders, H-bridges, transformer/buck converter, and power supply.

For the robot chassis, I looked into kits that could support 30 lbs of load, but unfortunately none of them would arrive in time. I had to compromise and pick a chassis that can hold 22 lbs, and I plan to reinforce the chassis and axles if needed. The robot kit I settled on includes 4 omni wheels and motors with encoders.

When choosing a motor, I also looked into torque and RPM and did calculations to ensure the max speed can reach 4 mph (calculated from the robot kit’s wheel diameter and the motor’s RPM) and that the motors can supply enough torque to produce roughly 3 N of driving force at the wheels (calculated from the maximum load of groceries); a worked example of these checks is sketched below. I found some motors that fit this specification, but since the robot kit already includes motors, and the new motors come with other factors to consider – such as how to attach them to the robot (which motor brackets) and axles (differing axle sizes), and whether the H-bridge can power them safely – I decided not to order them immediately. Instead, I plan to test the initial PID and motor movement using the motors in the kit and upgrade to the new motors once I work out solutions to these problems.
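
As a worked example of those two checks (the wheel diameter, RPM, and force values below are placeholders, not the actual specs of the kit or of the candidate motors):

```swift
import Foundation

// Worked example of the speed/torque checks described above.
// The wheel diameter, motor RPM, and drive force are placeholder values;
// swap in the real kit specs to redo the check.
let wheelDiameterMeters = 0.10       // hypothetical 10 cm omni wheel
let motorRPM = 350.0                 // hypothetical no-load speed
let requiredDriveForceNewtons = 3.0  // ~3 N of drive force from the max-load estimate

// Top speed: one wheel revolution covers pi * diameter meters.
let speedMetersPerSecond = Double.pi * wheelDiameterMeters * (motorRPM / 60.0)
let speedMph = speedMetersPerSecond * 2.23694
print(String(format: "Max speed: %.2f mph (requirement: >= 4 mph)", speedMph))

// Torque needed at the wheel to produce the required drive force:
// torque = force * wheel radius.
let requiredTorqueNewtonMeters = requiredDriveForceNewtons * (wheelDiameterMeters / 2.0)
print(String(format: "Required torque per wheel: %.3f N·m", requiredTorqueNewtonMeters))
```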

As previously mentioned, I researched H-bridges to power the motors. I had to ensure they could safely handle a peak of 28A if the motors from the robot kit stalled.

My progress is on track this week. Since I can test the robot’s motor control and PID with the motors included in the kit and upgrade to stronger motors later, the open motor question does not put me behind schedule.

Next week, I will be doing the design presentation. I will also be looking more into the motor issue and hopefully solving that by the end of the week.

Elly’s Status Report for 9/20

Our project will contain a mobile app that allows shoppers to view a list of the items in the cart along with their nutritional facts. This week, I researched how to develop mobile apps and made a draft of the UI. The home page will be broken up into three sections: the first shows the total number of items, calories, and price at the top; the next shows the macro breakdown with the total grams of protein, carbs, and fat; and the third lists each product with its price and number of calories. When the user taps the gear in the top right, it brings them to the controls page, which gives the user the option to start, stop, and disconnect from the cart.
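
As a rough SwiftUI skeleton of that three-section layout (the type names and sample fields are placeholders for illustration, not the final design):

```swift
import SwiftUI

// Hypothetical skeleton of the home page described above.
// CartItem and the property names are placeholders.
struct CartItem: Identifiable {
    let id = UUID()
    let name: String
    let price: Double
    let calories: Int
}

struct HomeView: View {
    let items: [CartItem]

    private var totalCalories: Int { items.reduce(0) { $0 + $1.calories } }
    private var totalPrice: Double { items.reduce(0) { $0 + $1.price } }

    var body: some View {
        NavigationStack {
            List {
                // Section 1: totals across the whole cart.
                Section("Totals") {
                    Text("Items: \(items.count)")
                    Text("Calories: \(totalCalories)")
                    Text(String(format: "Price: $%.2f", totalPrice))
                }
                // Section 2: macro breakdown (protein / carbs / fat).
                Section("Macros") {
                    Text("Protein / Carbs / Fat totals go here")
                }
                // Section 3: each product with price and calories.
                Section("Items") {
                    ForEach(items) { item in
                        HStack {
                            Text(item.name)
                            Spacer()
                            Text(String(format: "$%.2f", item.price))
                            Text("\(item.calories) cal")
                        }
                    }
                }
            }
            .navigationTitle("Basket Buddy")
            .toolbar {
                // Gear button leading to the cart controls page.
                NavigationLink(destination: Text("Controls")) {
                    Image(systemName: "gearshape")
                }
            }
        }
    }
}
```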

To develop the app, I plan to use Xcode with Swift and SwiftUI to run the app locally on a phone. Core Bluetooth will be used to connect and facilitate communication between the app and the shopping cart, and Core Data will be used to make a local database of products.
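
A minimal sketch of how the Core Bluetooth connection to the cart might start, assuming the cart advertises a BLE service; the service UUID and class name below are placeholders, not the cart’s real identifiers:

```swift
import CoreBluetooth

// Minimal sketch of scanning for and connecting to the cart over BLE.
// The service UUID is a placeholder, not the cart's real identifier.
final class CartConnection: NSObject, CBCentralManagerDelegate {
    private let cartServiceUUID = CBUUID(string: "FFE0")   // placeholder
    private var central: CBCentralManager!
    private var cartPeripheral: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    // Scanning can only start once Bluetooth is powered on.
    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: [cartServiceUUID])
        }
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        cartPeripheral = peripheral   // keep a strong reference
        central.stopScan()
        central.connect(peripheral)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        print("Connected to cart: \(peripheral.name ?? "unknown")")
        // Next: discover services/characteristics to exchange start/stop commands.
    }
}
```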

Next week, I will start coding the basic framework for the app’s UI, which keeps us on track with our schedule.


Rose’s Status Report for 9/20

This week, I focused on figuring out how we would implement our shopper-tracking system. In particular, I researched how we could use Bluetooth- and GPS-based location tracking within a mobile app environment. On the hardware side, I identified that we would need to connect a Bluetooth module, a GPS module, and a compass module to our chosen controller (the Raspberry Pi). On the software side, I explored how our mobile app would need to function as an IoT application, constantly collecting data from and sending data to connected devices. This included reviewing how IoT applications manage real-time communication and data flow while maintaining reliable connections.

According to our Gantt chart, I’m on schedule. The goal for this week was to complete the proposal presentation and begin researching different mechanisms of our system to prepare for finalizing the design and ordering our hardware components – which I’ve done.

Next week, I plan to:

  • Experiment with existing IoT software platforms, including Blynk, to test Bluetooth connections (using components I already own).
  • Work with the team to research and select the specific hardware parts we need to order (Bluetooth, GPS, and compass modules).

Team Status Report for 9/20

Overall Progress

  • Created and presented our Project Proposal Presentation.
  • Researched more about hardware and software components 
  • Drafted a general layout of our project

Risks & Management 

We received a question during our proposal presentation asking what happens when the user is out of range. We have decided on the following steps:

  1. The cart detects when the user is out of range → We decided that the maximum distance the user can be from the cart before it goes into a safety stop is 6 feet, while during normal following the cart tries to maintain a distance of 2 feet from the user.
  2. If the user moves outside the six-foot range, the cart stops → the user receives a phone notification that they have left their cart behind and should return to within 6 feet of it.
  3. To reinitialize the cart, the user must return to within the six-foot range.

In addition to the cart’s automatic safety stop when the user is out of range, the mobile app will also have an option to manually stop the cart and to manually resume it once the user is back within the six-foot range. This lets users pause the cart before entering crowded areas where it might not be able to navigate and resume it after exiting; a rough sketch of this stop/resume logic follows.
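
A hypothetical sketch of the follow / safety-stop / manual-pause logic, using the thresholds above; the state names and function are placeholders, and the real implementation will live wherever the cart’s control loop runs:

```swift
// Hypothetical sketch of the cart's follow / safety-stop / manual-pause logic.
// Distances would come from the shopper-tracking system; names are placeholders.
enum CartState {
    case following
    case safetyStopped   // user went beyond the 6 ft limit
    case manuallyPaused  // user paused the cart from the app
}

let maxRangeFeet = 6.0   // beyond this, the cart performs a safety stop

func nextState(current: CartState,
               userDistanceFeet: Double,
               pausePressed: Bool,
               resumePressed: Bool) -> CartState {
    if pausePressed { return .manuallyPaused }
    switch current {
    case .following:
        // Out of range: stop and (elsewhere) send the phone notification.
        return userDistanceFeet > maxRangeFeet ? .safetyStopped : .following
    case .safetyStopped:
        // Reinitialize once the user returns to within the 6 ft range.
        return userDistanceFeet <= maxRangeFeet ? .following : .safetyStopped
    case .manuallyPaused:
        // Manual resume is only honored when the user is back within 6 ft.
        return (resumePressed && userDistanceFeet <= maxRangeFeet) ? .following : .manuallyPaused
    }
}
```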

Another question we received is whether the cart’s route is mapped to a pre-determined grocery store layout:

  • We designed the cart to follow the user, so the route is not based on a predetermined grocery store layout. Ideally, the user should be able to use this cart in any grocery store.

Design Changes & Justification

  • We added new features to the mobile app based on questions from the proposal (explained above). These additions help catch edge cases involving unexpected user behavior, improve safety for users and bystanders, and should not add any additional cost to the project.
  • Other than that, no major design changes were made to the system.

Audrey’s Status Report for 9/20

I worked on selecting the microcontrollers that will control the motors and host the software side of the project, including the web app and LiDAR mapping. During this research, I gained a deeper understanding of the various specifications and performance capabilities that microcontrollers are designed to handle. I ensured that the microcontrollers I selected would meet the sensor and computational requirements of the project. I decided on using the Teensy 4.1 for the low-level motors and encoders, since it supports real-time feedback and low latency. I also decided on the Raspberry Pi 4 for the more computationally expensive and less time-critical tasks, such as the LiDAR mapping, obstacle detection algorithms, and web app.

According to the Gantt chart, I am still on track to complete the fully fleshed-out hardware portion of the design report, due in roughly two weeks.

Next week, I hope to pick out the specific components, such as the motors and wheel diameter, the LiDAR sensor, etc. I will look at multiple industry-standard options, weighing factors like compatibility, cost, and torque, to determine which ones best meet the requirements of this project.