Elly’s Status Report for 11/8

This week, I worked on creating a Pygame simulation to observe the behavior of the cart when using the virtual force algorithm. The simulation works by grabbing 20 LiDAR scans of the environment and building a virtual map from them. From there, the shopping cart is added into the environment and the computer's mouse acts as the shopper. As the mouse moves, the shopping cart, visualized as a red dot, attempts to follow it while navigating around any obstacles. When the mouse is far away from the cart, the cart speeds up to catch up. For the most part, the virtual force algorithm does a good job of avoiding obstacles, but there are cases in which the cart gets stuck.
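Below is a minimal sketch of the kind of per-frame force computation the simulation performs. The gain constants, the repulsion radius, and the function name are illustrative placeholders, not our tuned values.

```python
import math

# Illustrative constants; the simulation's actual values are tuned by hand.
K_ATT = 1.0        # attraction toward the mouse (the shopper)
K_REP = 4000.0     # repulsion away from nearby obstacles
REP_RADIUS = 80.0  # obstacles beyond this distance exert no force

def virtual_force(cart, mouse, obstacles):
    """Return the (fx, fy) force acting on the cart this frame."""
    # The attractive force grows with distance, which is why the cart
    # speeds up when the mouse is far away.
    fx = K_ATT * (mouse[0] - cart[0])
    fy = K_ATT * (mouse[1] - cart[1])

    # Each nearby obstacle pushes the cart away, with a strength that
    # falls off as the square of the distance.
    for ox, oy in obstacles:
        dx, dy = cart[0] - ox, cart[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < REP_RADIUS:
            fx += K_REP * dx / d**3  # (dx/d) unit vector scaled by 1/d^2
            fy += K_REP * dy / d**3
    return fx, fy
```

A typical integration step would then move the cart along this vector each frame, capped at a maximum speed. The stuck cases are consistent with the known local-minimum weakness of potential-field methods, where the attractive and repulsive terms cancel each other out.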

I am still behind schedule because we cannot integrate the parts together until the Raspberry Pi we ordered arrives. Instead, I have been focusing on getting the obstacle detection to work on its own so there will be less debugging when integrating with the motors. Hopefully, the Raspberry Pi will be delivered next week so we can make progress on integration.

Demo

Audrey’s Status Report for 11/8

This week, I worked on the robot's movement. I received new H-Bridges that replaced the faulty ones I was using last week, tested the connection points between the Teensy microcontroller and the motors, and got all 4 motors to work synchronously. I wrote functions to make the robot move in various directions. I also tested the motors' encoders and found an error with them. I believe it was due to the connection between the Teensy microcontroller and the motors, so I ordered a Dupont connector kit that I hope will solve this issue. Finally, I wrote code that handles interrupts and reads commands from a USB device to control the movements of the robot. This mimics the communication between the Raspberry Pi 5 and the Teensy, since we don't have a Raspberry Pi 5 yet.
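As a rough illustration of the host side of that USB link, here is a sketch using the pyserial package; the single-byte command set and the port name are hypothetical stand-ins, not our actual protocol.

```python
import serial  # pyserial

# Hypothetical port and baud rate; a Teensy usually enumerates as
# something like /dev/ttyACM0 on Linux.
PORT = "/dev/ttyACM0"
BAUD = 115200

# Hypothetical one-byte commands the Teensy firmware could parse:
# F = forward, B = backward, L = left, R = right, S = stop.
def send_command(ser: serial.Serial, cmd: bytes) -> None:
    ser.write(cmd)
    ser.flush()

if __name__ == "__main__":
    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        send_command(ser, b"F")  # drive forward
        print(ser.readline())    # optional acknowledgment from the Teensy
        send_command(ser, b"S")  # stop
```

Once the Raspberry Pi 5 arrives, the same script should run there unchanged, which is the point of mimicking the link now.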

Overall, I am a week behind compared to the Gantt chart. I realized that the encoders and PID control aren't needed for the interim demo, so I am focusing on other aspects of the project in order to be demo-ready. Next week, I will work on presenting our interim demo and, once the Raspberry Pi 5 arrives, on regaining progress on the communication between the Raspberry Pi and the Teensy. I also plan on figuring out the issue with the encoders.

Rose’s Status Report for 11/1

This week, I worked on connecting the receiver iPhone to the Raspberry Pi to begin transferring UWB position data to the cart's control system. Unfortunately, I discovered that the Raspberry Pi 5 we borrowed from the ECE inventory has a broken SD card holder, which prevents it from booting or running any code. It had old solder on the pins, which suggests that others have previously tried, without success, to fix it. I immediately ordered a replacement that should arrive next week (Amazon Prime). In the meantime, I made more progress on the software side: I wrote the Flask server code that will run on the Raspberry Pi to handle incoming data from the receiver iPhone. Although the code has not been tested yet due to the hardware issue, it is fully written and ready for deployment (and testing) once the new Pi arrives.
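As a sketch of what such a server looks like, here is a minimal Flask endpoint. The route name, JSON fields, and port are illustrative assumptions rather than the exact code that will be deployed.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
latest_position = {"x": 0.0, "y": 0.0}  # most recent UWB reading

@app.route("/position", methods=["POST"])  # hypothetical route name
def update_position():
    # The receiver iPhone POSTs JSON like {"x": 1.2, "y": 0.4}.
    data = request.get_json(force=True)
    latest_position["x"] = float(data["x"])
    latest_position["y"] = float(data["y"])
    return jsonify(status="ok")

@app.route("/position", methods=["GET"])
def get_position():
    # The cart's control loop can poll this endpoint for the latest reading.
    return jsonify(latest_position)

if __name__ == "__main__":
    # Listen on all interfaces so the iPhone can reach the Pi over Wi-Fi.
    app.run(host="0.0.0.0", port=5000)
```

The iPhone would then stream updates by POSTing to http://<pi-address>:5000/position several times per second.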

I also spent some time researching the trade-offs between using Bluetooth and HTTP as the communication method between the receiver iPhone and the Raspberry Pi. I found that HTTP over Wi-Fi has higher data throughput and stability for streaming continuous position updates, while Bluetooth has lower power consumption and a simpler pairing method, as it does not rely on a Wi-Fi connection. However, Bluetooth has limited bandwidth and higher latency, which makes it less reliable for real-time data updates. Based on this, I decided that using HTTP over Wi-Fi is ultimately the more practical choice for our project at this stage.

Overall, I remain behind compared to the Gantt chart because the broken Raspberry Pi delayed my progress. Next week, once the new Raspberry Pi arrives, I plan to set up the Flask server, test end-to-end communication between the iPhone and Raspberry Pi, and look into how we can translate those coordinates into motion controls without obstacle avoidance for a preliminary prototype. My goal is to catch up on lost time and have the full UWB-to-cart communication working for the interim demo.

Team Status Report for 11/1

Overall Progress

  • Tested LiDAR sensor
  • Tested H-Bridges for motor control

Risks & Management 

One risk for the project is the issue with the H-Bridges. Since they arrived faulty, we were unable to test the motors or make significant progress on the robot's movement. We bought new H-Bridges to replace them and are looking into getting the faulty ones refunded. If the new H-Bridges don't work as intended, we are considering buying a robot car kit online to remove the risk of compatibility issues between components, so that our interim demo can show the code's progress.

Additionally, the Raspberry Pi 5 we borrowed from the ECE inventory had a broken SD card holder, which prevented us from testing the connection between the receiver iPhone and the Raspberry Pi. To manage this, we ordered a new Raspberry Pi 5 immediately and continued working on the software side of the communication. If the new Raspberry Pi is delayed or additional hardware issues arise, we can temporarily use a laptop-based server for testing.

Design Changes & Justification

In addition to using D* Lite for pathfinding, we decided to test the virtual force algorithm for local navigation as a way to avoid obstacles. The algorithm uses an attractive vector pointing from the robot to the next position and a repulsive vector pointing from each obstacle to the robot; these vectors are summed to produce the direction of motion. We chose it because it provides reactive, real-time obstacle avoidance, whereas D* Lite by itself only provides the long-term goal of getting the cart to the user rather than the short-term motions.
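In symbols, with p_robot the cart's position, p_goal the next position, and p_i the i-th sensed obstacle, the direction of motion comes from F = k_att · (p_goal − p_robot) / ‖p_goal − p_robot‖ + Σᵢ k_rep · (p_robot − pᵢ) / ‖p_robot − pᵢ‖², where k_att and k_rep are tuning gains. (The inverse-square falloff on the repulsive term is one common choice and is an assumption here, not necessarily our exact weighting.)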

Audrey’s Status Report for 11/1

This week, I worked on getting the motors to move. Unfortunately, I found out that the H-Bridges for the motor control are not working. I believe it is an issue with the shipping or manufacturing of the component: each H-Bridge arrived with the same electrical components on the board, either capacitors or transistors, bent flat. I bent them back into an upright position, but I think the bending loosened the connections. When I was testing my code, the H-Bridges were making a buzzing sound and heating up significantly. Thus, I ordered new H-Bridges and am hoping this will fix the issue; if it doesn't, I will consider buying a robot car kit online to remove the risk of compatibility issues between components. Beyond this issue, I worked on wiring the robot and connecting all the components together, as well as further developing the Arduino code for the robot.

Overall, I remain behind compared to the Gantt chart. Next week, when the new H-Bridges arrive, I hope the motors will work and that I can regain progress so that the other aspects of the project can be implemented and tested on schedule.

Elly’s Status Report for 11/1

I worked on experimenting with the LiDAR sensor to see what outputs it produces. We are using the SLAMTEC RPLIDAR A1. To get started, I used the provided software development kit, which contains starter code that reads the sensor's output and prints the angle and distance of each detected obstacle relative to the sensor. We plan to use D* Lite for global navigation and the virtual force algorithm for local navigation. I focused on implementing the virtual force algorithm first, referencing a Python implementation of the algorithm. Since we have not had the chance to integrate the LiDAR with the motors or the UWB connection, I tested the algorithm by creating a Pygame simulation to see if the cart could steer clear of obstacles. The simulation shows how the cart moves and changes direction as obstacles appear in its view. In the video, the red dot represents the shopping cart, the yellow line indicates the front of the shopping cart and the direction it is moving in, and the blue points represent obstacles that the LiDAR has sensed.
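For reference, a read loop like the SDK's starter code can also be sketched in Python with the third-party rplidar package (an assumption on my part; the SDK itself is separate C++ code, and the serial port name will vary by machine):

```python
from rplidar import RPLidar  # third-party 'rplidar' package

# Typical port for the RPLIDAR A1's USB adapter on Linux; adjust as needed.
lidar = RPLidar("/dev/ttyUSB0")

try:
    # Each scan is a list of (quality, angle in degrees, distance in mm) tuples.
    for i, scan in enumerate(lidar.iter_scans()):
        for quality, angle, distance in scan:
            print(f"angle={angle:6.1f} deg  distance={distance:7.0f} mm")
        if i >= 19:  # stop after 20 scans, matching the simulation's input
            break
finally:
    lidar.stop()
    lidar.stop_motor()
    lidar.disconnect()
```

These (angle, distance) pairs are exactly what the Pygame simulation converts into the blue obstacle points.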

I am behind schedule since I haven't had the opportunity to test the LiDAR with the motors, so I am unsure about the complete implementation and integration details. I plan to help out with other aspects to catch us up to a point where we can integrate all the parts together.

Demo 

Code (can be found in app/simple_grabber)

LiDAR Datasheet 

Python Implementation 

Virtual Force Algorithm