Team Status Report for April 12th, 2025

This week, we made solid progress on all subsystems of our project.

We integrated all of our scripts and have them running simultaneously on the RPi4. The GPS tracking, blindspot detection, and navigation instructions with the voice recognition system now work together using threading. The navigation script gets continuous updates of longitude and latitude from the GPS through a shared global variable.
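A minimal sketch of the threading handoff described above. The `get_fix` callable is a placeholder for our actual GPS read, and a lock guards the shared location so the navigation thread never reads a half-updated latitude/longitude pair:

```python
import threading
import time

# Shared location: written by the GPS thread, read by navigation.
location_lock = threading.Lock()
current_location = {"lat": None, "lon": None}

def gps_reader(get_fix):
    """Continuously poll the GPS (get_fix stands in for the real read)
    and publish the latest fix to the shared variable."""
    while True:
        lat, lon = get_fix()
        with location_lock:
            current_location["lat"] = lat
            current_location["lon"] = lon
        time.sleep(1)  # GPS update interval

def read_location():
    """Called from the navigation thread to fetch the latest fix."""
    with location_lock:
        return current_location["lat"], current_location["lon"]
```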

There were many pivots with the wristband system. We replaced the Micro Arduino device for the wristband with the Blues Swan, which also has Arduino pairing capabilities. The reason for this change was that the Swan includes a PMIC accessible via a JST PH connector, which allows us to power the board with a LiPo battery and also recharge the battery through the USB port if need be. Additionally, we changed our Bluetooth system from operating with two HC-05 modules to using the RPi4's built-in Bluetooth and one HC-05 module, because we were having trouble sending data with the previous setup. Lastly, one of the biggest changes we made was to switch from the ultrasonic sensors to the OPS243 Doppler radar sensor, because the ultrasonic sensors were having connectivity issues during interim demo week. It was also apparent that they would be insufficient in meeting our use cases.

Our audio and navigation components are integrated with our GPS tracking, and we have been actively testing their accuracy and functionality on test routes (e.g., Porter Hall to Phipps Conservatory). Right now instructions are being inputted manually, but we are actively working out the kinks in the audio input.

We 3D printed our bike mount piece and got it to match the GoPro sizing, and we are actively working to have the wristband and main device encasings printed soon.

RISK:

Regarding the GPS subsystem, our encasing could potentially block satellite signals if not positioned properly. A separate risk is that the HC-05 device and Bluetooth earbuds would need to be connected to the Raspberry Pi 4 simultaneously, and it may not be possible for the RPi to connect to multiple Bluetooth devices at once. As a result, it is important that we spend sufficient time attempting to integrate these Bluetooth devices into the system so they work at the same time. If that is not possible, then the audio aspects of the system will have to be handled by a regular microphone and speaker that do not rely on Bluetooth connectivity.

There is risk with our new sensor as well. Although the OPS243 Doppler radar sensor is fairly accurate in detecting incoming objects, its field of view is much more limited than the ultrasonic sensors', which poses a threat of objects in a user's blindspot being missed. We may have to change or narrow the guaranteed coverage zone in our final use case.
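To reason about the coverage zone, the width of the detection strip at a given distance can be estimated with a simple cone model. The field-of-view angle below is an illustrative placeholder, not the OPS243's datasheet value:

```python
import math

def coverage_width(distance_m, fov_deg):
    """Width of the detection zone at a given distance for a sensor
    with the given full field-of-view angle (simple cone model)."""
    return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

# e.g. a hypothetical 20-degree beam at 10 m covers roughly a 3.5 m strip
```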

 

TESTING:

We are still early in our testing, but overall it has been going well. We tested inputting a journey manually and having directions update based on a small sample set of GPS coordinates. For the audio input portion, we aim to test 5 different voices (from different people) across 20 different destinations within the Pittsburgh area and check the accuracy of the speech-to-text system's output. To test navigation accuracy, we'll test multiple GPS coordinates on 10 different routes and check that the generated navigation instructions are accurate by comparing them with the actual turns on a map. Our validation ensures that GPS coordinates arrive in real time and that the navigation suggestion system outputs the correct instruction.
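A sketch of how the per-route instruction accuracy could be scored; the helper and route strings are hypothetical, and the real comparison is done against the actual turns on a map:

```python
def instruction_accuracy(generated, expected):
    """Fraction of expected turns for which the generated instruction
    matches, position by position (simplified scoring for route tests)."""
    if not expected:
        return 0.0
    matches = sum(1 for g, e in zip(generated, expected) if g == e)
    return matches / len(expected)

# Hypothetical route check:
expected = ["turn left onto Forbes Ave", "turn right onto Schenley Dr"]
generated = ["turn left onto Forbes Ave", "continue straight"]
```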

In order to meet use case requirements such as users receiving audio instructions within 200 feet of a turn, the GPS system needs to accurately measure where the user is. We've done a couple of bike trips to Phipps Conservatory and captured the longitude, latitude, and distance from each turn on these trips to verify the user is at the right location. Another test we have done is putting the GPS sensor in a box and reading GPS data while outside, to ensure that the finished encasing will not block GPS signals. We still need to test more locations and routes, but so far testing has been going well and the results are aligning with our use case.
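The distance-from-turn check comes down to great-circle distance between GPS fixes; a haversine sketch with the 200-foot announce threshold from the use case:

```python
import math

EARTH_RADIUS_M = 6371000  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def within_announce_range(user, turn, threshold_ft=200):
    """True when the rider is within the announce distance of a turn."""
    meters = haversine_m(user[0], user[1], turn[0], turn[1])
    return meters <= threshold_ft * 0.3048  # feet to meters
```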

For the blindspot detection and wristband system, we have tested the basic functionality. When stationary and indoors, non-moving objects in front of the sensor don't trigger a wristband vibration, but an incoming object at a certain speed will. We also tested objects incoming at different angles relative to the sensor: the radar does have a more limited field of view than the ultrasonic sensors, but it is better at filtering out irrelevant objects. The wristband system is able to meet our use case in a limited setting: when the system is stationary, a haptic vibration response is generated within a second of an incoming object being detected. We still need to create an explicit plan to test the accuracy rate of the blindspot system.
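A simplified sketch of the vibration decision. It assumes the radar streams one signed speed value (m/s) per line with positive meaning inbound; the OPS243's actual output format, units, and sign convention depend on its configuration and should be checked against the sensor documentation:

```python
def should_vibrate(line, speed_threshold=1.5):
    """Decide whether a single radar report warrants a wristband alert.
    Assumes one speed value per line; threshold is a placeholder."""
    try:
        speed = float(line.strip())
    except ValueError:
        return False  # ignore non-numeric status lines
    return speed >= speed_threshold

# In the real script the lines would come from the serial port, e.g.:
#   for raw in serial.Serial("/dev/ttyACM0", 115200):
#       if should_vibrate(raw.decode("ascii", "ignore")):
#           send_alert_to_wristband()   # hypothetical helper
```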

 

NEXT WEEK DELIVERABLES: 

We are primarily focused on extensively testing our subsystems independently. We will also work on getting all the encasings 3D printed to properly secure our project so that we can test our integrated systems on a bike.

 

Gantt

Team Status Report for March 29th, 2025

This week, we made good progress on getting more continuous and accurate GPS data. We received an external GPS piece, the Adafruit GPS breakout. Having an external GPS connected to our Notecard allows us to collect GPS data while simultaneously keeping a cellular connection going. This is very useful: when we're in areas without a good Wi-Fi signal, we can depend on the Notecard's cellular data to handle any API requests. It also gives us a more continuous stream of GPS data, which makes our location a bit more precise. We also got the logic working that allows our GPS to continuously run and update a file that the navigation system reads from. This connection allows for a semi-working version of our navigation system.

For the haptic feedback subsystem, code was written for the sensor script so it can send a signal to the HC-05 when an object is detected within a certain range. We haven't been able to test it due to a roadblock with configuring the HC-05 Bluetooth modules. We are facing issues establishing a connection between the two HC-05 modules because we have been unable to get a response from them individually using AT commands on the Micro Arduino. The AT commands are needed to sync the modules and dictate which one is the "master". We've tried various solutions we've seen online but will now pivot to configuring the HC-05 with the RPi4 instead, because this seems to be a common issue with the Micro Arduino.
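A sketch of the RPi-side AT-command exchange we could try for the pivot. The serial device path and the 38400 baud AT-mode rate are typical for the HC-05 but are assumptions to verify against our module:

```python
def send_at(ser, command):
    """Send one AT command to an HC-05 in AT mode and return its reply.
    The module expects CRLF-terminated ASCII commands."""
    ser.write((command + "\r\n").encode("ascii"))
    return ser.readline().decode("ascii", "ignore").strip()

# Typical check sequence, run against something like
# serial.Serial("/dev/serial0", 38400, timeout=2):
#   send_at(ser, "AT")        # "OK" confirms the module responds
#   send_at(ser, "AT+ROLE?")  # query master/slave role
#   send_at(ser, "AT+UART?")  # check the data-mode baud rate
```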

Additionally, we completed integrating the navigation logic with the Raspberry Pi 4 system and began work on joining the GPS and navigation subsystems together. We also began testing different locations for the speech recognition part of the project.

RISK:

Regarding the GPS subsystem, there's a risk of the GPS not having a clear view of the sky if we place it in some sort of box, so we might need an external antenna with a clear view of the sky to get the most accurate GPS data.

Considering the issues we've had configuring the Bluetooth network, there's a major risk to the integrity of the system if objects are detected by the sensor but users aren't warned through the vibration of the wristband. It's important to establish this communication between subsystems using the HC-05 as soon as possible in order to meet the system's requirements.

NEXT WEEK DELIVERABLES: 

We are primarily focused on completing our 3 different subsystems in preparation for the interim demos. Then, the rest of the week will be spent testing the individual subsystems before beginning integration.

Team Status Report for March 22nd, 2025

At this point in the project, we're heavily focused on working through our individual parts. We have been working on the navigation piece and trying to integrate it with the Raspberry Pi. We got the GPS working and are receiving data such as longitude and latitude; however, the results have not been as accurate as we wanted. We therefore decided to move forward with triangulation as our primary method for determining where the user is, which has proved to be more accurate. Having said this, we need to find a way to integrate the triangulation API with our Raspberry Pi.

Additionally, some progress was made on the haptic feedback for the wristband system. We set up a circuit on the mini breadboards that allows the ERM motor to vibrate from a script on the Micro Arduino. Time was spent learning how to use the HC-05 Bluetooth module in order to send data that the Arduino can use to dictate when the motor should vibrate. We are currently working on adding code to the sensor script so it can send a signal to the HC-05 when an object is detected within a certain range.

This week, we also worked on the navigation generation aspect of the project. Essentially, we wrote the code for suggesting the next direction instruction based on the user's current location.
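A simplified version of the suggestion logic: pick the instruction attached to the nearest route waypoint. The route structure here is illustrative, and the real code would also need to track which turns have already been passed:

```python
import math

def next_instruction(position, route):
    """Return the instruction for the route waypoint closest to the
    rider. `route` is a list of ((lat, lon), instruction) pairs in
    travel order."""
    def flat_dist(a, b):
        # Rough planar distance in degrees; fine for ranking nearby points.
        return math.hypot(a[0] - b[0], a[1] - b[1])
    _, instruction = min(route, key=lambda wp: flat_dist(position, wp[0]))
    return instruction
```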

RISK:

Regarding the accuracy of the current GPS system, if the triangulation alternative does not yield accurate enough data, we might have a hard time determining when a user is heading down the wrong path. As a result, we might have to consider other GPS systems.

To ensure the safety of our system, it is very important to establish the communication between the haptic feedback on the wristband and the object detection from the sensor on the bike.  Otherwise, there’s a major risk if objects are detected by the sensor but users aren’t warned through the vibration of the wristband. 

Since we will begin integrating two distinct subsystems, one potential risk is how the subsystems talk to each other. For now, the tentative solution is that the GPS subsystem will periodically write the user's GPS location to a text file, and the navigation subsystem will read that location from the file. One issue that may arise is the timing between the two processes and how outdated the data may become depending on when the navigation subsystem reads it. Additionally, the accuracy of the GPS data will affect the functionality of the navigation system.
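A sketch of the file handoff with a staleness guard, so the navigation subsystem can detect outdated fixes instead of silently acting on them. The file name and the 5-second threshold are placeholders:

```python
import json
import time

GPS_FILE = "gps_location.txt"  # hypothetical path

def write_fix(lat, lon, path=GPS_FILE):
    """GPS subsystem: overwrite the file with the latest timestamped fix."""
    with open(path, "w") as f:
        json.dump({"lat": lat, "lon": lon, "ts": time.time()}, f)

def read_fix(path=GPS_FILE, max_age_s=5.0):
    """Navigation subsystem: return (lat, lon), or None if the fix is
    missing, unreadable, or older than max_age_s seconds."""
    try:
        with open(path) as f:
            fix = json.load(f)
    except (OSError, ValueError, KeyError):
        return None
    if time.time() - fix["ts"] > max_age_s:
        return None  # data too old to trust
    return fix["lat"], fix["lon"]
```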

NEXT WEEK DELIVERABLES:

For next week, we will collaborate to begin integrating the navigation and GPS subsystems in order to have a fully functional system that tracks the user's GPS location relative to the route for their journey and suggests appropriate navigation instructions.

For the haptic feedback system, we wanted to be able to send data to the motor circuit from a Python script by now; we aim to have this done later today. We also may still need to find a better sensor, but we want to make sure we have basic functionality of the blindspot detection subsystem before spending more time trying to improve accuracy.

Team Status Report for March 15th, 2025

At the beginning of the week we spent time discussing the ethics assignment. Ethics is an important factor in our project, and we wanted to make sure we were all on the same page. The majority of our time this week was spent developing the three separate areas of our project. We started the week working on object detection. We had to configure the RPi to use the RX and TX pins for the sensor, and a script was created to detect the distance between an object and the sensor. We tested this script in different scenarios, with the sensor moving as well as the object moving back and forth. However, we're planning to test the field of view more to see if the sensor's accuracy is enough for our project. In the latter part of the week we worked on the navigation part of the project. We spent time testing the capabilities of different Google Speech-to-Text AI models for extracting a journey's destination from the user's voice commands. We ended up choosing the Chirp 2 model due to its accurate speech recognition. We also worked on GPS tracking, as it was something we wanted to have working by this week (see below for sample output and code). We did some testing outside and received longitude and latitude measurements to be used for our navigation system (see below for a map image capture).
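Once the speech-to-text model returns a transcript, the destination still has to be parsed out of it. A simplified sketch with hypothetical trigger phrases (the actual phrases we support may differ):

```python
import re

# Hypothetical trigger phrases we listen for in the transcript.
TRIGGERS = r"(?:navigate to|take me to|directions to)\s+(.+)"

def extract_destination(transcript):
    """Pull the destination out of a speech-to-text transcript,
    or return None if no trigger phrase is found."""
    match = re.search(TRIGGERS, transcript.strip(), re.IGNORECASE)
    return match.group(1).rstrip(".!?") if match else None
```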

In terms of risks after this week, we're considering a couple of factors. Since we're unable to change our distance sensor's range, we believe there's a possibility of our sensors detecting objects outside of the range we care about. This would slow down our processing time, so we're currently testing to ensure this isn't the case. Another risk we're worried about is the accuracy of the longitude and latitude from our GPS. As we were testing this week, we saw longitude and latitude information being sent, but depending on the mode (triangulation or GPS location), the displayed location wasn't as accurate as we wanted. We're doing continuous testing to see if this is only an issue in certain areas, and we are considering a different GPS system or better antennas.

Code + Sample output for Speech to Text system

Longitude and Latitude information being sent to notehub and displayed on notehub Map.

NEXT WEEK DELIVERABLES:

Mini-integration of all of our parts to see how they work together. We're getting to the latter end of our project, so we're attempting a mini integration to see how the parts fit together, in case we need to change things up. We are also considering ordering new parts, so we are researching different types of distance sensors as well as new GPS modules.

Team Status Report for March 8th, 2025

Our current progress on the Rid3 device is going well. The main tasks for this week were to complete setup for the Raspberry Pi, finish setup for the Blues Starter Kit, and get basic object detection with the sensors. We accomplished most of these goals since our last progress report. We have the Raspberry Pi set up and have made it compatible with the Blues Starter Kit, using SSH for programming on the Pi. In addition, we worked on the configuration for converting speech to text using the Google Speech API and saw sample outputs. We also ordered materials for a new bicycle mount and are designing a component that is compatible with the mount (see below).

One of the risks we are currently facing is that GPS information is not being sent properly to the Raspberry Pi. This could be a problem, as we need accurate GPS data for the proper directions to be sent.

NEXT WEEK DELIVERABLES:

Continue testing speech-to-text translation and begin implementing the R-Tree algorithm. Set up the circuit for the wristband and establish basic object detection for the sensors. Fix the GPS issues and store GPS data to be used by the RPi.


component compatible with mount.

sample output for Google speech API

ADDITIONAL QUESTIONS:

Part A: … with consideration of global factors. Global factors are world-wide contexts and factors, rather than only local ones. They do not necessarily represent geographic concerns. Global factors do not need to concern every single person in the entire world. Rather, these factors affect people outside of Pittsburgh, or those who are not in an academic environment, or those who are not technologically savvy, etc.

There is a global need for increased safety and accessibility in urban mobility. Our device Rid3 can help meet this need for bicyclists' safety by enabling phone-less navigation in increasingly crowded urban settings. Bike lanes and road support for micro-mobility are being expanded in many major cities worldwide; however, riders are often still at risk due to poor visibility, car blind spots, and distractions from checking navigation devices. By fusing voice navigation with a haptic feedback wristband, this technology lowers the risk of accidents and increases traffic safety, enabling cyclists to receive crucial blindspot alerts and clear directions without taking their eyes off the road. This solution is particularly impactful in regions where cycling infrastructure is still developing or where road conditions are less predictable. Additionally, our device is intended to work in various climates beyond what typically occurs in Pittsburgh, such as high-dust environments. The device is also simple and intuitive to use, to promote adoption by all people regardless of technological expertise. By enhancing safety in diverse environments, the product contributes to broader global efforts to promote sustainable transportation, reduce urban congestion, and improve public health. A was written by Emmanuel.

Part B: … with consideration of cultural factors. Cultural factors encompass the set of beliefs, moral values, traditions, language, and laws (or rules of behavior) held in common by a nation, a community, or other defined group of people.

Within the context of our project, the main cultural factor to consider is that different countries have different road laws, so it is important that Rid3 devices adhere to these norms. Specifically, it is important that our device functions in a way that is intuitive to a biker on the road. Consequently, it is essential that the audio feedback for navigation instructions adheres to road safety rules, and research will have to be done to ensure that the system's feedback follows the rules of each region. B was written by Akintayo.

Part C : … with consideration of environmental factors. Environmental factors are concerned with the environment as it relates to living organisms and natural resources.

Environmental factors play an important role in our project. We want to make sure that we're using resources that do not pollute the environment and are safe for it. The device is battery powered and does not release any toxins into the air when running, so the environmental concerns are limited. One thing to take note of is the potential for encountering animals in the environment: if we detect an animal in the user's blindspot, we want to notify the user so that they don't hit it. C was written by Forever.

Team Status Report for February 22nd, 2025

This week we started receiving materials for our project and began testing their functionality. We received our sensors, Blues Starter Kit, microphone, batteries, and our Raspberry Pi HAT. Our current risk is stable mounting of our device to bikes: our previous design relied on a velcro strap, which we realized after using bikes this week would be unstable. Most bicycles use a mounting clamp to add accessories, so we'll do the same. We're redesigning our encasing to be compatible with the NiteRider Tail Light Strap Mount.

Since last week we decided to eliminate the web server aspect and have all of the server processing happen on the Raspberry Pi, for two main reasons: sending audio back and forth through the Blues cloud to an internet-hosted server is very inefficient due to the round-trip latency, and not using the Blues cloud is cheaper since we won't be charged for each block of data sent to the Raspberry Pi.

Sample Design of Encasing for Navigation/Audio System

 

Sample Design for Wristband  

New framework for navigation portion

NEXT WEEK DELIVERABLES:

Main tasks for upcoming week: complete setup for raspberry pi, finish setup for blues starter kit, get basic object detection with sensors.

Team Status Report for February 15th, 2025

This week we fleshed out the overall design for our project and made final decisions regarding the materials we needed, which led to us submitting our requests for hardware and software. The most significant risk at the current stage of our project is processing power and latency. Our design currently has a lot of moving pieces connected to the Raspberry Pi, which means it needs to supply sufficient power to each of these pieces and respond quickly enough for information sharing. This is why we decided to use a Raspberry Pi 4 over our previously chosen Raspberry Pi 3A+: it is far more capable but still does not require as much processing power and energy as the Raspberry Pi 5. We are also looking into using resistors and transistors to manage power, so that we don't have too much current flowing where we don't need it. Another design change was switching from ToF sensors to ultrasonic sensors because of field of view: we wanted to make sure we capture a greater breadth of information. We also decided to use a USB microphone to simplify the pin-out process and make it easier to connect other pieces. These changes didn't create a noticeable cost difference, as the pieces were around the same price. See the updated block diagram and schedule below.

Additional Questions:

Part A: … with respect to considerations of public health, safety or welfare. Note: The term ‘health’ refers to a state of well-being of people in both a physiological and psychological sense. ‘Safety’ is the absence of hazards and/or physical harm to persons. The term ‘welfare’ relates to the provision of the basic needs of people.

By giving bicyclists a hands-free, real-time navigation and blind spot detection system, our solution improves public health, safety, and welfare. Conventional navigation techniques, such as looking up instructions on a phone, can be distracting and raise the risk of accidents. By wirelessly delivering obstacle alerts and directional indications to a vibration-based wristband, our device removes this risk and enables bikers to focus on the road. This improves rider awareness and considerably lowers cognitive overload, which benefits both physical and mental safety. The incorporation of ultrasonic sensors guarantees prompt identification of cars or other objects in blind spots, averting possible collisions and promoting safer interactions between motor vehicles and bicycles.

Furthermore, by promoting safer and more effective cycling, which is an eco-friendly and healthy form of transportation, our solution advances public welfare. Our device adds to urban mobility solutions that benefit communities and individuals by increasing bike accessibility through a cheaper navigation tool and lowering the chance of accidents. Accurate route guidance is ensured by the Blues Starter Kit's GPS tracking and server-based navigation system, which keeps riders from getting lost and lowers stress, both of which can improve general wellbeing. In the end, our product promotes a more secure and safe riding experience, enhancing public safety and encouraging a more sustainable and healthy transportation culture. A was written by Emmanuel.

Part B: … with consideration of social factors. Social factors relate to extended social groups having distinctive cultural, social, political, and/or economic organizations. They have importance to how people relate to each other and organize around social interests.

One important social factor to consider for the project is the speech component. Specifically, the system will receive the user's journey destination via voice, and different people have different speech intonations; in the same vein, people speak different languages. For the scope of the project, we are focusing on ensuring the speech recognition and audio response are functional primarily for English speakers. To handle different speech tones, we are using the Google Speech-to-Text AI system, as it is optimized to identify a variety of speech types. B was written by Akintayo.

Part C: ... with consideration of economic factors. Economic factors are those relating to the system of production, distribution, and consumption of goods and services. 

One of the general themes our group wanted for the project is a low-cost, low-profit model. Essentially, we aim to prioritize consumer satisfaction and accessibility over company profits. Therefore, important factors we must consider to achieve this goal are the cost of production and the cost of labor. At the moment, the cost of our product seems somewhat high relative to alternative options out there, for example Google Maps navigation paired with an Apple Watch. However, most of those products do not come with a haptic feedback feature that provides additional safety for bike users. So as we build, we're looking for cost-efficient parts that are lightweight and user friendly but also not unreasonably priced given the safety the product provides. Our highest priority is safety, and we will ensure those needs are met before all else. C was written by Forever.

 

 

Team Status Report for February 8th, 2025

One of the main concerns for the project is ensuring the functionality of the sensors is fully in line with the intended use case. Specifically, we still need to decide whether to use an ultrasonic, laser, or LIDAR sensor for identifying objects in the blindside (rear) of the user's vehicle. The selected sensor then determines where the system needs to be located on the vehicle. In the same vein, it is important that we research and test all possible options in order to make an informed decision.

Following this, another consideration for the project is taking the data from the object detection sensor and sending it to the vibration system on the wristband in a timely manner.

 

NEXT WEEK DELIVERABLES:

One main task for the upcoming week is making the purchase requests for the different hardware and software components for our system.