Akintayo's Status Report for April 12th, 2025

WORK ACCOMPLISHED:

In the past two weeks, I have been collaborating with members of the team to integrate the navigation and audio components with the rest of the system. We also began testing the audio input with GPS and navigation on a test route from Porter Hall to Phipps Conservatory. In addition, I worked on enabling the system to relay navigation instructions to the user through a speaker.

PROGRESS:

I am slightly behind on testing the subsystem and building out the audio feedback components. I will work on catching up this week.

NEXT WEEK’S DELIVERABLES:

For the upcoming week, we will be testing the GPS and navigation using both sample GPS coordinates and real-time GPS coordinates while riding the bike. Additionally, we will extensively test the audio side of the system for recognizing the user's destination from a voice command.

For the upcoming week, I will primarily work on the audio feedback portion, where the user receives navigation instructions as audio through a speaker and potentially a Bluetooth headset. I will also validate and test the navigation and audio feedback systems to ensure they provide accurate, timely navigation instructions in accordance with the user design requirements.

TESTING:
1. For recognizing destination from voice commands:
Testing with 5 different voices (from different people) for 20 different destinations within the Pittsburgh area, and checking the accuracy of the speech-to-text output.
2. For accurate navigation instructions:
Test multiple GPS coordinates on 10 different routes and check that the generated navigation instructions are accurate by comparing them with the actual turns on a map. Additionally, I will test that the instruction audio works and is returned with low latency.
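The accuracy tally for the voice-command tests could be computed with a small script like this; the (expected, recognized) destination pairs below are placeholders, not our actual test data.

```python
# Hypothetical tally for the 5-voice / 20-destination test described
# above. Real (expected, recognized) pairs would come from the
# speech-to-text output; the ones below are placeholders.

def destination_accuracy(results):
    """results: list of (expected, recognized) destination strings."""
    if not results:
        return 0.0
    correct = sum(
        1 for expected, recognized in results
        if expected.strip().lower() == recognized.strip().lower()
    )
    return correct / len(results)

sample = [
    ("Phipps Conservatory", "phipps conservatory"),
    ("Carnegie Museum of Art", "carnegie museum of art"),
    ("Schenley Park", "shenley park"),  # a recognition miss
]
print(f"accuracy: {destination_accuracy(sample):.0%}")
```

Matching is case-insensitive so that formatting differences in the speech-to-text output do not count as misses.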

Team Status Report for April 12th, 2025

This week, we made solid progress on all subsystems of our project.

We were able to integrate all of our scripts and have them running simultaneously on the RPi4. The GPS tracking, blindspot detection, and navigation-with-voice-recognition scripts now work together using threading. The navigation script gets continuous longitude and latitude updates from the GPS through a shared global variable.

There were many pivots with the wristband system. We replaced the Micro Arduino device for the wristband with the Blues Swan, which also has Arduino pairing capabilities. We made this change because the Swan includes a PMIC accessible via a JST PH connector, which lets us power the board with a LiPo battery and also recharge that battery through the USB port if needed. Additionally, we changed our Bluetooth setup from two HC-05 modules to the RPi4's built-in Bluetooth plus one HC-05 module, because we were having trouble sending data with the previous setup. Lastly, one of the biggest changes we made was switching from the ultrasonic sensors to the OPS243 Doppler radar sensor: during interim demo week the ultrasonic sensors had connectivity issues, and it became apparent they would be insufficient for our use cases.

Our audio and navigation components are integrated with our GPS tracking, and we have been actively testing their accuracy and functionality on test routes (e.g., Porter Hall to Phipps Conservatory). Right now instructions are being inputted manually, but we are actively working out the kinks in the audio input.

We 3D printed our bike mount piece and got it to match the GoPro sizing, and we are actively working to have the wristband and main device enclosures printed soon.

RISK:

In regards to the GPS subsystem, our enclosure could block satellite signals if not positioned properly. A separate risk is that the HC-05 device and the Bluetooth earbuds would need to be connected to the Raspberry Pi 4 simultaneously, and it may not be possible for the RPi to connect to multiple Bluetooth devices at once. As a result, it is important that we spend sufficient time integrating these Bluetooth devices so they work at the same time. If that is not possible, the audio aspects of the system will have to be handled by a regular microphone and speaker that do not rely on Bluetooth connectivity.

There is also risk with our new OPS243 Doppler radar sensor: its field of view is very limited compared to the ultrasonic sensors. Although it is fairly accurate at detecting incoming objects, the narrow field of view poses a threat of objects being missed in a user's blindspot. We may have to change or further specify our guaranteed coverage zone in our final use case.

 

TESTING:

We are still early in our testing, but overall it has been going well. We tested inputting a journey manually and having directions update based on a small sample set of GPS coordinates. For the audio input portion, we aim to test 5 different voices (from different people) for 20 different destinations within the Pittsburgh area and check the accuracy of the speech-to-text output. To test navigation accuracy, we will test multiple GPS coordinates on 10 different routes and check that the generated navigation instructions are accurate by comparing them with the actual turns on a map. For validation, we are ensuring that GPS coordinates arrive in real time and that the navigation suggestion system outputs the correct instruction.

In order to meet use case requirements such as users receiving audio instructions within 200 feet of a turn, the GPS system needs to accurately measure where the user is. We have done a couple of bike trips to Phipps Conservatory and captured the longitude, latitude, and distance from each turn to verify the user is at the right location. Another test was putting the GPS sensor in a box and reading GPS data while outside, to ensure that the finished enclosure will not block GPS signals. We still need to test more locations and routes, but so far testing has been going well and the results are aligning with our use case.

For the blind spot detection and wristband system, we have tested the basic functionalities. When stationary and indoors, non-moving objects in front of the sensor do not trigger a wristband vibration, but an incoming object at a certain speed will. We also tested objects approaching at different angles relative to the sensor: it has a more limited field of view than the ultrasonic sensors, but it is better at filtering out unnecessary objects. The wristband system is able to meet our use case in a limited setting; when the system is stationary, a haptic vibration response is generated within a second of an incoming object being detected. We still need to create an explicit plan to test the accuracy rate of the blindspot system.

 

NEXT WEEK DELIVERABLES: 

We are primarily focused on extensively testing our subsystems independently. We will also work on getting all the enclosures 3D printed to properly secure our project so that we can test our integrated systems on a bike.

 

Gantt

Forever’s Status Report for April 12th, 2025

WORK ACCOMPLISHED:

These past two weeks I was primarily focused on integrating all of our systems together. This meant working through different options of how to have all of our programs simultaneously running on the Raspberry Pi, and how information would be shared across each of the scripts. In order to accomplish this I created a main thread script that would start three individual threads for all of our programs to run. Then I added a shared global variable which would be used for continuous updating of the current GPS longitude and latitude. This longitude and latitude would then be used by our navigation script in order to properly give users updated navigation guidance. Through this we were able to see all three of our scripts working properly together, with multiple forms of information being shared.
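A minimal sketch of that three-thread layout (thread bodies and helper functions such as read_gps_fix are illustrative placeholders, not our actual scripts):

```python
import threading
import time

# Shared GPS fix, guarded by a lock so a reader never sees a
# half-written update.
current_position = {"lat": None, "lon": None}
position_lock = threading.Lock()

def gps_thread():
    while True:
        lat, lon = read_gps_fix()          # placeholder GPS read
        with position_lock:
            current_position["lat"] = lat
            current_position["lon"] = lon
        time.sleep(1)

def navigation_thread():
    while True:
        with position_lock:
            lat, lon = current_position["lat"], current_position["lon"]
        if lat is not None:
            update_instructions(lat, lon)  # placeholder nav update
        time.sleep(1)

def blindspot_thread():
    while True:
        poll_radar_and_alert()             # placeholder radar poll
        time.sleep(0.1)

# The main script would then start and join these threads.
threads = [threading.Thread(target=fn, daemon=True)
           for fn in (gps_thread, navigation_thread, blindspot_thread)]
```

The lock is what makes the "shared global variable" approach safe here: latitude and longitude are always read and written as a pair.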

In addition to working on the integration, I helped reconfigure the wiring for the wristband piece, making it cleaner, and connected it to the RPi. Initially we had issues with the Bluetooth connectivity, as we were trying to connect two HC-05 devices to each other. However, I found a way to pair the HC-05 with the Raspberry Pi via Bluetooth and share data between the two devices. This connection can sometimes be broken when the RPi is turned off, so I created a script that runs automatically on the RPi's startup to ensure they always connect when in range. I also decided to replace the Arduino device for the wristband with the Blues Swan, which also has Arduino pairing capabilities. I made this change because the Swan includes a PMIC accessible via a JST PH connector, which lets us power the board with a LiPo battery and also recharge that battery through the USB port if needed. I then configured the script to receive triggers from our detection device for when to vibrate.
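The trigger exchange over that link can be sketched as follows. The rfcomm device path, baud rate, and the "VIBRATE" message format are all assumptions for illustration; pyserial is imported lazily so the message check can be exercised without hardware attached.

```python
# Pi side: push one trigger to the wristband over the HC-05 link.
def send_vibrate_trigger(port="/dev/rfcomm0", baud=9600):
    """Send a vibrate trigger; port and baud are assumed values."""
    import serial  # pyserial
    with serial.Serial(port, baud, timeout=2) as link:
        link.write(b"VIBRATE\n")

# Wristband side: check an incoming line (placeholder message format).
def is_vibrate_trigger(line):
    return line.strip().upper() == "VIBRATE"
```

Keeping the trigger a single short line means a dropped or partial message on the Bluetooth link simply fails the check instead of causing a spurious vibration.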

TESTING:

In order to meet use case requirements such as users receiving audio instructions within 200 feet of a turn, the GPS system needs to accurately measure where the user is. In order to test this, we’ve done a couple of bike trips to Phipps Conservatory and captured the longitude, latitude, and distance from turn for each of these trips to ensure the user is at the right location. Another test that has been done is putting the GPS sensor in a box, and reading GPS data while outside, to ensure that when the actual encasing is finished, we are not blocking GPS signals. Testing has been going well for this as well, but for the future we plan on testing with different locations to ensure that the GPS signal is well received in a variety of areas.
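The 200-foot check itself can be sketched with the haversine formula; the coordinates and threshold handling here are illustrative, not our logged trip data.

```python
import math

EARTH_RADIUS_FT = 20_902_231  # mean Earth radius in feet

def distance_ft(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in feet."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_FT * math.asin(math.sqrt(a))

def should_announce(rider, turn, threshold_ft=200):
    """rider, turn: (lat, lon) tuples; True when within the cue radius."""
    return distance_ft(*rider, *turn) <= threshold_ft
```

For a sense of scale, 0.001 degrees of latitude works out to roughly 365 feet, so the logged distance-from-turn values need resolution well below that.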

PROGRESS:

I am currently making good progress: I am testing my own subsystem and also integrating my parts with the other group members'.

NEXT WEEK’S DELIVERABLES:

For next week, I’m planning on 3D printing the encasing for the Rid3 device, ensuring that it fits all our parts. I also plan on helping connect our bluetooth headset with the overall system to ensure it’s connected properly.

 

Emmanuel’s Status Report for April 12th, 2025

WORK ACCOMPLISHED:

This week I worked on the wristband subsystem and on creating the piece that will attach our device to the GoPro bike mount.

A lot has changed in the last two weeks. Between the first and second interim demo I pivoted away from the ultrasonic sensors because they started having connection issues with our Raspberry Pi. I switched to the OPS243 Doppler radar sensor that was in the ECE inventory and was able to write a script to parse its data and trigger a vibration on the wristband circuit whenever an object is detected at a particular speed and distance from the sensor. This sensor solves the stationary-object problem because it measures speed relative to itself, so non-moving objects do not trigger alerts.
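The radar-side decision logic can be sketched like this. The line format and both thresholds are assumptions for illustration, not the sensor's documented output.

```python
# Parse one serial line from the OPS243 and decide whether to trigger
# the wristband. Report format and thresholds are assumed.

SPEED_THRESHOLD_MPS = 2.0   # assumed approach speed that matters
RANGE_THRESHOLD_M = 5.0     # assumed distance that matters

def parse_reading(line):
    """Return a numeric reading, or None for status lines the sensor
    may also emit (e.g. JSON-style info messages)."""
    try:
        return float(line.strip())
    except ValueError:
        return None

def should_vibrate(speed_mps, range_m):
    """Trigger only for an object that is both fast and close enough."""
    return (speed_mps is not None and range_m is not None
            and speed_mps >= SPEED_THRESHOLD_MPS
            and range_m <= RANGE_THRESHOLD_M)
```

Requiring both a speed and a range condition is what filters out the stationary clutter that tripped up the ultrasonic sensors.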

Additionally, I spent time 3D printing the bike mount piece that will connect the navigation device of our system to the GoPro bike mount. It took a few iterations to get the right sizing to fit the mount, and it may need improvements once we get the main device enclosure created. Lastly, I worked with Forever to change our Bluetooth setup from two HC-05 modules to the RPi4's built-in Bluetooth plus one HC-05 module, because we were having trouble sending data with the previous setup.

TESTING & VERIFICATION:

I tested the basic functionalities of the OPS243, and it works well indoors when stationary. It does have a more limited field of view than the ultrasonic sensors, but it is better at filtering out unnecessary objects once I set the right configurations. The enclosure for our device has not been created yet, so I cannot yet test on a bike.

The wristband system is able to meet our use case in a limited setting. When the system is stationary, a haptic vibration response is generated within a second of an incoming object being detected. Unfortunately, the blind spot coverage area of the OPS243 is minimal, sitting around 20 degrees laterally, but for MVP we hope this will suffice. I have yet to test the accuracy rate of the blindspot system but aim to do so next week.

PROGRESS:

I'm on pace with my tasks right now. We were able to get all our subsystems running simultaneously, but we still need to push forward and test with all parts running while on a bike.

NEXT WEEK’S DELIVERABLES:

Next week, I aim to have a wristband design 3D printed and have the vibration system fit properly in it. I will also run non-stationary tests with the OPS243 sensor.

Akintayo's Status Report for March 29th, 2025

WORK ACCOMPLISHED:

This week, I completed integrating the navigation logic on the Raspberry Pi 4 system and began work to join the GPS and navigation subsystems together. I also began testing different locations for the speech recognition part of the project.

PROGRESS:

I am slightly behind with tasks; I would have liked to begin integration of the navigation and GPS functionalities by now.

NEXT WEEK’S DELIVERABLES:

For next week, I will work on my individual subsystem in preparation for my interim demo and do more testing on it before integration.

 

Team Status Report for March 29th, 2025

This week, we made good progress on getting more continuous and accurate GPS data. We received an external GPS piece, the Adafruit breakout GPS. Having an external GPS connected to our Notecard allows us to collect GPS data while simultaneously keeping a cellular connection going. This is very useful: when we are in areas without a good Wi-Fi signal, we can depend on the Notecard's cellular data to handle API requests. It also gives us a more continuous stream of GPS data, which makes our location a bit more precise. We also got the logic working that allows our GPS to continuously run and update a file that our navigation system reads from. This connection allows for a semi-working version of our navigation system.

For the haptic feedback subsystem, code was written for the sensor script so it can send a signal to the HC-05 when an object is detected within a certain range. We have not been able to test it due to a roadblock with configuring the HC-05 Bluetooth modules. We are facing issues establishing a connection between the two HC-05 modules because we have been unable to get a response from them individually using AT commands on the Micro Arduino. The AT commands are needed to sync the modules and dictate which one is the "master". We have tried various solutions we've seen online, but will now pivot to configuring the HC-05 with the RPi4 instead, because this seems to be a common issue with the Micro Arduino.

Additionally, we completed integrating the navigation logic on the Raspberry Pi 4 system and began work to join the GPS and navigation subsystems together. We also began testing different locations for the speech recognition part of the project.

RISK:

In regards to the GPS subsystem, there is a risk of the GPS not having a clear view of the sky if we place it in some sort of box, so we might need an external antenna with a clear view of the sky to get the most accurate GPS data.

Considering the issues we’ve had with configuring the bluetooth network, there’s a major risk to the integrity of the system if objects are detected by the sensor but users aren’t warned through the vibration of the wristband. It’s important to establish this communication between systems using the HC-05 as soon as possible in order to reach the requirements for the system.

NEXT WEEK DELIVERABLES: 

We are primarily focused on completing our 3 different subsystems in preparation for the interim demos. Then, the rest of the week will be spent testing the individual subsystems before beginning integration.

Forever’s Status Report for March 29th, 2025

WORK ACCOMPLISHED:

This week I was primarily focused on getting more continuous and accurate GPS data. We received an external GPS piece, the Adafruit breakout GPS. Having an external GPS connected to our Notecard allows us to collect GPS data while simultaneously keeping a cellular connection going. This is very useful: when we are in areas without a good Wi-Fi signal, we can depend on the Notecard's cellular data to handle API requests. It also gives us a more continuous stream of GPS data, which makes our location a bit more precise. I also got the logic working that allows our GPS to continuously run and update a file that our navigation system reads from. This connection allows for a semi-working version of our navigation system.

PROGRESS:

I am almost done configuring the GPS portion of the navigation system; however, I have not done much testing of the system yet, which is where I currently should be based on our Gantt chart.

NEXT WEEK’S DELIVERABLES:

Cleaning up the GPS system, testing the system, and looking into 3D printing of our device enclosure and wristband enclosure.

Emmanuel’s Status Report for March 29th, 2025

WORK ACCOMPLISHED:

This week I continued working on the wristband subsystem.

I added code to the sensor script so it can send a signal to the HC-05 when an object is detected within a certain range, but I have not been able to test it due to a roadblock with configuring the HC-05 Bluetooth modules. I am struggling to establish a connection between the two HC-05 modules because I have been unable to get a response from them individually using AT commands on the Micro Arduino. The AT commands are needed to sync the modules and dictate which one is the "master". I have tried various solutions I have seen online, but will now pivot to configuring the HC-05 with the RPi4 instead, because this seems to be a common issue with the Micro Arduino.
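For reference, the exchange we are attempting looks roughly like this. AT+ROLE and AT+CMODE are from the HC-05 command set; the link object stands in for a serial connection with the module held in AT mode (typically 38400 baud), and the slave address is deliberately left out.

```python
# Sketch of the HC-05 AT-command exchange we are attempting.

def at_command(link, command):
    """Send one AT command and return the module's reply line."""
    link.write((command + "\r\n").encode("ascii"))
    return link.readline().decode("ascii", errors="ignore").strip()

# Typical master-side sequence; the matching slave just needs AT+ROLE=0.
MASTER_SETUP = [
    "AT",           # sanity check -- module should answer "OK"
    "AT+ROLE=1",    # 1 = master, 0 = slave
    "AT+CMODE=0",   # connect only to a fixed (bound) address
    # "AT+BIND=..." # the slave's address would go here
]
```

The "AT" sanity check is the step currently failing for us: until the module answers "OK", none of the later role or bind commands can take effect.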

PROGRESS:

I'm pretty behind on tasks right now, but if I can establish how to send data I will make a significant leap; I had wanted to have a functioning blindspot detection system ready for the interim demo by now. I will continue working on establishing a connection between the HC-05s throughout the weekend. I also noticed there's a really good radar sensor in the ECE inventory, so I will put in a request for it.

NEXT WEEK’S DELIVERABLES:

Next week, I aim to have basic functionality of the blindspot detection subsystem and to make any tweaks based on feedback from the interim demo. I will also get the new radar sensor from the ECE inventory.

Akintayo’s Status Report for March 22nd, 2025

WORK ACCOMPLISHED:

This week, I primarily worked on the navigation generation aspect of the project: the code for suggesting the next direction instruction based on the user's current location. Since we will begin integrating two distinct subsystems, one potential risk is how the subsystems talk to each other. For now, the tentative solution is that the GPS subsystem will periodically write the user's GPS location to a text file, and the navigation subsystem will read that GPS location from the file. One issue that may arise is the timing between the two processes and how outdated the data may become depending on when the navigation subsystem reads it. Additionally, the accuracy of the GPS data will affect the functionality of the navigation system.
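One way to soften that timing risk can be sketched as follows, under some assumptions (a JSON payload instead of plain text, an arbitrary 5-second staleness cutoff, an illustrative file name): the writer replaces the file atomically so the reader never sees a partial update, and each fix carries a timestamp so stale data can be discarded.

```python
import json
import os
import tempfile
import time

GPS_FILE = "gps_fix.json"   # illustrative file name
MAX_AGE_S = 5               # assumed staleness cutoff

def write_fix(lat, lon, path=GPS_FILE):
    """GPS side: atomically replace the file with a timestamped fix."""
    fix = {"lat": lat, "lon": lon, "ts": time.time()}
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(fix, f)
    os.replace(tmp, path)  # atomic rename on POSIX

def read_fix(path=GPS_FILE):
    """Navigation side: return (lat, lon), or None if missing/stale."""
    try:
        with open(path) as f:
            fix = json.load(f)
    except (FileNotFoundError, ValueError):
        return None
    if time.time() - fix["ts"] > MAX_AGE_S:
        return None  # too old to navigate on
    return fix["lat"], fix["lon"]
```

With the timestamp check, an outdated fix makes the navigation side skip an update rather than issue an instruction based on where the rider used to be.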

PROGRESS:

I am slightly behind with tasks; I would have liked to begin integration of the navigation and GPS functionalities by now.

NEXT WEEK’S DELIVERABLES:

For next week, I will be collaborating with other members of the team in integration of the navigation subsystem and the GPS subsystems in order to have a fully functional system that tracks the user’s GPS location in relation to the route for their journey.

 

Team Status Report for March 22nd, 2025

At this point in the project, we are heavily focused on working through our individual parts. We have been working on the navigation piece and trying to integrate it with the Raspberry Pi. We were able to get the GPS working and are receiving GPS data such as longitude and latitude; however, the results have not been as accurate as we wanted. We therefore decided to move forward with triangulation as our primary method for determining where the user is, which has proved more accurate. Having said this, we still need to find a way to integrate the triangulation API requests with our Raspberry Pi.

Additionally, some progress was made on the haptic feedback for the wristband system. We set up a circuit on the mini breadboards that allows the ERM motor to vibrate from a script on the Micro Arduino. Time was spent learning how to use the HC-05 Bluetooth module to send data that the Arduino can use to dictate when the motor should vibrate. We are currently working on adding code to the sensor script so it can send a signal to the HC-05 when an object is detected within a certain range.

This week, we also worked on the navigation generation aspect of the project. Essentially, we worked on the code for suggesting the next direction instruction based on the user’s current location.

RISK:

In relation to the accuracy of the current GPS system, if the triangulation alternative does not yield accurate enough data, we might have a hard time determining when a user is heading down the wrong path. As a result, we might have to consider other GPS systems.

To ensure the safety of our system, it is very important to establish the communication between the haptic feedback on the wristband and the object detection from the sensor on the bike.  Otherwise, there’s a major risk if objects are detected by the sensor but users aren’t warned through the vibration of the wristband. 

Since we will begin integrating two distinct subsystems, one potential risk is how the subsystems talk to each other. For now, the tentative solution is that the GPS subsystem will periodically write the user's GPS location to a text file, and the navigation subsystem will read that GPS location from the file. One issue that may arise is the timing between the two processes and how outdated the data may become depending on when the navigation subsystem reads it. Additionally, the accuracy of the GPS data will affect the functionality of the navigation system.

NEXT WEEK DELIVERABLES:

For next week, we will collaborate to begin integrating the navigation and GPS subsystems so that we have a fully functional system that tracks the user's GPS location in relation to their route and suggests appropriate navigation instructions.

In relation to the haptic feedback system, we wanted to be able to send data to the motor circuit from a Python script by now; we aim to have this done later today. We may also still need to find a better sensor, but we want to make sure we have basic functionality of the blindspot detection subsystem before spending more time trying to improve accuracy.