Forever’s Status Report for March 29th, 2025

WORK ACCOMPLISHED:

This week I focused primarily on getting more continuous and accurate GPS data. We received an external GPS module, the Adafruit GPS breakout. Connecting an external GPS to our Notecard lets us collect GPS data while keeping a cellular connection alive, which is useful in areas with poor Wi-Fi signal: we can depend on the Notecard's cellular data to receive any API requests. It also gives us a more continuous stream of GPS data, which makes our location estimate more precise. I also got the logic working that lets the GPS run continuously and update a file that the navigation system reads from. This connection should give us a semi-working version of our navigation system.
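A minimal sketch of that file-update logic, assuming a JSON file holding the latest fix (the file format, field names, and the write_fix helper are illustrative, not our exact code). Writing to a temporary file and renaming it keeps the navigation process from ever seeing a half-written fix:

```python
import json
import os
import tempfile
import time

def write_fix(path, lat, lon):
    """Atomically write the latest GPS fix so the navigation
    process never reads a partially written file."""
    fix = {"lat": lat, "lon": lon, "ts": time.time()}
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(fix, f)
    os.replace(tmp, path)  # atomic rename on POSIX
    return fix

# The GPS loop would call write_fix(...) each time the Notecard
# reports a new location, e.g. once per second.
```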

PROGRESS:

I am almost done configuring the GPS portion of the navigation system, but I haven't done much testing of the system yet, which is where I should currently be according to our Gantt chart.

NEXT WEEK’S DELIVERABLES:

Clean up the GPS system, test the system, and look into 3D printing of our device enclosure and wristband enclosure.

Emmanuel’s Status Report for March 29th, 2025

WORK ACCOMPLISHED:

This week I continued working on the wristband subsystem.

I added code to the sensor script so it can send a signal to the HC-05 when an object is detected within a certain range, but I haven't been able to test it due to a roadblock in configuring the HC-05 Bluetooth modules. I'm struggling to establish a connection between the two HC-05 modules because I have been unable to get a response from either of them individually using AT commands on the Arduino Micro. The AT commands are needed to sync the modules and dictate which one is the "master". I've tried various solutions I've seen online, but will now pivot to configuring the HC-05 with the RPi4 instead, since this seems to be a common issue with the Arduino Micro.
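For the RPi4 attempt, the AT exchange can be sketched with two small helpers (the pairing sequence in the comments assumes the standard HC-05 AT command set, with the module booted into AT mode at 38400 baud; the slave address shown is a placeholder):

```python
def encode_at(cmd):
    """HC-05 AT mode expects each command terminated with CRLF."""
    return (cmd + "\r\n").encode("ascii")

def parse_reply(raw):
    """Strip the CRLF terminator and surrounding whitespace
    from the module's reply bytes."""
    return raw.decode("ascii", "ignore").strip()

# With pyserial on the Pi (hold the HC-05 EN/KEY pin high at
# power-up so it boots into AT mode at 38400 baud):
#   import serial
#   ser = serial.Serial("/dev/serial0", 38400, timeout=1)
#   ser.write(encode_at("AT"))          # expect b"OK\r\n"
#   ser.write(encode_at("AT+ROLE=1"))   # make this module the master
#   ser.write(encode_at("AT+CMODE=0"))  # bind to one fixed address
#   ser.write(encode_at("AT+BIND=<slave-addr>"))  # placeholder address
```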

PROGRESS:

I'm pretty behind on tasks right now, but if I can establish how to send data I will make a significant leap. I wanted to have a functioning blindspot detection system ready for the interim demo by now. I will continue working on establishing a connection between the HC-05s throughout the weekend. I also noticed there's a really good radar sensor in the ECE inventory, so I will put in a request for it.

NEXT WEEK’S DELIVERABLES:

Next week, I aim to have basic functionality of the blindspot detection subsystem and to make any tweaks based on feedback from the interim demo. I will also get the new radar sensor from the ECE inventory.

Akintayo’s Status Report for March 22nd, 2025

WORK ACCOMPLISHED:

This week, I primarily worked on the navigation-generation aspect of the project. Essentially, I worked on the code for suggesting the next direction instruction based on the user's current location. Since we will begin integrating two distinct subsystems, one potential risk is how the subsystems communicate with each other. For now, the tentative solution is that the GPS subsystem will periodically write the user's GPS location to a text file, and the navigation subsystem will read that location from the file. One issue that may arise is the timing between the two processes and how outdated the data may become depending on when the navigation subsystem reads it. Additionally, the accuracy of the GPS data will affect the functionality of the navigation system.

PROGRESS:

I am slightly behind with tasks; I would have liked to begin integration of the navigation and GPS functionalities by now.

NEXT WEEK’S DELIVERABLES:

For next week, I will collaborate with the other members of the team on integrating the navigation and GPS subsystems so that we have a fully functional system that tracks the user's GPS location relative to the route for their journey.


Team Status Report for March 22nd, 2025

At this point in the project, we're heavily focused on working through our individual parts. We have been working on the navigation piece and trying to integrate it with the Raspberry Pi. We got the GPS working and are receiving GPS data such as longitude and latitude; however, the results have not been as accurate as we wanted. So we decided to move forward with triangulation as our primary method for determining where the user is, which has proved more accurate. That said, we still need to find a way to integrate the API for requesting triangulation data with our Raspberry Pi.

Additionally, some progress was made on the haptic feedback for the wristband system. We set up a circuit on the mini breadboards that allows the ERM motor to vibrate from a script on the Arduino Micro. Time was spent learning how to use the HC-05 Bluetooth module to send data that the Arduino can use to dictate when the motor should vibrate. We are currently adding code to the sensor script so it can send a signal to the HC-05 when an object is detected within a certain range.

This week, we also worked on the navigation generation aspect of the project. Essentially, we worked on the code for suggesting the next direction instruction based on the user’s current location.

RISK:

In relation to the accuracy of the current GPS system: if the triangulation alternative does not yield accurate enough data, we might have a hard time determining when a user is heading down the wrong path. As a result, we might have to consider other GPS modules.

To ensure the safety of our system, it is very important to establish communication between the haptic feedback on the wristband and the object detection from the sensor on the bike. Otherwise, there is a major risk that objects are detected by the sensor but users aren't warned through the wristband's vibration.

Since we will begin integrating two distinct subsystems, one potential risk is how the subsystems communicate with each other. For now, the tentative solution is that the GPS subsystem will periodically write the user's GPS location to a text file, and the navigation subsystem will read that location from the file. One issue that may arise is the timing between the two processes and how outdated the data may become depending on when the navigation subsystem reads it. Additionally, the accuracy of the GPS data will affect the functionality of the navigation system.

NEXT WEEK'S DELIVERABLES:

For next week, we will collaborate to begin integrating the navigation and GPS subsystems so that we have a fully functional system that tracks the user's GPS location relative to the route for their journey and suggests appropriate navigation instructions.

In relation to the haptic feedback system, we wanted to be able to send data to the motor circuit from a Python script by now; we aim to have this done later today. We may also still need to find a better sensor, but we want to make sure we have basic functionality of the blindspot detection subsystem before spending more time trying to improve accuracy.

Emmanuel’s Status Report for March 22nd, 2025

WORK ACCOMPLISHED:

This week I spent time working on our wristband system.

I set up a circuit using the mini breadboards that causes the ERM motor to vibrate from a script on the Arduino Micro. I spent time learning how to use the HC-05 Bluetooth module to send data that the Arduino can use to dictate when the motor should vibrate. I'm currently adding code to the sensor script so it can send a signal to the HC-05 when an object is detected within a certain range.
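On the sensor side, the signal can be as simple as a single byte per reading. This is a sketch, not our final protocol: the 150 cm threshold and the one-byte b'1'/b'0' scheme are assumptions for illustration, and the serial device path in the comment is a placeholder:

```python
ALERT_RANGE_CM = 150  # assumed threshold; tune from field tests

def alert_byte(distance_cm, threshold=ALERT_RANGE_CM):
    """Return the single byte to send over the HC-05 link:
    b'1' to vibrate when an object is inside the threshold,
    b'0' otherwise. The Arduino side only has to read one byte
    and drive the ERM motor accordingly."""
    return b"1" if 0 < distance_cm <= threshold else b"0"

# In the Pi's sensor loop, this byte would be written to the
# serial port the HC-05 is wired to, e.g. with pyserial:
#   bt = serial.Serial("/dev/serial0", 9600, timeout=1)
#   bt.write(alert_byte(measured_distance_cm))
```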

PROGRESS:

I'm currently behind with tasks; I wanted to be able to send data to the motor circuit from a Python script by now, but I aim to have this done later today. I also still need to find a better sensor, but I want to make sure I can get basic functionality of the blindspot detection subsystem before I spend more time trying to improve accuracy.


NEXT WEEK’S DELIVERABLES:

Next week, I aim to have basic functionality of the blindspot detection subsystem and look into sensors available in the ECE inventory.

Forever’s Status Report for March 22nd, 2025

WORK ACCOMPLISHED:

At this point in the project we're heavily focused on working through our individual parts. I've been working on the navigation piece and trying to integrate it with the Raspberry Pi. I got the GPS working and I'm receiving GPS data like longitude and latitude; however, the results weren't as accurate as we wanted, so we decided to move forward with triangulation as our primary method for determining where the user is, which has proved more accurate. However, I needed to find a way to integrate the API for requesting triangulation data with our Raspberry Pi. That's what I'm currently working on, along with integrating my piece with the navigation instructions. I am still considering alternative GPS modules that could make our GPS unit more precise, like the Adafruit Ultimate GPS GNSS with USB – 99 channel w/10 Hz updates.

PROGRESS:

This week I was supposed to finish thoroughly testing the GPS unit; I'm still in that process, so I'm falling behind on work.

NEXT WEEK’S DELIVERABLES:

Finish thoroughly testing the GPS unit and begin integrating it with the navigational instructions.


Emmanuel’s Status Report for March 15th, 2025

WORK ACCOMPLISHED:

This week I spent time creating a script to use and test the ultrasonic sensors along with completing the ethics report assignment.

A significant portion of my time was dedicated to creating a script for the ultrasonic sensors and researching the proper configuration settings for the RPi4 when using the Rx and Tx pins with different interfaces. The script allowed me to test the sensors' object detection capabilities by printing out the distance of any object detected. I did some basic tests indoors and outdoors, both while the sensors were stationary and while they were moving, but I still need to conduct stronger field-of-view tests to gain a better understanding of their accuracy. Some risk concerns from testing are noted in our team status report. I also focused on completing the ethics report assignment. This involved delving into ethical considerations and principles relevant to the field, which not only enhanced my understanding of the subject but also allowed me to reflect on the broader implications of our project.
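For a UART ultrasonic sensor read over the Pi's Rx/Tx pins, the core of such a script is parsing the sensor's serial frames. This sketch assumes the common 4-byte frame many serial ultrasonic modules use (0xFF header, distance high byte, distance low byte, checksum); the actual frame format should be checked against the sensor's datasheet:

```python
def parse_frame(frame):
    """Parse one 4-byte reading from a UART ultrasonic sensor,
    assuming the common frame layout: 0xFF header, distance high
    byte, distance low byte, then a checksum equal to the low
    byte of the sum of the first three bytes. Returns the
    distance in millimetres, or None for a malformed frame."""
    if len(frame) != 4 or frame[0] != 0xFF:
        return None
    if (frame[0] + frame[1] + frame[2]) & 0xFF != frame[3]:
        return None
    return (frame[1] << 8) | frame[2]

# The test loop would read 4 bytes at a time from the serial
# port (e.g. /dev/serial0 with pyserial) and print the result.
```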

PROGRESS:

I'm still slightly behind with tasks; I wanted to have the basic circuit for the wristband built by now, but I think I can make up time next week. The Bluetooth module also needs more exploration.

NEXT WEEK’S DELIVERABLES:

Next week, I aim to do more sensor testing, make a decision on whether to order a different sensor, and set up the circuit for the wristband, with a stretch goal of transmitting data to it through the Bluetooth module.

Akintayo's Status Report for March 15th, 2025

WORK ACCOMPLISHED:

This week, a lot of time was spent testing the capabilities of different models of Google Speech-to-Text AI for extracting the destination of a journey from the user's voice commands. After testing the different models, the decision was made to use the Chirp 2 model with model adaptation. The use of model adaptation is very important, as it improves the accuracy of the recognition system. When testing, I noticed that the system struggles with words that sound very similar, such as "weather" and "whether". With model adaptation, I can set a "boost value" for a phrase such as "weather" so that the system is optimized for identifying specific phrases.
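The effect of a boost value can be illustrated locally, separate from the actual Speech-to-Text API: given competing candidate transcripts, a boosted phrase tips the score toward the candidate that contains it. The candidates, scores, and boost value below are made up for the example; this mimics the idea of model adaptation, not Google's implementation:

```python
def boost_candidates(candidates, phrases, boost=10.0):
    """Pick the best transcript after adding a bonus to any
    candidate containing a boosted phrase. candidates is a list
    of (text, score) pairs; phrases is a list of lowercase
    strings to favour."""
    rescored = []
    for text, score in candidates:
        bonus = sum(boost for p in phrases if p in text.lower())
        rescored.append((text, score + bonus))
    return max(rescored, key=lambda t: t[1])[0]
```

Without the boost, "whether or not" narrowly wins on raw score; with "weather" boosted, the intended transcript is selected instead.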

Additionally, the logic for navigation suggestions was developed a bit and we have worked on some code that uses R-tree algorithms for identifying the appropriate navigation instruction based on the user’s real-time GPS location.
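The query that an R-tree accelerates can be shown with a brute-force stand-in: find the route waypoint nearest the user's current fix and return its instruction. The waypoint tuple format and the equirectangular distance approximation here are assumptions for illustration; the real code replaces the linear scan with the R-tree index:

```python
import math

def nearest_instruction(lat, lon, waypoints):
    """Return the instruction attached to the route waypoint
    nearest the user's current fix. waypoints is a list of
    (lat, lon, instruction) tuples."""
    def dist2(wp):
        # equirectangular approximation: fine at city scale
        dlat = wp[0] - lat
        dlon = (wp[1] - lon) * math.cos(math.radians(lat))
        return dlat * dlat + dlon * dlon
    return min(waypoints, key=dist2)[2]
```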

Snippet of Navigation code using R-tree algorithm:

Sample output:

PROGRESS:

I am currently on track with my work.

NEXT WEEK'S DELIVERABLES:

For next week, we will work on building out the navigation system and handling the case where the user is completely off path. We will also work on fabrication and 3D printing for the bike mount, and start looking at how to convert the navigation text to audio.

Forever’s Status Report for March 15th, 2025

WORK ACCOMPLISHED:

This week my primary focus was on ensuring the GPS information was being read in a way that we could use for our navigation system. I did testing in different environments, both inside and outside. There were a lot of issues while testing, since a stable connection is needed for the GPS information to be read properly. I noticed that the most accurate location information appeared when triangulation was used. I also spent time on the ethics report and took part in the team discussion surrounding the assignment.

Longitude and latitude information being displayed on the Notehub map.

Testing code to view location, time, and connection information.

Sample output when the device isn't moving (GPS isn't showing).


PROGRESS:

This week I accomplished the tasks I was supposed to, so I am currently on track.

NEXT WEEK’S DELIVERABLES:

Next week I hope to clean up any loose ends with the GPS information and format it in a way to be used by our navigation system. I also plan to help integrate with the rest of our systems.

Team Status Report for March 15th, 2025

At the beginning of the week we spent time discussing the ethics assignment. Ethics is an important factor in our project, and we wanted to make sure we were all on the same page. The majority of our time this week was spent developing the three separate areas of our project. We started the week working on object detection: we had to configure the RPi to use the RX and TX pins for the sensor, and a script was created to detect the distance between an object and the sensor. We tested this script in different scenarios, with the sensor moving as well as the object moving back and forth. However, we're planning to test the field of view more to see whether the sensor's accuracy is sufficient for our project. In the latter part of the week we spent time on the navigation part of the project. We tested the capabilities of different models of Google Speech-to-Text AI for extracting the destination of a journey from the user's voice commands, and ended up choosing the Chirp 2 model due to its accurate speech recognition. We also worked on GPS tracking, as it was something we wanted to have working by this week (see below: sample output and code). We were able to do some testing outside and received longitude and latitude measurements to be used for our navigation system (see below: map image capture).

In terms of risks after this week, we're considering a couple of factors. Since we're unable to change our distance sensor's range, we believe there's a possibility that our sensors are detecting objects outside the range we actually care about. This would slow down our processing time, so we're currently testing to ensure that this isn't the case. Another risk we're worried about is the accuracy of the GPS longitude and latitude information. As we tested this week, we saw longitude and latitude information being sent; however, depending on the mode (triangulation or GPS location), the displayed location wasn't as accurate as we wanted it to be. We're continuing to test whether this is only an issue in certain areas, and are considering using a different GPS module or better antennas.

Code + sample output for the Speech-to-Text system.

Longitude and latitude information being sent to Notehub and displayed on the Notehub map.

NEXT WEEK'S DELIVERABLES:

Mini-integration of all of our parts to see how they work together. We're getting to the latter end of our project, so we're attempting a mini integration to see how the parts fit together, in case we need to change things up. We are also considering ordering new parts, so we'll research different types of distance sensors as well as new GPS modules.