Akintayo's Status Report for April 26th, 2025

WORK ACCOMPLISHED:

This week, I primarily worked on testing the audio recognition aspect of the project and fixing some integration issues it had with the rest of the system. I also worked with other members of the team to carry out test journeys by attaching our device to Pogoh bikes and riding them around campus. We were able to test different aspects of the project, e.g., route audio feedback and haptic feedback.

PROGRESS:

I did not do as much testing as I would have liked, so I am slightly behind. I will have to make time this week to do extra testing.

NEXT WEEK DELIVERABLES:

I will primarily continue testing the audio and navigation aspects of the device, and plot and document the test results to see how they match up with our design and use-case requirements.

TESTING:

I carried out the audio recognition and route-feedback testing described above; the detailed results are still being plotted and documented.

 

Team Status Report for April 26th, 2025

This week was mainly focused on unit testing, holistic testing, and fixing any bugs in our project. We focused on issues dealing with the GPS system. In addition, we made changes to our encasing. During our testing the device fell, which weakened the connection of our sensor to the acrylic case. We decided to apply more glue for a stronger bond and extended the wire so that there is less tension pulling on the sensor when it is connected to the Raspberry Pi.

Image: Broken Joint

Image: New Case

RISK:

One risk we noticed was that it takes a considerable amount of time for our GPS module to lock onto a satellite (> 5 minutes). So, we changed the orientation of the GPS module to see if that would result in a shorter connection time. This seems to have reduced the problem, because the module now acquires a fix in less than 5 minutes.

TESTING:

We did comparative testing to see whether the current location shown by our GPS script matched where the user actually was. To have a reference point, we used the iPhone's internal GPS as a standard measure of GPS accuracy. We used this to map out four points on each journey route we had, and measured the differences in latitude and longitude as a metric for the accuracy of the GPS module.
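For reference, the latitude/longitude differences can be converted into a distance error in meters with the haversine formula. The sketch below is a minimal example of how such a comparison might be computed; the point values shown are placeholders, not our actual measurements.

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters between two lat/lon points."""
        r = 6371000  # mean Earth radius in meters
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    # Placeholder reference (iPhone) vs. GPS-module readings for checkpoints on one route
    reference = [(40.4433, -79.9436), (40.4410, -79.9400)]
    module    = [(40.4434, -79.9438), (40.4408, -79.9403)]

    errors = [haversine_m(*ref, *meas) for ref, meas in zip(reference, module)]
    print("per-point error (m):", [round(e, 1) for e in errors])
    print("mean error (m):", round(sum(errors) / len(errors), 1))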

We ran 50+ tests this week with the Doppler radar sensor to measure its field of view and accuracy. Through testing, we noticed the orientation of the sensor played a large part in how accurate the distance detection was, so we removed the sensor from our encasing and changed the orientation. The sensor also had no false positives, as it never detected an object that wasn't there. Lastly, since the sensor's FOV is so limited, it's best for our device to be attached slightly angled towards the left-hand side of the user to get a better view of incoming objects, as people usually take a wider angle when passing bicyclists and don't come up directly behind them.

For our overall system testing, we spent a few hours this week riding a bike with Rid3 to collect data. We also simulated certain scenarios with a car and on another bike. Various kinks arose with testing off campus that we had to resolve throughout the week. Once again, one of our biggest setbacks was that the bike mount piece broke while we were testing. The device fell and our encasing broke, so we took time this week to put it back together and reprint a sturdier mount. This was an inadvertent strength test of our encasing: the device held up pretty well and there were no cracks in the encasing; we only had to attach the sides back together.

NEXT WEEK DELIVERABLES:

We will primarily be focused on wrapping up our project and preparing for the final demo. We will continue testing to ensure the product performs as expected.

 

Emmanuel’s Status Report for April 26th, 2025

WORK ACCOMPLISHED:

This week was focused on creating the final presentation, unit testing the blindspot detection system, and testing our overall system.

I ran 50+ tests this week with the Doppler radar sensor to measure its field of view and accuracy. Through testing I noticed the orientation of the sensor played a large part in how accurate the distance detection was, so I removed the sensor from our encasing and changed the orientation. The sensor also had no false positives, as it never detected an object that wasn't there. Lastly, since the sensor's FOV is so limited, it's best for our device to be attached slightly angled towards the left-hand side of the user to get a better view of incoming objects, as people usually take a wider angle when passing bicyclists and don't come up directly behind them.
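As a rough illustration of the kind of loop used during these tests, the sketch below reads speed reports from the OPS243 over its USB serial port and flags an approaching object above a speed threshold. The port name, baud rate, plain-text output format, and threshold are assumptions for illustration; the actual sensor configuration may differ.

    import serial  # pyserial

    PORT = "/dev/ttyACM0"      # assumed USB serial port for the OPS243
    BAUD = 115200              # assumed baud rate
    APPROACH_SPEED_MPS = 2.0   # illustrative threshold for an incoming object

    with serial.Serial(PORT, BAUD, timeout=1) as radar:
        while True:
            line = radar.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                speed = float(line)  # assuming plain-text speed readings in m/s
            except ValueError:
                continue  # skip configuration echoes or non-numeric output
            if speed > APPROACH_SPEED_MPS:
                print(f"Incoming object detected at {speed:.1f} m/s")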

For our overall system testing, I spent a few hours this week riding a bike with Rid3 to collect data. I also simulated certain scenarios with my car or on another bike while my teammates were riding a bike with Rid3. Various kinks arose with testing off campus that we had to resolve throughout the week. One of our biggest setbacks was that the bike mount piece broke while we were testing. The device fell and our encasing broke, so I took time this week to put it back together and reprint a sturdier mount. This was an inadvertent strength test of our encasing: the device held up pretty well and there were no cracks in the encasing; we only had to attach the sides back together.

PROGRESS:

I've completed all my tasks on our Gantt chart and made solid progress this week.

NEXT WEEK’S DELIVERABLES:

Next week I’m focusing on visualizing the data I collected along with creating the final poster, report, and demo video.

Forever’s Status Report for April 26th, 2025

WORK ACCOMPLISHED:

This week was mainly focused on unit testing, holistic testing, and fixing any bugs in our project. I was mainly focused on issues dealing with the GPS system. Through our testing we realized that it takes a considerable amount of time for our GPS module to lock onto a satellite (> 5 minutes). So I changed the orientation of the GPS module to see if that would result in a shorter connection time. This seems to have reduced the problem, because the module now acquires a fix in less than 5 minutes.

In addition, we made changes to our encasing. During our testing the device fell, which weakened the connection of our sensor to the acrylic case. We decided to apply more glue for a stronger bond and extended the wire so that there is less tension pulling on the sensor when it is connected to the Raspberry Pi. I also did comparative testing to see whether the current location shown by our GPS script matched where the user actually was. To have a reference point, I used the iPhone's internal GPS as a standard measure of GPS accuracy. I used this to map out 10 points on each journey route we had, and measured the differences in latitude and longitude as a metric for the accuracy of the GPS module.

I also assisted Emmanuel and Akintayo with their own unit testing for the week.

PROGRESS:

Currently we are where we are expected to be in the project: we are doing cleanup and continuous testing to ensure it is in MVP form for the final week.

NEXT WEEK DELIVERABLES:

An MVP-level design needs to be presented next week. We expect to have a good working demo, the poster for the presentation, and testing data.

 

 

 

Team Status Report for April 19th, 2025

This week, we finished integrating all of our hardware pieces. We have an encasing for the Raspberry Pi that holds our GPS, battery, fan, and distance sensor. We have the encasing for our wristband, through which we can feel vibrations when the sensor detects something in its field of view. We also decided to pivot away from having a speaker + microphone attached to the Raspberry Pi; instead, we are using a wireless headset that acts as both a microphone and a listening device for the instructions. We were able to integrate this headset with the Raspberry Pi using a Bluetooth connection. After we got the encasings for all the devices, we attached the connection to the GoPro mount, and the device can now be put on our bicycle for holistic testing.

In terms of testing, we did more testing of our speech-to-text and text-to-speech because we decided to use another speech model that was better suited for our Bluetooth headset. This model also seemed to be more accurate than the previous API we were using. We also tested our distance sensor to make sure it is good at detecting the motion of cars and still works when the user is in motion with the sensor.

RISKS:

There is a risk that the connection piece for the mount is not strong enough to hold our device, so we need to see how it holds up when riding quickly over different terrain. There is also the risk that the Bluetooth connection degrades with the devices moving and everything being encased, which could lead to a choppy connection at times; we will see with testing.

NEXT WEEK DELIVERABLES: 

Continuous testing of the device still needs to be done, including testing with actual cars and on different journeys. We also need to get varying user feedback on the practicality of the device. We will be working on the final presentation and getting working demo videos out.

Akintayo's Status Report for April 19th, 2025

WORK ACCOMPLISHED:

This week, I worked with other members of the group to integrate the Bluetooth audio earpiece with the Raspberry Pi. With this, instead of relying on a separate microphone and speaker, the user is able to give the destination through voice commands and hear the navigation instructions on one modular device, achieving the requirement that the system is hands-free. Teammates also provided a more efficient and accurate speech-to-text endpoint to use, and work had to be done to integrate this update into the existing navigation code. Additionally, we have begun testing the individual subsystems; I have been testing the accuracy of the speech-to-text framework for extracting the user's destination.
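The report does not pin down the exact speech-to-text endpoint, so as a stand-in, here is a minimal sketch of how a spoken destination could be captured from the headset microphone and transcribed using the speech_recognition package's free Google recognizer; the recognizer choice and microphone setup are assumptions.

    import speech_recognition as sr

    recognizer = sr.Recognizer()

    # Assumes the Bluetooth headset is the default input device; pass
    # device_index=... to sr.Microphone() if it is not.
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source, duration=0.5)
        print("Say your destination...")
        audio = recognizer.listen(source, timeout=5, phrase_time_limit=8)

    try:
        destination = recognizer.recognize_google(audio)
        print("Heard destination:", destination)
    except sr.UnknownValueError:
        print("Could not understand the destination; please try again.")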

PROGRESS:

The rest of the team and I made good progress this past week, so we are on schedule.

NEXT WEEK DELIVERABLES:

This upcoming week, I will primarily be focusing on testing the audio feedback system for the navigation instructions and verifying that the appropriate navigation instruction is produced for a user's GPS coordinates, checking these outputs against actual Google Maps routes.

NEW LEARNING:

One new skill I developed during this project is the ability to analyze API documentation, particularly when using the Google Maps Routes and Geocoding APIs. I learned how to navigate complex documentation with different API endpoints, identify the endpoints and parameters relevant to my use case, and use the APIs efficiently in my code. This skill is very important, as it is often not necessary to reinvent the wheel; instead, one can use existing technologies in an efficient manner.
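As one example of what navigating that documentation looks like in practice, the sketch below calls the Geocoding API to turn a destination string into coordinates. The request follows the public endpoint's documented parameters, but the key handling and error handling here are simplified assumptions rather than our actual navigation code.

    import os
    import requests

    GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

    def geocode(destination: str) -> tuple[float, float]:
        """Return (lat, lng) for a destination string, e.g. 'Phipps Conservatory'."""
        resp = requests.get(
            GEOCODE_URL,
            params={"address": destination, "key": os.environ["GOOGLE_MAPS_API_KEY"]},
            timeout=10,
        )
        resp.raise_for_status()
        data = resp.json()
        if data["status"] != "OK":
            raise RuntimeError(f"Geocoding failed: {data['status']}")
        location = data["results"][0]["geometry"]["location"]
        return location["lat"], location["lng"]

    print(geocode("Phipps Conservatory, Pittsburgh"))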

Forever’s Status Report for April 19th, 2025

WORK ACCOMPLISHED:

This week my primary focus was building the encasing for the device that we're attaching to the back of the bicycle. We went through with 3D printing an initial model, but it didn't fit all our building materials. We were planning on 3D printing another encasing, but we decided to move forward using acrylic as the material and heat bending to form the case. So I made the 2D specifications for the case and laser cut the acrylic. Acrylic was a good option since it's durable and lets us see into the device to ensure that everything is working properly. I also did some drilling to adjust the port positions.

Another focus of mine this week was connecting our new Bluetooth device to our Raspberry Pi. For this to work, we had to change the Google API we were using for text-to-speech and speech-to-text. I created a working script for the new API, which Akintayo was able to integrate into our navigation script. I also had to change the Bluetooth configuration profile so that the serial Bluetooth connection and the Bluetooth headset could work in unison.
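For illustration, a script along these lines could synthesize a navigation prompt and play it through whatever sink the Pi uses as its default audio output (e.g., the Bluetooth headset); the gTTS package and mpg123 player used here are stand-ins, since the report does not name the exact text-to-speech API we settled on.

    import subprocess
    from gtts import gTTS

    def speak(text: str, out_path: str = "/tmp/instruction.mp3") -> None:
        """Synthesize `text` and play it on the default audio output."""
        gTTS(text=text, lang="en").save(out_path)
        # mpg123 plays through the system default sink, which would be the
        # Bluetooth headset if it has been set as the default audio device.
        subprocess.run(["mpg123", "-q", out_path], check=True)

    speak("In 200 feet, turn left onto Forbes Avenue.")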

NEW KNOWLEDGE:

When I first started this project, I had very little knowledge of the tools that would be used. The first tool I had to work to understand was the Blues kit, which was the first device we were going to use for GPS tracking and cloud information. I had to look through forums such as Stack Exchange, the Blues webpages, and YouTube videos to figure out how to integrate the Blues Notecard with the Raspberry Pi. After we pivoted to the new GPS device, I needed to understand UART connections on the Raspberry Pi, which I figured out through YouTube videos. Then, when working with the Bluetooth devices, serial connections, and the Arduino, I looked through web pages and YouTube videos as well. However, general knowledge such as wiring and the use of breadboards was previous knowledge learned in classes like 18-100 and 18-220.
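To give a sense of what that UART work involves, the sketch below reads NMEA sentences from a GPS module on the Pi's serial port and parses out latitude and longitude. The port name, baud rate, and the use of the pynmea2 parser are assumptions for illustration rather than our exact script.

    import serial   # pyserial
    import pynmea2

    PORT = "/dev/serial0"  # assumed UART device on the Raspberry Pi
    BAUD = 9600            # common default for GPS modules; ours may differ

    with serial.Serial(PORT, BAUD, timeout=1) as gps:
        while True:
            line = gps.readline().decode("ascii", errors="ignore").strip()
            if not line.startswith("$GPGGA") and not line.startswith("$GPRMC"):
                continue  # only fix sentences carry position data
            try:
                msg = pynmea2.parse(line)
            except pynmea2.ParseError:
                continue
            if msg.latitude and msg.longitude:
                print(f"lat={msg.latitude:.6f}, lon={msg.longitude:.6f}")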

PROGRESS:

Currently I have accomplished what was expected of me at this point; however, more testing needs to be done, which is what we are currently working through.

NEXT WEEK DELIVERABLES:

Holistic testing of the device to ensure things are moving as planned, as well as adding more features if things are working smoothly. We also need to finish the final poster and work on the presentation slides.

Emmanuel’s Status Report for April 19th, 2025

WORK ACCOMPLISHED:

This week I worked on fixing the wristband case, creating the main device case, and testing the detection system.

I had to 3D print our wristband case a couple of times to fix the fit and the closing mechanism that secures the circuit but still allows us to remove it if necessary. I ordered and received Velcro straps, so the wristband is complete and can be attached to the wrist. However, I have yet to test how stable it is while riding a bike.

Additionally, I worked with Forever to make our main device encasing through acrylic bending, since 3D printing it was too expensive. I also drilled in the mount piece, so the device can now be attached to a bike.

Lastly, towards the end of the week I spent time testing the blindspot detection system against bikes and cars while the sensor was stationary. Testing revealed the sensor's field of view is even narrower than I thought, as it requires objects to be almost directly in front of it to be detected. But the time between detection and a vibration is fairly quick and meets our 1-second use-case requirement.

NEW KNOWLEDGE:

During every iteration of this project I learned something new. I learned more about interacting with different microcontrollers (RPis, Arduinos, etc.), Bluetooth communication, use cases of various sensors, operating different CAD software, 3D printing, acrylic bending, and the approach for screwing different materials together. The most prevalent learning strategy I used was YouTube videos along with trial and error. I also leaned on GitHub (for starting certain serial communications) and TechSpark workers (3D printing, drilling, and acrylic bending) for guidance.

PROGRESS:

I'm fairly on pace with my tasks right now, but I should have completed on-bike tests this week. Finishing the encasings took more time than expected.

NEXT WEEK’S DELIVERABLES:

Next week, I aim to complete rigorous testing of the blindspot detection system while it's on a moving bike. I will also spend time working on the final presentation slides and poster.

Akintayo's Status Report for April 12th, 2025

WORK ACCOMPLISHED:

In the past 2 weeks, I have been collaborating with members of the team to integrate the navigation and audio components with the system as a whole. Additionally, we were able to begin some testing of the audio input with GPS and navigation on a test route from Porter Hall to Phipps Conservatory. I have also worked on enabling the system to relay the navigation instructions to the user via a speaker.

PROGRESS:

I am slightly behind in regards to testing the subsystem and building out the audio feedback components. I will be working on catching up on this work this week.

NEXT WEEK’S DELIVERABLES:

For the upcoming week, we will be testing the GPS and navigation using sample GPS coordinates and real-time GPS coordinates while riding the bike. Additionally, there will be extensive testing of the audio aspect of the system for recognizing the user’s destination from their voice command. 

For the upcoming week, I will be primarily working on the audio feedback portion, where the user can receive the navigation instruction via audio on a speaker and potentially a Bluetooth headset. Additionally, validation and testing will be done to ensure that the navigation and audio feedback systems work as expected and provide the navigation instructions accurately and in a timely manner, in accordance with the user design requirements.

TESTING:
1. For recognizing destination from voice commands:
Test with 5 different voices (from different people) for 20 different destinations within the Pittsburgh area and check the accuracy of the speech-to-text output (a sketch of how this check could be scored follows after this list).
2. For accurate navigation instructions:
Test multiple GPS coordinates on 10 different routes and check that the generated navigation instructions are accurate by comparing them with the actual turns on a map. Additionally, I will test that the audio for the instructions works and returns the audio command with low latency.
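As a rough sketch of how the voice-command accuracy test could be scored, the loop below transcribes a set of recorded destination prompts and compares each transcript to the expected destination. The file layout, recognizer, and matching rule are illustrative assumptions.

    import speech_recognition as sr

    # Hypothetical recordings paired with the destination each speaker asked for.
    test_cases = [
        ("recordings/voice1_phipps.wav", "phipps conservatory"),
        ("recordings/voice2_carnegie_museum.wav", "carnegie museum of art"),
        # ... the full set would cover 5 voices x 20 destinations
    ]

    recognizer = sr.Recognizer()
    correct = 0

    for path, expected in test_cases:
        with sr.AudioFile(path) as source:
            audio = recognizer.record(source)
        try:
            transcript = recognizer.recognize_google(audio).lower()
        except sr.UnknownValueError:
            transcript = ""
        if expected in transcript:  # simple containment match on the destination
            correct += 1

    print(f"accuracy: {correct}/{len(test_cases)}")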

Team Status Report for April 12th, 2025

This week, we made solid progress on all subsystems of our project.

We were able to integrate all of our scripts and have them running simultaneously on the RPi4. The GPS tracking, blindspot detection, and navigation-with-voice-recognition systems are able to work together by using threading. The navigation script gets continuous updates of longitude and latitude from the GPS through a shared global variable.
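A stripped-down sketch of that shared-variable pattern is shown below: one thread updates the latest coordinates while the navigation loop reads them. The GPS read and navigation step are stubbed out, and the lock and loop timing are illustrative assumptions, not our exact script.

    import threading
    import time

    latest_fix = {"lat": None, "lon": None}   # shared between the two threads
    fix_lock = threading.Lock()

    def read_gps():
        """Stub standing in for the actual UART read of the GPS module."""
        return 40.4433, -79.9436

    def gps_worker():
        """Continuously refresh the shared coordinates."""
        while True:
            lat, lon = read_gps()
            with fix_lock:
                latest_fix["lat"], latest_fix["lon"] = lat, lon
            time.sleep(1)

    def navigation_worker():
        """Read the latest fix and decide on the next instruction."""
        while True:
            with fix_lock:
                lat, lon = latest_fix["lat"], latest_fix["lon"]
            if lat is not None:
                print(f"navigating from ({lat:.4f}, {lon:.4f})")  # stand-in for the navigation step
            time.sleep(1)

    threading.Thread(target=gps_worker, daemon=True).start()
    threading.Thread(target=navigation_worker, daemon=True).start()
    time.sleep(5)  # let the demo run briefly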

There were many pivots with the wristband system. We replaced the Arduino Micro device for the wristband with the Blues Swan, which also has Arduino pairing capabilities. The reason for this change was that the Swan includes a PMIC accessible via a JST PH connector, which allows us to power the board with a LiPo battery and also recharge the battery, if need be, through the USB port. Additionally, we changed our Bluetooth system from operating with 2 HC-05 modules to using the RPi4's built-in Bluetooth and one HC-05 module, because we were having trouble sending data with the previous setup. Lastly, one of the biggest changes we made was to switch from the ultrasonic sensors to the OPS243 Doppler Radar Sensor, because during interim demo week the ultrasonic sensors were having connectivity issues and it was apparent they would be insufficient for meeting our use cases.
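For the RPi-to-HC-05 link, one workable pattern is to bind the module to an RFCOMM device (e.g., with `sudo rfcomm bind 0 <HC-05 MAC>`) and then write to it like a serial port, as sketched below. The single-byte trigger protocol and port settings here are hypothetical, shown only to illustrate how a vibration command might be sent to the wristband.

    import serial  # pyserial

    # /dev/rfcomm0 appears after binding the HC-05's MAC address to RFCOMM channel 0.
    WRISTBAND_PORT = "/dev/rfcomm0"
    BAUD = 9600  # HC-05 default serial baud rate

    def trigger_vibration():
        """Send a one-byte command (hypothetical protocol) telling the wristband to vibrate."""
        with serial.Serial(WRISTBAND_PORT, BAUD, timeout=1) as wristband:
            wristband.write(b"V")

    trigger_vibration()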

Our audio and navigation components are integrated with our GPS tracking, and we have been actively testing their accuracy and functionality by doing test routes (e.g., Porter Hall to Phipps Conservatory). Right now we have instructions being inputted manually, but we are actively ironing out kinks with the audio input.

We 3D printed our bike mount piece and got it to match the GoPro sizing, and we are actively working to have the wristband and main device encasings printed soon.

RISK:

In regards to the GPS subsystem, our encasing could potentially block satellite signals if not positioned properly. A separate risk is the fact that the HC-05 device and the Bluetooth earbuds would need to be connected to the Raspberry Pi 4 simultaneously. Unfortunately, it may not be possible for the RPi to connect to multiple Bluetooth devices. As a result, it is important that we spend sufficient time attempting to integrate these Bluetooth devices so they work at the same time. If that is not possible, then the audio aspects of the system will have to be done via a regular microphone and speaker that do not rely on Bluetooth connectivity.

There is risk with our new sensor as well. The new OPS243 Doppler Radar Sensor's field of view is very limited compared to the ultrasonic sensor. Although it is fairly accurate in detecting incoming objects, its narrow field of view poses a threat of objects being missed in a user's blindspot. We might have to change or re-specify our guaranteed coverage zone in our final use case.

 

TESTING:

We are still early in our testing, but overall it has been going well. We tested inputting a journey manually and having directions update based on a small sample set of GPS coordinates. For the audio input portion, we aim to test 5 different voices (from different people) for 20 different destinations within the Pittsburgh area and check the accuracy of the speech-to-text output. To test the navigation accuracy, we will test multiple GPS coordinates on 10 different routes and check that the generated navigation instructions are accurate by comparing them with the actual turns on a map. Our validation is ensuring that GPS coordinates are received in real time and the navigation suggestion system outputs the correct instruction.

In order to meet use-case requirements such as users receiving audio instructions within 200 feet of a turn, the GPS system needs to accurately measure where the user is. We've done a couple of bike trips to Phipps Conservatory and captured the longitude, latitude, and distance from the turn for each of these trips to ensure the user is at the right location. Another test was putting the GPS sensor in a box and reading GPS data while outside, to ensure that when the actual encasing is finished, we are not blocking GPS signals. We still need to test more locations and routes, but so far testing has been going well and the results are aligning with our use case.

For the blindspot detection and wristband system, we have tested the basic functionalities. When stationary and indoors, non-moving objects in front of the sensor don't trigger a wristband vibration, but an incoming object at a certain speed will. We tested objects incoming at different angles relative to the sensor: it does have a more limited field of view than the ultrasonic sensors, but it's better at filtering out unnecessary objects. The wristband system is able to meet our use case in a limited setting. When the system is stationary, a vibration haptic-feedback response is generated within a second of an incoming object being detected. We still need to create an explicit plan to test the accuracy rate of the blindspot system.

 

NEXT WEEK DELIVERABLES: 

We are primarily focused on extensively testing our subsystems independently. We will also work on getting all the encasings 3D printed to properly secure our project so that we can test our integrated systems on a bike.

 

Gantt