Colin’s Status Report for 11/19/2022

This week I focused on finishing up the hardware for the project. I can now interface with all of the necessary hardware to gather and output everything we need. I developed a simulation test bench so that Zach can test his routing code with pre-configured route data. We then hope to take the device on a walk soon and see it give feedback when necessary along a full route.

Next week I plan to order the full battery that we will be using, and to add a 3.5mm audio jack to the outside of the device so that users can plug their preferred audio devices into the box for feedback. I also plan to work closely with Zach to incorporate re-routing capabilities and to improve our accuracy as much as possible.

Team status report 11/19/2022

A hard problem we are facing this week is coming up with a heuristic to determine whether a crosswalk can be avoided or not. If a route has multiple blind-unfriendly crosswalks, choosing whether to avoid one can have consequences for subsequent unfriendly crosswalks as well. Currently, we think a flat-rate distance budget may be the right option (i.e., avoiding blind-unfriendly crosswalks should not increase trip length by more than X meters in total). If we instead apply the limit per crosswalk (avoiding any single blind-unfriendly crosswalk should not increase trip length by more than X meters or X percent), the aggregate trip length could become very inflated, even if each individual crosswalk does not increase trip length by much.
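A minimal sketch of the flat-rate option, assuming a hypothetical total budget of 400 m (the actual X, and the function name, are placeholders — the real heuristic is still undecided):

```python
MAX_EXTRA_METERS = 400  # assumed budget; the real value of X is still undecided

def choose_route(direct_route_m, avoiding_route_m, budget_m=MAX_EXTRA_METERS):
    """Accept the crosswalk-avoiding route only if skipping all
    blind-unfriendly crosswalks adds at most budget_m meters in total."""
    if avoiding_route_m - direct_route_m <= budget_m:
        return "avoid"
    return "direct"

print(choose_route(1200, 1500))  # 300 m extra -> "avoid"
print(choose_route(1200, 1700))  # 500 m extra -> "direct"
```

Because the budget applies to the whole trip rather than per crosswalk, several small detours cannot silently compound into a much longer route.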

Next week we plan on doing more integration and testing as a group, as we need to make sure that the software and hardware components are working together well, and that we are covering edge cases properly. We hope to be able to run a full cached route with feedback by the end of next week.

Zachary’s status report 11/19/2022

This week I worked on finishing up the code that directs users with verbal commands, and on outputting that feedback using a text-to-speech library (pyttsx3). I also wrote code to orient users at the start of the trip. This helps users know which direction they need to rotate to begin the trip (left, slight left, right, slight right, turn around, no action). Tomorrow, I will meet with Colin to do integration and testing for our mini-demo on Monday.
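A minimal sketch of the start-of-trip orientation step, assuming headings in degrees and illustrative angle thresholds (the actual cutoffs in the real code may differ). The returned string would then be spoken via pyttsx3's `engine.say(...)` / `engine.runAndWait()`:

```python
def orientation_command(current_heading, target_heading):
    """Map the signed heading difference (degrees) to one of the six
    verbal cues. The 15/60/165 degree thresholds are assumptions."""
    # Normalize the difference into the range [-180, 180)
    diff = (target_heading - current_heading + 180) % 360 - 180
    if abs(diff) <= 15:
        return "no action"
    if abs(diff) >= 165:
        return "turn around"
    if diff > 0:
        return "right" if diff > 60 else "slight right"
    return "left" if diff < -60 else "slight left"

print(orientation_command(350, 45))  # user at heading 350, target 45 -> "slight right"
```

The modulo normalization handles wraparound at 0/360, so a user facing 350° who needs to face 45° is told to rotate 55° right rather than 305° left.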

Next week I want to start implementing two of the more advanced features: re-routing, and a distance-based heuristic for deciding whether to avoid an unfriendly crosswalk. I will also start testing by writing a suite of unit tests for my code.

Zachary’s status report 11/12/2022

This week I spent some time setting up a basic communication protocol with Colin for the interim demo. We were able to get good feedback from the instructors about what issues need to be solved, which I'll discuss at the end. I also continued working on the response request for the backend, particularly returning a string that indicates how far the user has left until the next action (i.e., a turn). If the next action is <5m away, I instruct them to prepare for it. Once their distance from the previous checkpoint is >5m and their distance to the next checkpoint is decreasing, I increment my cache index, indicating that the user is on the next portion of the trip.
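The checkpoint-advance logic described above could be sketched as follows (the function name, signature, and feedback string are illustrative assumptions, not the actual backend code, which keeps this state inside the request handler):

```python
PREPARE_THRESHOLD_M = 5  # from the report: prompt when the next action is <5 m away

def update_progress(index, dist_from_prev_m, dist_to_next_m, last_dist_to_next_m):
    """Return (new_index, feedback) for one position update.

    - Next action <5 m away: tell the user to prepare.
    - User is >5 m past the previous checkpoint and closing on the next
      one: advance the cache index to the next portion of the trip.
    """
    if dist_to_next_m < PREPARE_THRESHOLD_M:
        return index, "prepare for the next action"
    if dist_from_prev_m > PREPARE_THRESHOLD_M and dist_to_next_m < last_dist_to_next_m:
        return index + 1, None
    return index, None

print(update_progress(3, 10, 4, 6))    # close to checkpoint -> prepare prompt
print(update_progress(3, 10, 40, 45))  # past prev, closing on next -> index advances
```

Requiring the distance to the next checkpoint to be decreasing guards against advancing the index while the user is still turning in place or drifting near the previous checkpoint.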

A critical piece of feedback we got this week was: how do we know which side of the street we're on, and whether there is a crosswalk when we turn? I have been reading about the HERE API to address this problem, but have not found a feasible solution so far, as most features of the API in this area are focused on car routing rather than pedestrian routing. Another alternative I've been considering is reviewing other APIs, such as Google Maps, to see if they have functionality that can solve this issue. In the worst case, if we are unable to solve this, the brute-force fallback is to assume that at any checkpoint the user may have to cross a crosswalk. That is, we would tell the user that they may have to cross an intersection, and to use their walking cane and other senses to determine whether they actually need to.

Team Status Report 11/12/2022

This week each of our team members spent most of their time polishing up their portions of the project and now plan to spend most of the remaining time collaborating and working on the routing/feedback features of the device. Colin finished most of the system level work to be able to output audio and interface with the cellular data network while on the go. Zach worked more on the routing/feedback functionality to be able to give good user feedback.

One of the hardest problems our team is facing right now is that we only need to avoid a blind-unfriendly intersection if the user would actually have to cross a crosswalk there; otherwise the user could simply continue on the sidewalk and turn down a different street. The APIs we are using right now make it hard to incorporate that distinction into our system, so we are looking into routing libraries that could help with this problem. As a last resort, our system would still be functionally correct if it routed the user away from these intersections even when no crossing is required; it would just create a more convoluted route for the user.

Next week we plan to incorporate the feedback into the physical device and run some simulations to test its routing capability, and hopefully end up with a device with which we can walk a route and get feedback in real time.

Colin’s Status Report for 11/12/2022

This week I worked on the system-level implementation of our project as well as on improving the accuracy of the GPS unit. I experimented with some configurations on the chip and switched to one better suited to a person walking, as opposed to the default driving configuration. With this new configuration, I gathered some extra route data that Zach and I can use to simulate a walk, which should let us quickly debug and test our routing/feedback code.

I spent the majority of my time this week interfacing with the cellular data card, using the text-to-speech engine, and outputting the audio from the Pi to the user. By the end of next week I hope to be done with most of the system admin/device development so that Zach and I can collaborate more on incorporating the routing and feedback functionality into the device.

I finished most of what I hoped to do this week and I am happy with where I am on the schedule. At this point going forward I will be spending the majority of my time helping with the routing functionality and incorporating that into the pi.

Zachary’s status report 11/5/2022

This week I began writing code to handle requests from the frontend while a person is on the trip. Specifically, I use an index to refer to the location in the cache of the individual's next checkpoint, and I take the coordinates given by the frontend to calculate the Haversine distance, which converts a pair of coordinates to a distance in meters. This is not completely done: I still need to account for when I can increment the index, handle edge cases like telling the user that they need to wait at a (blind-friendly) crosswalk, and come up with textual feedback to give back to the frontend. I was also able to debug the Jupyter notebook app and display the routes I am planning for the user. I am also seeing that, given a blind-unfriendly crosswalk, the route successfully avoids that crosswalk, which is good.
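For reference, a standard implementation of the Haversine formula mentioned above (the function name and the Earth-radius constant are my own; the backend's actual helper may differ):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius; an approximation

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points in degrees."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

# One degree of longitude at the equator is roughly 111 km:
print(round(haversine_m(0, 0, 0, 1)))
```

At walking scales (meters between checkpoints) this is far more accurate than needed, and it avoids the distortion of treating raw latitude/longitude deltas as planar distances.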

Tomorrow, I will work with Colin to implement a basic communication protocol between frontend and backend, for the interim demo on Monday.

Next week, I want to finish the request handling module and work more with Colin to develop the communication protocol between frontend and backend.

Team Status Report 11/05/2022

This week our team was preparing for the interim demo that will be coming up next week. We built a physical device that contains all of the necessary hardware which will be capable of meeting the use case requirements. The device is able to generate geo-location data (latitude, longitude, and heading) along with a timestamp for each of the points. This data will be transmitted to the back-end routing thread to perform analysis on where the user currently is and where they need to go next.

For the demo next week we will show the physical device along with a visualization of a path that we walked with the device along some sidewalks. We also plan to show the device's ability to generate a route that contains no blind-unfriendly crosswalks, using the device's current coordinates as the starting location.

Colin’s Status Report for 11/5/2022

This week I made a lot of progress on building the device and gathering our first bit of test data. I decided to build the first (and possibly final) iteration of the device out of a lightweight aluminum casing material. This material provides strong structural support while also allowing us to hit our use case weight requirement. The aluminum also acts as a ground plane for our antennas, and I took particular care to make the dimensions of the box work well with the GPS antenna. The main frequencies we will be using are 1.2GHz – 1.5GHz, which corresponds to a wavelength of about 10 inches. I made the dimensions of the box 10 inches (one wavelength) by 5 inches (half a wavelength) to try to get better resonance and help with noise. I will look further into whether we can use a particular band that would resonate better with the case, because 1.2GHz – 1.5GHz is a fairly large range and it is impossible to tune to all frequencies in that range.
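As a quick check on the sizing above, the free-space wavelength follows directly from the speed of light; the "about 10 inches" figure holds at the 1.2GHz end of the band and shrinks toward 8 inches at 1.5GHz (constant and helper names below are my own):

```python
C_M_PER_S = 299_792_458      # speed of light in free space
INCHES_PER_METER = 1 / 0.0254

def wavelength_inches(freq_hz):
    """Free-space wavelength in inches for a frequency in Hz."""
    return C_M_PER_S / freq_hz * INCHES_PER_METER

print(round(wavelength_inches(1.2e9), 1))  # ~9.8 in at the low end of the band
print(round(wavelength_inches(1.5e9), 1))  # ~7.9 in at the high end
```

This spread is one reason a single box dimension cannot resonate across the whole band, which motivates looking for a narrower band to tune for.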

The case with all electronics not including the battery weighs ~18 ounces and the battery that we plan on using weighs ~14 ounces. This gives us a total weight of 32 ounces, which is under our use case requirement of ~35 ounces (1kg).

I tested with a 4000 mAh battery that I had on hand; it will not be the battery we use in the end. Over the 1-2 hours I was using it, it drained only about 1/3 of its charge, which is a good sign and means that with the 26800 mAh battery we plan to use, we should be able to hit our use case requirement of 16 hours of battery life.
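A rough extrapolation supports that claim, assuming the observed drain reflects the average load (I take ~1.5 h as the midpoint of the 1-2 hour window) and that capacity scales linearly between packs:

```python
def estimated_runtime_hours(pack_mah, drained_mah, test_hours):
    """Scale an observed drain rate to a different battery pack.
    Assumes constant average current draw and linear capacity scaling."""
    avg_draw_ma = drained_mah / test_hours
    return pack_mah / avg_draw_ma

# Observed: roughly 1/3 of a 4000 mAh pack over ~1.5 h -> ~889 mA average draw.
print(round(estimated_runtime_hours(26_800, 4000 / 3, 1.5), 1))  # ~30 h, well over the 16 h target
```

Even if the real average draw were nearly double the observed rate, the 26800 mAh pack would still clear the 16-hour requirement.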

I walked a route with the device to see if it gathers reasonably accurate data. Below is the route I walked: I started down Forbes Ave towards the soccer field, then crossed the road and walked back up Forbes Ave. I gathered a location point once every second and used gpsvisualizer.com to visualize the data. It is clear that I walked down one side of the street and back up the other, which shows that we have fairly good location accuracy. There are more settings on the ZED-F9R GPS unit that can increase accuracy which I have not yet had a chance to try, and they should get us even better results.


(The dotted-line paths on the map are not data gathered by us; they are markers on the map for other purposes.)

This week I accomplished most of my goals: gathering test data and building an iteration of the physical device so that we can start collecting more. I did not end up buying the battery that we plan to use, because I found one lying around that works well for testing; I will buy the real battery later on. Next week I would like to collaborate more with Zach so that we can start incorporating more of the routing code into the device. I would also like to do more analysis on the heading of the device and see how accurate it is. The GPS unit has different modes depending on the application, and I believe a different mode could make our results more accurate for the purposes of walking, so I would like to experiment with the GPS unit more next week.