Colin’s Status Report for 12/10/2022

This week I spent some time preparing for the final presentation and gathering the hardware info needed to determine whether we met our use-case requirements. I also worked on the software needed for our final demo. We plan to show the finite state machine that Zach has been working on by walking a route outside of Hamerschlag and providing feedback to the user along the way. This involved gathering the location data of the turns that we will be making in the demo and writing the corresponding feedback. I created a pseudo-route for the demo that imitates the routes returned by the HERE API so that Zach’s interpreter code can provide feedback as it would on a normal route. We are also trying to figure out how to add a re-routing aspect to the demo using another pre-configured route.
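
A minimal sketch of what the hard-coded pseudo-route might look like, loosely modeled on the HERE Routing API v8 response shape. The field names and coordinates here are illustrative placeholders, and turn coordinates are attached directly to each action instead of using polyline offsets the way the real API does; the exact shape should match whatever Zach’s interpreter actually consumes.

```python
# Hypothetical pseudo-route for the demo; values are placeholders.
DEMO_ROUTE = {
    "routes": [{
        "sections": [{
            "actions": [
                # Each action pairs an instruction with the hand-gathered
                # coordinates of the corresponding turn outside Hamerschlag.
                {"action": "depart", "instruction": "Head north on Frew St",
                 "position": {"lat": 40.4424, "lng": -79.9453}},
                {"action": "turn", "direction": "left",
                 "instruction": "Turn left onto Hamerschlag Dr",
                 "position": {"lat": 40.4430, "lng": -79.9447}},
                {"action": "arrive", "instruction": "You have arrived",
                 "position": {"lat": 40.4436, "lng": -79.9440}},
            ]
        }]
    }]
}
```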

Next week, after the demo, Zach and I will be working on our final report and video. We hope to develop some snap-to-sidewalk code before the final report so that the project has a full re-routing capability that coordinates with the HERE API.
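
Snap-to-sidewalk will likely boil down to projecting each GPS fix onto the nearest sidewalk segment. A minimal sketch of that projection, assuming sidewalk geometry is available as polylines of (lat, lon) points and treating coordinates as planar over sidewalk-scale distances:

```python
def snap_to_segment(p, a, b):
    """Project point p onto segment a-b; points are (lat, lon) tuples.
    Treats coordinates as planar, which is fine over sidewalk-scale distances."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:  # degenerate (zero-length) segment
        return a
    # Projection parameter, clamped so the result stays on the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return (ax + t * dx, ay + t * dy)

def snap_to_sidewalk(fix, sidewalks):
    """Return the closest point to `fix` across all sidewalk polylines."""
    candidates = [snap_to_segment(fix, a, b)
                  for line in sidewalks for a, b in zip(line, line[1:])]
    return min(candidates, key=lambda q: (q[0] - fix[0]) ** 2 + (q[1] - fix[1]) ** 2)
```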

Colin’s Status Report for 12/3/2022

This week I spent most of my time fine-tuning the location accuracy of the device by changing configuration settings and experimenting to see which ones provide the best accuracy. We chose a GPS device with an integrated inertial measurement unit in an attempt to get better location accuracy. After a lot more research and testing, I came to realize that the device we are using is not meant for pedestrian use; it is really meant for applications where wheels are involved and the speed of the wheels can be measured separately and fed into the device for dead-reckoning purposes. The IMU needs to be calibrated, and the calibration tests differ depending on the device’s mode of operation. Most of the modes require higher speeds for accurate results, and the modes that do not require higher speeds require wheel-tick sensors, which we do not have because of our pedestrian use case. As a result, the IMU provides highly inaccurate data after a couple of minutes of walking, so I found that just using the location data from the satellites provides fairly good accuracy, as shown in the picture below. For some configurations, like the one I was using, the device applies a low-pass filter to the location data to improve accuracy. It can be observed that I was on the sidewalk the entire time.
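
For intuition, that kind of low-pass filtering can be approximated in software as an exponential moving average over successive fixes. This is only a sketch of the idea, not what the receiver actually implements internally:

```python
class LocationFilter:
    """Exponential moving average over (lat, lon) fixes.
    alpha near 1 trusts new fixes more; near 0 smooths harder."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None
    def update(self, lat, lon):
        if self.state is None:
            self.state = (lat, lon)      # first fix seeds the filter
        else:
            a = self.alpha
            self.state = (a * lat + (1 - a) * self.state[0],
                          a * lon + (1 - a) * self.state[1])
        return self.state
```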

The problems start when the connection to the satellites gets worse. The system works well out in the open, providing enough accuracy to decide which side of the street we are on; the average urban street is ~3 meters (10 feet) wide, and with a measured accuracy of about 1 meter in the open, our device is accurate enough to make that call. When the number of satellites in view drops, we encounter a lot of noise, too much to determine which side of the street we are on. In the photo below, I was walking on the south side of Winthrop St. up against a few tall buildings, which tricked the device into thinking that I was walking up the middle of the street, when in reality I was still walking up the same sidewalk on the south side of the street. This would be a great situation for dead-reckoning with the IMU; however, due to our low speeds and our device not being mounted rigidly enough, the IMU data hurts us more than it helps us.

The location problems may be fixable by adding some hysteresis to the code that determines which side of the street we are on. However, in a dense urban environment we need to figure out how to incorporate the IMU data better, because we are not currently accurate enough for our use-case requirement of 1-meter accuracy.
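
A sketch of what that hysteresis could look like, assuming we can compute a signed cross-street offset from the street centerline for each fix; the 1.5 m threshold is a placeholder to be tuned:

```python
class StreetSideDecider:
    """Decide which side of the street the user is on, with hysteresis.
    `offset_m` is the signed cross-street distance of the fix from the
    centerline (positive = one side, negative = the other). We only flip
    sides once a fix lands well past the centerline, so noisy fixes near
    the middle do not cause flip-flopping."""
    def __init__(self, threshold_m=1.5):
        self.threshold_m = threshold_m
        self.side = None  # +1 or -1 once decided
    def update(self, offset_m):
        if self.side is None:
            self.side = 1 if offset_m >= 0 else -1
        elif self.side == 1 and offset_m < -self.threshold_m:
            self.side = -1
        elif self.side == -1 and offset_m > self.threshold_m:
            self.side = 1
        return self.side
```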

I also made some hard-coded routes for us to test with and potentially use in our final demo if we cannot figure out a way to route to sidewalks by the time the project is due.

Next week I will take one final shot at getting better accuracy out of the IMU, and if I cannot do any better, we will settle for what we have right now. Zach and I are also going to wrap the project up and come up with a good demo plan for the final presentation, possibly using a hard-coded route. I am also going to finalize our use-case measurements on the hardware side and see which requirements we met and which ones we did not.

Colin’s Status Report for 11/19/2022

This week I focused on finishing up the hardware for the project. I can now interface with all of the necessary hardware to gather and output everything that we need. I developed a simulation test bench so that Zach will be able to test his routing code with some pre-configured route data. We hope to take the device on a walk soon and see it give feedback when necessary over a full route.
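
The test bench is essentially a replay harness. A minimal sketch of the idea, assuming recorded fixes are stored one per line as timestamp,lat,lon; the file format and the on_fix callback are placeholders for whatever the routing code actually expects:

```python
import csv, time

def replay_route(path, on_fix, speedup=1.0):
    """Feed recorded GPS fixes to `on_fix(lat, lon)` with the original
    pacing (divided by `speedup`), so the routing code sees the same
    stream it would get on a real walk."""
    with open(path) as f:
        rows = [(float(t), float(lat), float(lon))
                for t, lat, lon in csv.reader(f)]
    prev_t = rows[0][0]
    for t, lat, lon in rows:
        time.sleep(max(0.0, (t - prev_t) / speedup))
        prev_t = t
        on_fix(lat, lon)
```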

Next week I plan to order the full battery that we plan on using, as well as add a 3.5mm audio jack to the outside of the device so that users can plug their preferred audio devices into the box for feedback purposes. I also plan to work closely with Zach to incorporate re-routing capabilities and to improve our accuracy as much as possible.

Colin’s Status Report for 11/12/2022

This week I worked on the system-level implementation of our project as well as improving the accuracy of the GPS unit. I played around with some configurations on the chip and switched to a configuration better suited to a person walking, as opposed to the default driving configuration. With this new configuration, I gathered some extra route data that Zach and I can use to simulate a walk, which should allow us to quickly debug and test our routing/feedback code.
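
For reference, switching the receiver’s dynamic platform model can be done with a UBX CFG-VALSET message. A hedged sketch using the pyubx2 library over a serial link; we actually talk to the pHAT over I2C, and the port name and baud rate here are assumptions:

```python
# Sketch: switch the ZED-F9R from the default dynamic platform model to
# "pedestrian". Transport shown is UART; our pHAT uses I2C in practice.
from serial import Serial
from pyubx2 import UBXMessage, SET_LAYER_RAM, TXN_NONE

with Serial("/dev/serial0", 38400, timeout=1) as port:  # port/baud are assumptions
    msg = UBXMessage.config_set(
        SET_LAYER_RAM,                 # apply to RAM only (reverts on power cycle)
        TXN_NONE,
        [("CFG_NAVSPG_DYNMODEL", 3)],  # 3 = pedestrian in the u-blox config database
    )
    port.write(msg.serialize())
```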

I spent the majority of my time this week interfacing with the cellular data card, using the text-to-speech engine, and outputting the audio from the Pi to the user. By the end of next week I hope to be done with most of the system admin/device development so that Zach and I can collaborate more on incorporating the routing and feedback functionality into the device.
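
The audio path can be as simple as shelling out to a TTS engine. A sketch assuming eSpeak is installed; the engine we finally settle on may differ:

```python
import subprocess

def speak(text):
    """Blockingly render `text` through eSpeak to the Pi's default audio
    output (the 3.5mm jack, once it is selected as the audio sink)."""
    subprocess.run(["espeak", text], check=True)

speak("In 50 feet, turn left onto Forbes Avenue.")
```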

I finished most of what I hoped to do this week, and I am happy with where I am on the schedule. Going forward, I will be spending the majority of my time helping with the routing functionality and incorporating it into the Pi.

Colin’s Status Report for 11/5/2022

This week I made a lot of progress on building the device and gathering our first bit of test data. I decided to build the first (and possibly final) iteration of the device out of a lightweight aluminum casing material. This material provides strong structural support while also allowing us to hit our use-case weight requirement. The aluminum also acts as a ground plane for our antennas, and I took particular care to make the dimensions of the box work well with the GPS antenna. The main frequencies that we will be using are 1.2GHz – 1.5GHz, which corresponds to a wavelength of about 10 inches. I made the dimensions of the box 10 inches (wavelength) by 5 inches (wavelength / 2) to attempt to get better resonance and help with noise. I will look further into whether we can use a particular band that resonates better with the case, because 1.2GHz – 1.5GHz is a fairly large range and it is impossible to tune to all frequencies in that range.
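
For reference, the wavelength arithmetic behind the 10-inch figure:

```latex
\lambda = \frac{c}{f} = \frac{3.0\times10^{8}\,\text{m/s}}{1.2\times10^{9}\,\text{Hz}} = 0.25\,\text{m} \approx 9.8\,\text{in},
\qquad
\frac{c}{1.5\times10^{9}\,\text{Hz}} = 0.20\,\text{m} \approx 7.9\,\text{in}
```

So across the band the wavelength actually runs from about 9.8 inches down to 7.9 inches, which is why the case can only be tuned to part of the range.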

The case with all electronics not including the battery weighs ~18 ounces and the battery that we plan on using weighs ~14 ounces. This gives us a total weight of 32 ounces, which is under our use case requirement of ~35 ounces (1kg).

I used a 4000 mAh battery that I had on hand to test with; however, it will not be the battery that we use in the end. Over the 1-2 hours that I was using it, the device only drained about 1/3 of its charge, which is a good sign and means that with the 26800 mAh battery we plan to use, we should be able to hit our use-case requirement of 16 hours of battery life.
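
A rough scaling argument behind that estimate, assuming the discharge rate stays roughly constant:

```latex
\underbrace{3 \times (1\text{–}2\,\text{h})}_{\text{full drain of the 4000 mAh pack}} \approx 3\text{–}6\,\text{h},
\qquad
\frac{26800\,\text{mAh}}{4000\,\text{mAh}} \times (3\text{–}6\,\text{h}) \approx 20\text{–}40\,\text{h}
```

That is comfortably above the 16-hour requirement even at the pessimistic end.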

I walked a route with the device to see if it gathers somewhat accurate data. Below is the route I walked, I started going down Forbes Ave towards the soccer field and then crossed the road and walked back up Forbes Ave. I gathered a location point once every second and used gpsvisualizer.com to visualize the data. It is obvious that I was walking down one side of the street and back up the other side of the street which shows that we have fairly good location accuracy. There are more settings and ways to increase the accuracy on the ZED-F9R GPS unit that I have not had a chance to change but they should get us even better results.
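
Since gpsvisualizer.com ingests plain CSV with latitude/longitude columns, the logging loop can stay very simple. A sketch, where read_fix() is a placeholder for however the fix is actually pulled off the receiver:

```python
import csv, time

def log_walk(read_fix, path="walk.csv", hz=1.0):
    """Append one fix per second to a CSV that gpsvisualizer.com can ingest.
    `read_fix()` is a placeholder returning (lat, lon)."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["latitude", "longitude"])
        while True:  # stop with Ctrl-C at the end of the walk
            lat, lon = read_fix()
            w.writerow([lat, lon])
            f.flush()
            time.sleep(1.0 / hz)
```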


(The dotted-line paths on the map are not data gathered by us; they are markers on the map for other purposes.)

This week I accomplished most of my goals of gathering test data and building an iteration of the physical device so we can start gathering data. I did not end up buying the battery that we plan on using because I found one lying around that works well for testing; I will buy the real battery later on. Next week I would like to collaborate more with Zach so that we can start incorporating more of the routing code into the device. I would also like to do more analysis on the heading of the device and see how accurate it is. The GPS device has different modes depending on the application, and I believe that a different mode could make our results more accurate for the purposes of walking, so I would like to experiment with the GPS unit more next week.

Colin’s Status Report for 10/29/2022

This week all of the parts we ordered came in, and I was able to connect all of them to the RPi. I mainly worked on interfacing with the GPS/IMU. Zach and I think it would be a good idea to have something running by the end of this week that lets us gather location data while walking around. This way we can walk some routes and save the data, so we can simulate walking the route later for development purposes without having to physically walk the route whenever we want to test.

This brings up the need for a mobile system that includes a power supply and a case to protect all of the components. I should be able to throw together some sort of case quickly; however, I need to get the battery ordered. I did some research on batteries for the design report and found this 26800 mAh battery: https://www.amazon.com/Charmast-26800mAh-Portable-Li-Polymer-Compatible/dp/B07P5ZP943. The battery outputs 5V, which is the voltage the RPi runs at, and since our design constraint for battery life is 16 hours, this gives us 8.375 Watts maximum. The RPi uses about 5 Watts at 100% CPU load, and the peripherals will use less than 2 Watts combined, giving us about 7 Watts of power being used. This estimate is on the high end, and we may be able to go much lower with careful attention to minimizing power consumption. This battery gives us an estimated 1.375 Watts of slack, which is a comfortable amount, especially because I do not want to rely on the battery performing properly at less than 5% charge.
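
The 8.375 W figure falls out of the pack’s total energy divided by the 16-hour target:

```latex
\frac{26.8\,\text{Ah} \times 5\,\text{V}}{16\,\text{h}} = \frac{134\,\text{Wh}}{16\,\text{h}} = 8.375\,\text{W}
```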

My goals and schedule have somewhat changed from our initial thoughts: with the need to get some real data quickly, I am now focusing more on the construction of the device and the battery. I aim to receive a battery next week and perform some route walks while gathering the location data along the way, so that we can do testing and development. I will be somewhat ahead of schedule on the battery and device development; however, I will fall slightly behind on the wireless communication aspect of the device in order to focus on location data collection.

Colin’s Status Report for 10/22/2022

This week I accomplished everything that I wanted to based on my prior status report. I wanted to get all of our hardware ordered (except for the battery) so we could begin to experiment with the equipment, as well as work out a framework for the software to run on the Raspberry Pi. Zach and I also spent a lot of time on the design review report.

Most of my time this week was spent developing the software to run on the RPi. My main goal was to come up with a system where all three of our main threads could run while communicating with each other. I decided to add a fourth overall thread to the system: a controlling thread. This thread tells the other three threads when to run and handles the data communication. Since we will be using a single process and asyncio in Python to run the threads, we do not have to worry about concurrency issues when communicating data, since only one thread runs at a time. The controlling thread will first tell the location thread to gather location data and put that data into a buffer. Then the second thread, which interprets the location data and communicates with the APIs, will be called. That thread will take the location data from the buffer and use it to determine what sort of feedback to give to the user. The feedback will then be put into a separate buffer and handed to the third thread by the controlling thread. The third thread will run the text-to-speech engine and output to the 3.5mm audio jack for the user to listen to. This process will continue until the user has reached their destination.
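
A minimal sketch of that control flow, with every function body a placeholder for the real implementation:

```python
import asyncio

async def gather_location(loc_buf):
    loc_buf.append((40.4428, -79.9431))        # stand-in for a real GPS read

async def interpret(loc_buf, fb_buf):
    lat, lon = loc_buf.pop()
    fb_buf.append(f"Fix at {lat:.4f}, {lon:.4f}")  # stand-in for API/feedback logic

async def speak(fb_buf):
    print(fb_buf.pop())                        # stand-in for the text-to-speech thread

def destination_reached():
    return False                               # placeholder end condition

async def controller():
    loc_buf, fb_buf = [], []
    while not destination_reached():
        # Only one coroutine runs at a time under asyncio, so the buffers
        # need no locks; the controller just sequences the three workers.
        await gather_location(loc_buf)
        await interpret(loc_buf, fb_buf)
        await speak(fb_buf)
        await asyncio.sleep(1.0)               # 1 Hz loop, matching the fix rate

asyncio.run(controller())
```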

Next week I would like to hook up the components that have been ordered into the physical system. I would like to communicate with the components through the software and gather the location data necessary to know where the user is. I would also like to hook the software up to the Blues Wireless Notecard API to be able to communicate with our external directions API. If I get all of this done next week, I will be on schedule and able to begin collaborating with Zach on getting the skeleton of our entire system working. We would then hopefully have the hardware supplying data to the directions thread, which would allow us to work out the next steps towards a fully functional system.

Colin’s Status Report for 10/8

This week I worked on figuring out the compatibility of the parts that we want to use in the project. Initially, we thought about using two RPis, dedicating one to the front end and one to the back end. The only reason for this would be to make the GPS and cellular data card easier to hook up to the system. However, the increased development time and complexity of two RPis communicating with each other is not worth it. I did some research on the data card and the GPS and have determined that we can hook both of them up to one RPi. Since we are not going to be anywhere near the computational limits of the RPi, this seems like the most logical route to take. The GPS unit has the ability to change its I2C address, so we can run both the GPS unit and the cellular data card on the same I2C lines. An alternative, if problems arise, would be to communicate with the cellular data card via I2C and with the GPS via UART.

I also did research on the environment we will be running our tasks in. I originally contemplated scheduling 3 separate Python processes on the RPi kernel: one for GPS data gathering and filtering, another for the backend, and another for the audio output. However, communicating data to and from each process is not simple. An easier approach is a single Python process using asyncio to perform cooperative multitasking across the 3 threads. Since we are not bound by computational power, we do not need to utilize all of the cores on the RPi, and this makes data communication between the threads much simpler. We also do not have very hard RTOS requirements, so we do not need preemption if we work out a simple cooperative scheduling scheme. Any development time that can be taken away from the environment and put towards the main direction algorithm of the project will be very useful for us.

I am doing okay in terms of the schedule. I accomplished a lot of what I wanted to in terms of figuring out exactly which parts to use and how they will communicate with each other. I have ordered the RPi from the ECE inventory and have figured out what we will be running on it in terms of software. Something I would have liked to get done was to actually receive the RPi last week; however, I was not able to, and I will be doing so Monday morning.

Next week I need to get a few things done. The first is to set up Python on the RPi and start on the framework for all of our threads to communicate with each other. The most important goal for next week is to order all of the extra parts that we will need for the project: the GPS/IMU, the cellular data card, antennas for both of those parts, and some wires to hook the devices up to the RPi.

Colin’s Status Report for 10/1

This week our team altered our project: it will now provide directions along blind-friendly routes to aid the visually impaired. With Eshita dropping the class, Zach and I lack the machine learning knowledge needed to proceed with the prior design.

I will now be focusing on the front end of our system. I will be using a Raspberry Pi to gather data from a GPS unit to determine the user’s location. The SparkFun GPS-RTK Dead Reckoning pHAT appears to be a good unit for the project. It attaches to an RPi4 through the 40-pin header and is easily interfaced with over I2C. The unit contains a GPS receiver and an IMU to provide more accurate position readings when GPS signal is lost. It has a heading accuracy of within 0.2 degrees; however, it does not contain a magnetometer. It achieves this accuracy by relying on GPS movement combined with accelerometer readings. This may be a problem for us, given that our user may be standing still for a long period of time, and the heading reading will be prone to drift without the user moving in a direction. A solution would be to add a secondary board with a magnetometer to tell direction; however, this may not be necessary and would significantly increase the complexity of the unit, because we would no longer be able to use the pHAT 40-pin connector for the GPS alone and would have to connect both boards to the RPi, sharing the header.

I will also be taking commands from the back-end Pi to give directions to the user via audio. I will be using a text-to-speech engine to tell the user where to go and to give various status updates from the back-end Pi. The RPi4 comes with a 3.5mm audio jack capable of outputting audio to wired earbuds, through which the user will hear the directions.

I am currently behind schedule given that our team is re-designing the project; however, I am very happy with the new direction of the project. In the past day we have been focusing heavily on the project and will continue to do so in order to have a good design done by Sunday for the design review.

Colin’s Status Report for 9/24

This week I did research on all aspects of the hardware for the project. I wanted to tie all of the components together at a high level and see how they would interact in the project. In particular, I have decided to go with a BMA180 accelerometer to feed data into the Jetson to determine whether the user is stationary or walking. I can use a Python library for this particular accelerometer to get the data, or I can write a small C program to gather the data and run some algorithms to determine the state of the user. I figured it would be nice to easily gather the data using Python, given that we will be using Python for the object detection portion of the project and that the accelerometer data must be communicated to that portion. I believe that doing both in the same Python code would significantly increase both robustness and speed of development.

I have also been looking into cameras that can stream data to the Jetson, and I believe the SainSmart IMX219 would work well with a Jetson Nano, which is what we plan on using.

Currently, I am on track according to the schedule, given that for now all of us are working towards the design proposal, and the work that I have done this week all has to do with the hardware side of the design. My primary goal next week is to look into the options for audio communication to the user: whether or not they should cross the street and which direction to go. I would also like to receive a Jetson Nano within the next week and start to install Python/OpenCV on it. When setting up Python, I would also like to look into building a multi-threaded Python program that gathers the accelerometer data at regular intervals and communicates it to the thread that decides whether to run the walk-sign detection or the crosswalk detection.
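
A sketch of the stationary-versus-walking decision, assuming acceleration magnitudes have already been read off the accelerometer; the window length and threshold are guesses to be tuned against real BMA180 data:

```python
from statistics import pstdev

def is_walking(magnitudes, threshold=0.35):
    """Classify a short window of acceleration magnitudes (in g) as walking
    or stationary. Walking shows up as large variance around 1 g from the
    gait; standing still keeps the magnitude nearly constant. The 0.35 g
    threshold is a placeholder to be tuned on real data."""
    return pstdev(magnitudes) > threshold
```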