Ishan’s Status Report for 02/24/2024

This week I worked on implementing multiple sensors with the Raspberry Pi after receiving all of our materials, including the SD card, sensors, and other connecting devices. I ran into some issues with inconsistent measurements from the sensors, so I've been looking into ways to mitigate them, as well as whether it would be better to switch to another device, such as an Arduino.
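
The main mitigation I have been exploring is software-side smoothing: sampling each sensor several times, discarding implausible jumps, and taking the median. A minimal sketch of the idea in Python (the `read_sensor` callable and the thresholds are hypothetical placeholders, not our final code):

```python
from statistics import median

def filtered_distance(read_sensor, last_value=None, samples=5, max_jump_cm=50.0):
    """Sample the sensor several times and return the median reading,
    rejecting samples that jump implausibly far from the last good value."""
    readings = []
    for _ in range(samples):
        value = read_sensor()  # one raw distance reading in cm
        if last_value is None or abs(value - last_value) <= max_jump_cm:
            readings.append(value)
    # Fall back to the last good value if every sample was rejected
    return median(readings) if readings else last_value
```

A median is more robust than a mean here because a single spurious echo skews the average but leaves the median largely untouched.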

I am a little behind schedule, as I need to start working on the Jetson and interfacing it with the Raspberry Pi, but I still have time to catch up before spring break. I'm going to order the Jetson by tomorrow and begin connecting it to the Raspberry Pi so that we're ready to transfer data between the two devices. This shouldn't be too difficult to complete, as I've done the necessary research and will know how to proceed once I get the device.

Next week, I hope to complete the connection between the Raspberry Pi and the Jetson, as well as work out the inconsistent measurements from the ultrasonic sensors. Finally, I aim to begin researching the filtering/prioritizing algorithms that we would run alongside our object detection model so that our device prioritizes objects that pose an immediate danger to our users.

Ryan’s Status Report for 02/17/2024

This week, I finalized the YOLOv4 architecture as the neural network for training our object detection model. I also requested access to the ImageNet dataset and have decided on Microsoft's Common Objects in Context (COCO) dataset as a backup. I have also started coding some of the YOLOv4 architecture and working on the design presentation.
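
To give a sense of what coding the architecture involves, here is a minimal sketch of the basic building block that YOLOv4's backbone stacks repeatedly: a convolution followed by batch normalization and the Mish activation. This assumes PyTorch and is illustrative, not our actual training code:

```python
import torch
import torch.nn as nn

class ConvBNMish(nn.Module):
    """Basic YOLOv4 building block: convolution + batch norm + Mish."""
    def __init__(self, in_channels, out_channels, kernel_size, stride=1):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              stride=stride, padding=kernel_size // 2, bias=False)
        self.bn = nn.BatchNorm2d(out_channels)
        self.act = nn.Mish()  # YOLOv4's backbone uses Mish rather than LeakyReLU

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

# Quick shape check on a dummy 416x416 RGB input
block = ConvBNMish(3, 32, kernel_size=3)
print(block(torch.randn(1, 3, 416, 416)).shape)  # torch.Size([1, 32, 416, 416])
```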

My progress is on schedule as of this week.

Next week, I hope to finish coding the neural network and to gain access to the ImageNet dataset. I also plan to start working on the design paper due before spring break.

Ishan’s Status Report for 02/17/2024

This week, I worked on setting up the Raspberry Pi with our ultrasonic sensors. I wrote code that continuously outputs distance readings from the ultrasonic sensors for various objects of different sizes. These readings were output to my monitor, where I tested the ultrasonic sensors' functionality to see whether they adhered to our requirements.
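
The readout loop follows the standard trigger/echo pattern for this kind of sensor. A simplified sketch of what the code does, assuming HC-SR04-style sensors and hypothetical GPIO pin assignments:

```python
import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24  # hypothetical BCM pin numbers
SPEED_OF_SOUND_CM_PER_S = 34300

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def measure_distance_cm():
    """Send a 10 microsecond trigger pulse and time the echo."""
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)

    pulse_start = pulse_end = time.time()
    while GPIO.input(ECHO) == 0:
        pulse_start = time.time()
    while GPIO.input(ECHO) == 1:
        pulse_end = time.time()

    # Sound travels to the object and back, so halve the round trip
    return (pulse_end - pulse_start) * SPEED_OF_SOUND_CM_PER_S / 2

try:
    while True:
        print(f"distance: {measure_distance_cm():.1f} cm")
        time.sleep(0.2)
finally:
    GPIO.cleanup()
```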

Yes, my progress is on schedule.

Next week, I hope to calibrate the ultrasonic sensors for objects of our choice and begin connecting the NVIDIA Jetson to the Raspberry Pi to transfer data between the two.

Oi’s Weekly Status Report for 02/17/2024

For this week, I focused on creating the iOS app and getting the basic initial functions to work. I am focusing on connecting with and identifying peripheral devices via Bluetooth. I have also created an interactive UI wireframe in InVision Studio for our iOS app!

I believe that my progress is on schedule.

For next week, I hope to improve the design of the iOS app, making the design much smoother, and also improve the UI. I will also delve into how to connect the Jetson to the app and get data from the Jetson.

Team Status Report for 02/17/2024

This week was a productive week for our team. We have no design or schedule changes. However, there are a couple of risks that we have to manage. This week we requested access to ImageNet, a very comprehensive object detection dataset; however, the website is a little out of date, and we are unsure whether access will be granted in time. Our backup is the COCO dataset. This upcoming week we will also be working with the ultrasonic sensors, which are fragile and can easily stop functioning as needed. We have ordered extra sensors to avoid any loss of progress.

Part A was written by Ishan, Part B by Oi, and Part C by Ryan.

Part A:

Our product solution will meet a need for public safety. Our system will enable visually impaired people to navigate around any impeding obstacles or objects they might encounter in an outdoor setting. Our design will filter out objects that are not an immediate danger to the user, such as objects moving away or objects that do not directly impede the user's direction of motion. This means that we will prioritize objects moving toward the user and any obstacles directly in the user's path. We will do this by using ultrasonic sensors to detect objects and report their respective distances from the user. Then, our image detection model will tell the user what each object is, giving them a better idea of the obstacle they are encountering. In short, the device will tell the user where and what an object is relative to the user. This device is meant to accompany existing navigation aids such as the walking stick, since most visually impaired people are comfortable using such aids to detect low-level objects. Our design will supplement aids like the walking stick by protecting against high-level and moving objects as well.
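
As a concrete illustration of this filtering logic, the sketch below keeps only nearby objects that are stationary or approaching and ranks them by distance. The obstacle format, thresholds, and labels are hypothetical placeholders, not our finalized design:

```python
def prioritize(obstacles, danger_radius_cm=150.0):
    """Rank obstacles by immediacy of danger.

    Each obstacle pairs a detected label with its current and previous
    distance readings (hypothetical format). Objects moving away or
    outside the danger radius are filtered out.
    """
    threats = []
    for obs in obstacles:
        approach = obs["prev_distance_cm"] - obs["distance_cm"]  # > 0 means closing in
        if obs["distance_cm"] <= danger_radius_cm and approach >= 0:
            threats.append(obs)
    return sorted(threats, key=lambda o: o["distance_cm"])  # closest threat first

ranked = prioritize([
    {"label": "bicycle", "distance_cm": 120.0, "prev_distance_cm": 180.0},
    {"label": "bench",   "distance_cm": 90.0,  "prev_distance_cm": 90.0},
    {"label": "car",     "distance_cm": 300.0, "prev_distance_cm": 250.0},  # receding, far
])
print([o["label"] for o in ranked])  # ['bench', 'bicycle']
```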

Part B:

Our headset takes in the needs of the visually impaired community through interviews and user testing. We want to create a product that is responsive to their needs and helps users interact with and navigate through their environment more easily by detecting and identifying the obstacles around them. We plan to make our product adaptable to visually impaired users with different needs through different calibration settings for the headset, suiting different comfort levels. We hope that our tool will be more accessible and will also promote inclusivity.
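
As an illustration, the calibration settings could reduce to a small per-user profile along these lines (the field names and defaults are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    """Hypothetical per-user calibration for the headset."""
    alert_distance_cm: float = 150.0  # how close an object must be to trigger an alert
    feedback_mode: str = "audio"      # "audio" or "vibration"
    speech_rate: float = 1.0          # playback speed for spoken object names
    volume: float = 0.8               # 0.0 to 1.0

cautious_user = UserProfile(alert_distance_cm=250.0, feedback_mode="vibration")
```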

Part C:

Our product is designed to be affordable for most users. This requires us to minimize the cost of the materials and the complexity of the build. Therefore, we are using low-cost components such as a Raspberry Pi, small ultrasonic sensors, a Jetson, and a custom iOS app. The app should be free for users. In addition, by using a Jetson to run our object detection model, we have minimized recurring cloud expenses (e.g., AWS). The components we have chosen also interact with each other easily, allowing us to simplify the product build and lower production costs.

Ryan’s Status Report for 02/10/2024

This week began with a focus on the Proposal Presentation. I worked with my team to refine the slides and rehearse the presentation. We met on Monday and Tuesday to rehearse, since we presented on Wednesday.

In addition, I focused on researching different neural networks for training an object detection algorithm and searching for good datasets of objects commonly found outdoors to train our model. From my research, the YOLO network seems well suited for fast object detection, and there are a few annotated datasets from Google that seem promising. I also spoke with the principal of a School for the Visually Impaired to better understand the needs of the visually impaired.

My progress as of now is on schedule. I will be finalizing the dataset and network architecture in the next few days and will start coding the network to test it on a smaller dataset.

Team Status Report for 02/10/2024

We spent this week mostly fine-tuning our proposal presentation and doing further research on the components of our project and how they'll interact with each other. We also interviewed the principal at WPSBC (Western Pennsylvania School for Blind Children) to get a better understanding of the design preferences visually impaired people have with regard to guidance systems. Furthermore, we discussed the different ways visually impaired people prefer to receive information about their surroundings (e.g., vibrations, audio).

One of the risks that could jeopardize the success of the project is faulty or short-circuited components. To combat this, we plan to order multiple units of each component to ensure everything functions correctly.

No changes have been made to the existing design of the system as of now.

No changes have been made to our current schedule.

Oi’s Weekly Status Report for 02/10/2024

For this week, I focused on making the presentation slides and practicing and rehearsing for the proposal presentation to my section.

I am currently researching and looking into ways to integrate the Jetson with the iOS app and learning more about Xcode for iOS app creation!

I believe that my progress is on schedule.

For next week, I hope to understand Xcode better and, if possible, create a basic functioning app.

Ishan’s Status Report for 02/10/2024

This week I focused on researching and understanding how to transmit data from the Raspberry Pi to the NVIDIA Jetson. A lot of data will be communicated from the ultrasonic sensors and camera, and I researched how this data will be transferred between the two platforms. As we will be ordering our materials this week, we wanted all group members to be prepared to start working on the project immediately, so we mainly focused on research and on getting a more tangible understanding of our project.

Furthermore, I began to look at different ways to communicate data between these two devices. For example, I looked into using WiFi, serial UARTs, sockets, and ethernet cables, and figured that, because we are dealing with sending real-time images and video, an ethernet cable may be the better option, as it is more reliable for handling large amounts of data.
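
As a rough sketch of what that link could look like, the snippet below length-prefixes each camera frame and streams it over a TCP socket, which is the kind of scheme I expect we would run over the ethernet cable. The IP address, port, and framing are hypothetical:

```python
import socket
import struct

JETSON_ADDR = ("192.168.1.2", 5005)  # hypothetical static IP on the ethernet link

def recv_exactly(conn, n):
    """Read exactly n bytes, or return None if the peer closed the connection."""
    data = b""
    while len(data) < n:
        chunk = conn.recv(n - len(data))
        if not chunk:
            return None
        data += chunk
    return data

def send_frame(sock, frame_bytes):
    """Length-prefix each frame so the receiver knows where it ends."""
    sock.sendall(struct.pack(">I", len(frame_bytes)) + frame_bytes)

def recv_frame(conn):
    header = recv_exactly(conn, 4)
    if header is None:
        return None
    (length,) = struct.unpack(">I", header)
    return recv_exactly(conn, length)

# Pi side (sender): sock = socket.create_connection(JETSON_ADDR)
# then send_frame(sock, jpeg_bytes) for each captured frame.
```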

My progress as of now is on schedule as I now know how the data communication will work between the Raspberry Pi and the NVIDIA Jetson. Next week, my goal is to test the ultrasonic sensors with various objects across different distances to gain a better understanding of the limitations of the ultrasonic sensors.