Team Status Report for 02/24/2024

This week was a productive week for our team. We finished the design presentation together and will look into the feedback and incorporate it! Right now, we are trying to order the Jetson and also get parts from our 18-500 kit! We are also waiting for access to the ImageNet database, which is out of our control right now. We hope that this turns out smoothly; if not, we will have to wait longer for the parts to arrive and for access to be granted before we can test things out. While we wait, we plan on making the most of our time by working on tasks that do not depend on materials or access.

 

We have no design or schedule changes right now.

Ryan’s Status Report for 02/24/2024

This week, I finished the design presentation and most of the code for the YOLO v4 network. I am still waiting for access to the ImageNet database, so in the meantime I plan to use Microsoft's Common Objects in Context (COCO) dataset. I have started compiling a small dataset from COCO to test the network and fine-tune parameters for training. I have also started researching different camera modules and hope to finalize one soon.
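As a rough illustration of how that mini dataset could be assembled, here is a minimal sketch using the pycocotools package; the annotation path, category names, and subset size are placeholder assumptions rather than our final choices.

```python
# Minimal sketch: build a small COCO subset for quick network tests.
# Paths, categories, and subset size below are placeholders.
import json
import random

from pycocotools.coco import COCO

ANN_FILE = "annotations/instances_train2017.json"   # placeholder path
CATEGORIES = ["person", "bicycle", "car", "dog"]     # placeholder outdoor classes
IMAGES_PER_CATEGORY = 50                             # placeholder subset size

coco = COCO(ANN_FILE)

subset_img_ids = set()
for name in CATEGORIES:
    cat_ids = coco.getCatIds(catNms=[name])
    img_ids = coco.getImgIds(catIds=cat_ids)
    subset_img_ids.update(random.sample(img_ids, min(IMAGES_PER_CATEGORY, len(img_ids))))

# Keep only the images and annotations that belong to the subset.
images = coco.loadImgs(list(subset_img_ids))
ann_ids = coco.getAnnIds(imgIds=list(subset_img_ids))
annotations = coco.loadAnns(ann_ids)

mini = {
    "images": images,
    "annotations": annotations,
    "categories": coco.loadCats(coco.getCatIds()),
}

with open("mini_coco.json", "w") as f:
    json.dump(mini, f)

print(f"Mini dataset: {len(images)} images, {len(annotations)} annotations")
```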

My progress is on schedule as of this week.

Next week, I hope to finish coding the neural network and also gain access to the ImageNet dataset. I also plan to train a model using the mini dataset to test the network. Finally, I will work on the design paper as well.

Oi’s Status Report for 02/24/2024

During this week, I focused on learning more about Swift's Core Bluetooth framework. I learned how to find peripherals near my central iOS device and how to connect to them successfully. I also researched ways to send data from the NVIDIA Jetson to the iOS app, which I plan to do over a Bluetooth connection with the help of several Python modules! I tested connectivity by using the iOS app on my iPhone to connect to my laptop, and I can now see some information from my laptop on my iPhone. I have also worked on creating layouts and pages to notify the user of different statuses (finding objects, connection lost, etc.).
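As one possible direction for the Jetson-to-app data transfer (not a finalized protocol), a small fixed-size binary payload would keep each message well under the default BLE characteristic size. The field layout below is purely illustrative, and the Bluetooth transport itself is not shown.

```python
# Illustrative sketch: pack a detection result into a compact binary payload
# suitable for a single BLE characteristic write (default ATT payload ~20 bytes).
# The field layout and object IDs are placeholders, not our final protocol.
import struct

# Format: 1-byte object class ID, 2-byte distance in cm, 1-byte confidence (0-100)
PAYLOAD_FORMAT = "<BHB"

def encode_detection(class_id: int, distance_cm: int, confidence_pct: int) -> bytes:
    """Pack one detection into 4 bytes for transmission over Bluetooth."""
    return struct.pack(PAYLOAD_FORMAT, class_id, distance_cm, confidence_pct)

def decode_detection(payload: bytes) -> tuple:
    """Unpack a payload back into (class_id, distance_cm, confidence_pct)."""
    return struct.unpack(PAYLOAD_FORMAT, payload)

# Example: hypothetical class 3 detected 150 cm away at 87% confidence.
msg = encode_detection(3, 150, 87)
print(len(msg), decode_detection(msg))  # 4 (3, 150, 87)
```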

I believe that my project is on schedule.

For next week, I plan on making sure that I can send a connection-request notification message to the peripheral and on finding more ways to send additional types of data to my iOS app. I will also be working on and expanding the UI/UX features of the iOS app. I also plan to work on the design paper due before spring break!

Ishan’s Status Report for 02/24/2024

This week, I worked on implementing multiple sensors with the Raspberry Pi after getting all the materials, like the SD card, sensors, and other connecting devices. I ran into some issues with inconsistent measurements from the sensors, so I have been looking into ways to mitigate these issues, as well as whether it would be better to switch to another device such as an Arduino.
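One mitigation I am considering is simple software smoothing. Below is an illustrative sketch of taking the median of several consecutive readings to discard occasional spikes; the read_distance_cm() stub is a stand-in for the real sensor read, and the sample count is a placeholder.

```python
# Illustrative sketch: reject occasional spikes from an ultrasonic sensor by
# taking the median of several consecutive samples. The read_distance_cm()
# stub below is a stand-in for the real sensor read on the Raspberry Pi.
import random
import statistics

def read_distance_cm() -> float:
    """Placeholder for a real sensor read; simulates a steady 120 cm object
    with occasional spurious spikes."""
    return 120.0 + random.gauss(0, 2) if random.random() > 0.1 else 400.0

def read_distance_filtered(samples: int = 5) -> float:
    """Take several quick readings and return the median to suppress outliers."""
    readings = [read_distance_cm() for _ in range(samples)]
    return statistics.median(readings)

if __name__ == "__main__":
    raw = [read_distance_cm() for _ in range(5)]
    print("raw readings:   ", [round(r, 1) for r in raw])
    print("filtered result:", round(read_distance_filtered(), 1))
```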

I am a little behind schedule as I need to start working on the Jetson and interfacing it with the Raspberry Pi, but I still have time to catch up before spring break. I’m going to order the Jetson by tomorrow and begin working on connecting the Jetson with the Raspberry Pi so that we’re ready to transfer data between the two devices. This shouldn’t be too difficult to complete as I’ve done the necessary research, so I’ll know how to proceed once I get the device.

Next week, I hope to complete the connection between the Raspberry Pi and the Jetson, as well as work out the inconsistent measurements from the ultrasonic sensors. Finally, I aim to begin researching filtering/prioritizing algorithms that we would run alongside our object detection model so that our device prioritizes objects that pose an immediate danger to our users.
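To make the idea concrete, here is a rough sketch of the kind of prioritization rule we might end up with; the thresholds, weights, and example objects are placeholders, since the actual algorithm is still being researched.

```python
# Rough sketch of a prioritization rule for detected objects. Thresholds and
# weights are placeholders; the real algorithm is still being researched.
from dataclasses import dataclass

SAFE_DISTANCE_CM = 300.0  # placeholder threshold

@dataclass
class TrackedObject:
    label: str               # class from the object detection model
    distance_cm: float       # latest ultrasonic reading
    prev_distance_cm: float  # reading from the previous cycle

def priority(obj: TrackedObject) -> float:
    """Higher score = more urgent. Far objects that are not approaching are ignored."""
    closing_speed = obj.prev_distance_cm - obj.distance_cm  # positive if approaching
    if obj.distance_cm > SAFE_DISTANCE_CM and closing_speed <= 0:
        return 0.0
    proximity = 1.0 / max(obj.distance_cm, 1.0)
    return proximity * (1.0 + max(closing_speed, 0.0) / 100.0)

objects = [
    TrackedObject("bicycle", distance_cm=180, prev_distance_cm=230),  # approaching
    TrackedObject("car", distance_cm=600, prev_distance_cm=650),      # far but approaching
    TrackedObject("bench", distance_cm=150, prev_distance_cm=150),    # static but close
]

# Announce the most urgent object first; drop anything with zero priority.
for obj in sorted(objects, key=priority, reverse=True):
    if priority(obj) > 0:
        print(f"{obj.label}: priority {priority(obj):.4f}")
```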

Ryan’s Status Report for 02/17/2024

This week, I finalized the YOLO v4 architecture for the neural network we will train as our object detection model. I also requested access to the ImageNet dataset and have decided on Microsoft's Common Objects in Context (COCO) dataset as a backup. I have also started coding parts of the YOLO v4 architecture and working on the design presentation.
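For context, the basic repeating unit in the YOLO v4 backbone (CSPDarknet53) is a convolution followed by batch normalization and a Mish activation. Below is a minimal sketch of that unit, assuming a PyTorch implementation, which is not yet finalized.

```python
# Minimal sketch of the convolution + batch norm + Mish block that YOLO v4's
# CSPDarknet53 backbone repeats throughout. Framework choice (PyTorch here)
# and exact layer parameters are still subject to change.
import torch
import torch.nn as nn

class ConvBnMish(nn.Module):
    """Conv2d -> BatchNorm2d -> Mish, the basic YOLO v4 building block."""

    def __init__(self, in_channels: int, out_channels: int,
                 kernel_size: int = 3, stride: int = 1):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              stride=stride, padding=kernel_size // 2, bias=False)
        self.bn = nn.BatchNorm2d(out_channels)
        self.act = nn.Mish()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.conv(x)))

# Quick shape check on a dummy 416x416 RGB input.
block = ConvBnMish(3, 32)
print(block(torch.randn(1, 3, 416, 416)).shape)  # torch.Size([1, 32, 416, 416])
```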

My progress is on schedule as of this week.

Next week, I hope to finish coding the neural network and also gain access to the ImageNet dataset. I also plan to start working on the design paper due before spring break.

Ishan’s Status Report for 02/17/2024

This week, I worked on setting up the Raspberry Pi with our ultrasonic sensors. I wrote code that continuously outputs distance readings from the ultrasonic sensors for objects of various sizes. These readings were output to my monitor, where I tested the sensors' functionality to see whether they adhered to our requirements.
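For reference, the measurement loop is along these lines. This is a simplified sketch that assumes an HC-SR04-style trigger/echo sensor and uses placeholder GPIO pin numbers, not our exact wiring.

```python
# Simplified sketch of the continuous distance read-out, assuming an
# HC-SR04-style trigger/echo ultrasonic sensor. Pin numbers are placeholders.
import time

import RPi.GPIO as GPIO

TRIG_PIN = 23  # placeholder BCM pin numbers
ECHO_PIN = 24

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG_PIN, GPIO.OUT)
GPIO.setup(ECHO_PIN, GPIO.IN)

def read_distance_cm() -> float:
    """Send a 10 us trigger pulse and convert the echo time to centimeters."""
    GPIO.output(TRIG_PIN, True)
    time.sleep(0.00001)
    GPIO.output(TRIG_PIN, False)

    pulse_start = time.time()
    while GPIO.input(ECHO_PIN) == 0:
        pulse_start = time.time()
    pulse_end = time.time()
    while GPIO.input(ECHO_PIN) == 1:
        pulse_end = time.time()

    # Sound travels ~34300 cm/s; the echo covers the distance twice.
    return (pulse_end - pulse_start) * 34300 / 2

try:
    while True:
        print(f"distance: {read_distance_cm():.1f} cm")
        time.sleep(0.2)
except KeyboardInterrupt:
    GPIO.cleanup()
```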

Yes, my progress is on schedule.

Next week, I hope to calibrate the ultrasonic sensors for objects of our choice and begin connecting the NVIDIA Jetson to the Raspberry Pi to transfer data between the two.

Oi's Status Report for 02/17/2024

For this week, I focused on creating the iOS app and getting the basic initial functions to work. I am focusing on connecting with and identifying peripheral devices via Bluetooth. I have also created an interactive UI wireframe in InVision Studio for our iOS app!

I believe that my progress is on schedule.

For next week, I hope to refine the design of the iOS app, make it much smoother, and improve the UI. I will also delve into how to connect the Jetson to the app and get data from the Jetson.

Team Status Report for 02/17/2024

This week was a productive week for our team. We have no design or schedule changes. However, there are a couple of risks that we have to manage. This week we requested access to ImageNet, a very comprehensive object detection dataset. However, the website is a little out of date, and we are unsure if access will be granted in time. Our backup is the COCO dataset. This upcoming week, we will also be working with the ultrasonic sensors, which are very fragile and can easily stop functioning as needed. We have ordered extra sensors in order to avoid any loss of progress.

Part A was written by Ishan, Part B by Oi, and Part C by Ryan.

Part A:

Our product solution will meet a need for public safety. Our system will enable visually impaired people to navigate around any impeding obstacles or objects that they could encounter in an outdoor setting. Our design will filter out objects that are not an immediate danger to the user, such as objects moving away or objects that are not directly impeding the user's direction of motion. This means that we will prioritize objects moving toward the user and any obstacles that are directly impeding the user. We will do this through the use of ultrasonic sensors that detect objects and measure their respective distances from the user. Then, our object detection model will tell the user what each object is so that they have a better idea of the obstacle they are encountering. In summary, the device will tell the user where an object is and what it is with respect to the user. This device is meant to accompany existing navigation aids like the walking stick, as most visually impaired people are comfortable using such aids to detect low-level objects. Our design will supplement devices like the walking stick by protecting against high-level and moving objects as well.

Part B:

Our headset takes into account the needs of the visually impaired community, gathered through interviews and user testing. We want to create a product that is responsive to their needs and helps them interact with and navigate their environment more easily by detecting and identifying obstacles around them. We plan to make our product adaptable to visually impaired users with different needs through calibration settings on the headset for different comfort levels. We hope that our tool will be accessible and also promote inclusivity.

Part C:

Our product is designed so that it is affordable for most users. This requires us to minimize the cost of the materials and the complexity of the build. Therefore, we are using low-cost components such as a Raspberry Pi, small ultrasonic sensors, a Jetson, and a custom iOS app. The app will be free for users. In addition, by using a Jetson to run our object detection model, we have minimized recurring cloud expenses (e.g., AWS). The components we have chosen also interact with each other easily, allowing us to simplify the product build and lower production costs.

Ryan’s Status Report for 02/10/2024

This week began with a focus on the Proposal Presentation. I worked with my team to refine the slides and rehearse the presentation. We met on Monday and Tuesday to rehearse, since we presented on Wednesday.

In addition, I focused on researching different neural networks to train an object detection algorithm and searching for good datasets of commonly found outdoor objects to train our model. From my research, the YOLO network seems well suited for fast object detection, and there are a few annotated datasets from Google that seem promising. I also spoke with the principal of a school for the visually impaired to better understand the needs of the visually impaired.

My progress as of now is on schedule. I will be finalizing the dataset and network architecture in the next few days and will start coding the network to test it using a smaller dataset.

Team Status Report for 02/10/2024

We spent this week mostly fine-tuning our proposal presentation and doing further research on the components of our project and how they will interact with each other. We also interviewed the principal at WPSBC (Western Pennsylvania School for Blind Children) to get a better understanding of what design preferences visually impaired people have with regard to guidance systems. Furthermore, we discussed the different ways visually impaired people prefer to receive information about their surroundings (e.g., vibrations, audio, etc.).

One of the risks that could jeopardize the success of the project is faulty or short-circuited components; to combat this, we plan to order multiple units of each material needed to ensure all components will function correctly.

No changes have been made to the existing design of the system as of now.

No changes have been made to our current schedule.