Kaya’s Status Report 03/22

Accomplishments this week:
This week, I configured the proper CV libraries to analyze our LiDAR camera data. I did this by setting up a virtual environment with the required versions (Python 3.7, OpenCV 4.11.0). Additionally, I configured our Jetson to analyze the distance data and wrote a script for this analysis.

Reflection on schedule:

We are slightly ahead of schedule now. We made substantial progress this week (object detection, distance detection, haptics).

We dedicated a lot of time to capstone this week, especially configuring the libraries. We had to research and reconfigure the Python and pip packages to match the needs of the LiDAR camera. After numerous attempts at installing pylibrealsense both from source and through pip, we concluded that we cannot use that library. After trying alternatives, we settled on cv2 as the best Python library for analyzing the camera data. Additionally, we came up with a separate command for getting the distance data.
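Since a depth frame loaded through cv2 (e.g. `cv2.imread(path, cv2.IMREAD_UNCHANGED)`) is just a NumPy array, the distance analysis can be sketched as below. The depth scale and pixel coordinates are assumptions for illustration, not our measured values:

```python
import numpy as np

# The depth scale is an assumption: the L515 reports roughly 0.25 mm
# per raw unit, but the exact value should be queried from the device.
DEPTH_SCALE_M = 0.00025

def depth_at(depth_image, x, y):
    """Distance in meters at pixel (x, y) of a raw 16-bit depth image."""
    return float(depth_image[y, x]) * DEPTH_SCALE_M

# Synthetic frame standing in for real camera data.
frame = np.full((480, 640), 4000, dtype=np.uint16)
center_m = depth_at(frame, 320, 240)  # about 1.0 m for this frame
```

On the real pipeline, `frame` would be the array decoded from the camera stream rather than a synthetic constant.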

Plans for next week:

Work with Cynthia on writing a script to connect the distance detection with the computer vision code. Additionally, I plan on working with Maya on configuring the force sensitive resistors.

Maya’s Status Report 3/22

Accomplishments this week:
This week, I set up the haptics and created a few sample patterns that we will be using with our Jetson. The haptic patterns for each obstacle type are demonstrated here.
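The pattern logic can be sketched as a simple lookup table; the obstacle names and pulse timings below are illustrative placeholders, not our final patterns:

```python
# Hypothetical lookup table: obstacle names and (on_s, off_s) pulse
# timings in seconds are illustrative, not our final values.
HAPTIC_PATTERNS = {
    "curb":     [(0.2, 0.1)] * 3,  # three short pulses
    "overhang": [(0.5, 0.2)] * 2,  # two long pulses
    "wall":     [(1.0, 0.0)],      # one sustained buzz
}

def pattern_for(obstacle):
    """Pulse sequence for an obstacle type; unknown types stay silent."""
    return HAPTIC_PATTERNS.get(obstacle, [])
```

A driver loop would walk the returned list, energizing the motor for each on-duration and pausing for each off-duration.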

Reflection on schedule:
We were a bit behind schedule at the beginning of the week because of compatibility problems between the Jetson and the L515, but we put in a lot of hours to recover. Personally, that included helping Kaya with the Jetson and L515 connections; I also set up the haptics and wrote the case statements that trigger each haptic pattern.

Plans for next week:
Over the next week, I will work on running the haptics from the Jetson and hopefully begin connecting the haptic responses to the CV code. I will also begin testing overall power consumption to make sure it stays within our 5 V, 30 W budget.

Cynthia’s Status Report 3/22

Accomplishments this week: I worked with my team to get the RealSense SDK visualizer showing the RGB and depth streams on the Jetson, and I wrote a Python script that captures the same streams without the visualizer.
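A minimal version of that kind of capture script, assuming the pyrealsense2 binding that ships with the SDK, might look like the sketch below; the resolution and frame rate are assumed values:

```python
def stream_settings(width=640, height=480, fps=30):
    """Resolution and frame rate used for both streams (assumed values)."""
    return width, height, fps

def capture_frames():
    # pyrealsense2 is imported lazily so this file also loads on
    # machines without the RealSense SDK installed.
    import pyrealsense2 as rs

    w, h, fps = stream_settings()
    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, w, h, rs.format.z16, fps)
    config.enable_stream(rs.stream.color, w, h, rs.format.bgr8, fps)
    pipeline.start(config)
    try:
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        color = frames.get_color_frame()
        if depth and color:
            # Distance in meters at the center pixel.
            print(depth.get_distance(w // 2, h // 2))
    finally:
        pipeline.stop()
```

`capture_frames()` would be called on the Jetson with the camera attached; off-device only the settings helper is usable.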

Reflection on schedule: We are a little behind schedule on the software, partly because we are doing things in a different order than planned. We ended up integrating the LiDAR camera with the Jetson this week, which works through the visualizer, but the library versions needed to view the stream from our own Python code still need further work.

Plans for next week: Over the next week I will be catching up on my object detection code by writing the code to obtain bounding boxes and getting it to work on the Jetson Nano.
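As groundwork for the bounding-box code, converting YOLO's normalized center-format boxes into pixel corners for drawing can be sketched with a generic helper (this is an illustrative utility, not our final detection code):

```python
def yolo_to_corners(cx, cy, w, h, img_w, img_h):
    """Convert a normalized YOLO box (center x, center y, width, height)
    into integer pixel corner coordinates (x1, y1, x2, y2)."""
    x1 = int((cx - w / 2) * img_w)
    y1 = int((cy - h / 2) * img_h)
    x2 = int((cx + w / 2) * img_w)
    y2 = int((cy + h / 2) * img_h)
    return x1, y1, x2, y2

# A box centered in a 640x480 frame covering half of each dimension.
corners = yolo_to_corners(0.5, 0.5, 0.5, 0.5, 640, 480)
```

The resulting corners are what `cv2.rectangle` expects when overlaying detections on the stream.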

Maya’s Status Report 3/15

Accomplishments this week:
This week we linked the Jetson and the L515 using the SDK viewer, which is pictured below, and we are continuing to set up the L515 on the Jetson with our current code instead of the viewer code.

Reflection on schedule:
We are on schedule for the most part, but we have a heavy workday on Sunday to finish our goals for this week, which includes starting to integrate the haptics with the Jetson.

Plans for next week:
Over the next week, we will be working on the haptic logic and making sure it integrates with the Jetson.

Kaya’s Status Report 3/15

Accomplishments:

This week, I successfully configured the LiDAR L515 camera with the Jetson Orin Nano. I downloaded all of the necessary libraries onto the Jetson, including the RealSense version compatible with our older camera. Additionally, I mapped out all of the GPIOs that we are going to use for our peripherals and wrote a script to turn each pin on.
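A sketch of that kind of pin script using the Jetson.GPIO library is below; the pin numbers and peripheral names are hypothetical placeholders for our actual wiring:

```python
# Hypothetical pin assignments -- the real numbers come from our wiring map.
PERIPHERAL_PINS = {
    "haptic_motor": 12,
    "force_sensor": 16,
}

def pin_for(peripheral):
    """Look up the BOARD-numbered pin assigned to a peripheral."""
    if peripheral not in PERIPHERAL_PINS:
        raise KeyError(f"no GPIO mapped for {peripheral!r}")
    return PERIPHERAL_PINS[peripheral]

def drive_pin_high(peripheral):
    # Jetson.GPIO is imported lazily so the pin map is usable off-device.
    import Jetson.GPIO as GPIO
    GPIO.setmode(GPIO.BOARD)
    GPIO.setup(pin_for(peripheral), GPIO.OUT)
    GPIO.output(pin_for(peripheral), GPIO.HIGH)
```

On the Jetson itself, `drive_pin_high("haptic_motor")` would energize that pin.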

Progress:

We are slightly behind schedule due to our group having a busy week with Greek Sing. We plan on doing extra work tomorrow to catch up and put us back on track.

Future deliverables:

I plan on working with Cynthia on building our YOLO computer vision algorithm. Additionally, I will assist Maya with connecting our peripherals to the breadboards.

Team Status Report 3/8

Risks:

We have yet to attempt connecting the Jetson and L515, so that is a potential risk we may face, but we will be trying to do that this week so that we have ample time to problem solve if it does not work initially.

Changes:

The only change we have made is a new power supply, based on our updated power calculations. We did not realize that our computer vision would require the Jetson to run in Super Mode, which draws an additional 10W beyond what we had originally planned for. We have found a new power source that supplies the required 5V at 6A.
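The budget arithmetic behind that change can be sanity-checked in a few lines; the load estimates below are illustrative assumptions, with the Jetson figure reflecting Super Mode's higher draw:

```python
SUPPLY_VOLTS = 5.0
SUPPLY_AMPS = 6.0

# Load estimates in watts are assumptions for illustration only.
LOADS_W = {"jetson_super_mode": 25.0, "haptics_and_sensors": 3.0}

budget_w = SUPPLY_VOLTS * SUPPLY_AMPS  # 30.0 W available from the supply
total_w = sum(LOADS_W.values())        # 28.0 W estimated draw
headroom_w = budget_w - total_w        # 2.0 W of margin
assert total_w <= budget_w, "power budget exceeded"
```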

A was written by Maya, B was written by Kaya and C was written by Cynthia.

Part A: Our cane addresses a global need for increased accessibility and independence for individuals with visual impairments. Around the world, millions of visually impaired people face mobility challenges that hinder their ability to safely navigate unfamiliar environments. The need for better mobility tools spans urban areas, rural villages, and developing areas, meaning it is not limited to any one country or region. Our design considers adaptability to different terrains and cultures, ensuring the cane can be valuable in settings from crowded malls to personal homes. By enhancing mobility and safety for people with visual impairments on a global scale, the product contributes to broader goals of accessibility, inclusivity, and equal opportunity.

Part B: Our cane addresses the fact that different cultures have varying perceptions of disability, independence, and accessibility. In communities with strong traditions of communal living, a single technologically advanced cane encourages seamless integration by drawing less attention and allowing users to maintain their independence. Additionally, the haptic feedback system produces no noise, further letting users blend in. By considering these cultural factors, our solution will allow for greater acceptance and integration into various societies.

Part C: We designed CurbAlert to take environmental factors into consideration, such as disturbing or interacting with the environment around the user. Specifically, the feedback mechanism (haptic feedback) was chosen to notify only the user without creating extra noise or light that would disturb the surrounding environment or people. Our object detection algorithm is designed to detect hazards without physically interacting with the user’s environment, contacting nothing besides the ground and the user’s hand. Additionally, our prototype will be robust and rechargeable, so the product generates no additional waste and a user will only ever need a single unit. By being considerate of the surrounding environment, CurbAlert is eco-friendly.

Maya’s Status Report 3/8

Accomplishments:

This week, we found out that our Jetson would consume more power than we had initially planned for, so I spent a lot of time researching portable chargers that meet our 5V, 6A power requirement. Kaya and I also worked to set up the Jetson Nano. Lastly, I did a lot of the final documentation and diagrams for our Design Review.

Progress:

We are on schedule now that we have finished the Design Report, our Jetson initialization, and L515 camera set up.

Future deliverables:

Our Adafruit order was delivered, so I will be able to start working on the haptic vibration motor and begin creating the logic for the different feedback patterns. Since Kaya and Cynthia will be working together on the software behind the computer vision, I plan to focus more on the Jetson and haptics.

Kaya’s Status Update 3/8

Accomplishments:

This week, I worked on finishing the initialization of the Jetson Nano. We got it fully displaying on a monitor, and I set up several ways to connect to the Jetson without a monitor, including SSH configuration and VNC Viewer for a virtual display. I also updated the Jetson operating system and installed JetPack. Lastly, I installed Jupyter and PyTorch and set up a way to remotely access Jupyter on the Jetson through a local browser.
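The remote-Jupyter setup can be sketched as an SSH tunnel; the hostname and username below are placeholders for our actual device:

```shell
# On the Jetson: start Jupyter without trying to open a browser.
jupyter notebook --no-browser --port=8888

# On the laptop: forward a local port to the Jetson's Jupyter port.
# "nano" and "jetson.local" stand in for our real user and hostname.
ssh -N -L 8888:localhost:8888 nano@jetson.local

# Then browse to http://localhost:8888 on the laptop.
```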

Progress:

We are on schedule now that we have finished the design report and have both our Jetson and L515 camera set up.

Future deliverables:

I plan on working on setting up the software/circuits for the force sensitive resistor and haptics. Additionally, I plan on working with Cynthia on the code for the computer vision.


Cynthia’s Status Report 3/8

Accomplishments:

This week I set up the L515 camera and obtained the depth stream along with the RGB stream. This took longer than expected because of compatibility issues between my computer's OS version, the SDK version, and our outdated camera. A picture of the obtained streams is attached below. Additionally, I spent time improving our design report draft and adding more diagrams.

Progress:

We are on schedule now that we have finished the design report and have both our Jetson and L515 camera set up.

Future deliverables:

The week following spring break, I plan on writing code against the RealSense Software Development Kit to obtain a constant stream of distances from our camera to a grid of points I specify in the frame, and to start writing the depth filtering code.
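Sampling a grid of distances from a depth frame can be sketched with plain NumPy once a frame is in hand; the grid size and the synthetic frame below are placeholders for the live stream:

```python
import numpy as np

def sample_grid(depth_m, rows, cols):
    """Sample a depth frame (in meters) at an evenly spaced rows x cols grid."""
    h, w = depth_m.shape
    ys = np.linspace(0, h - 1, rows).astype(int)
    xs = np.linspace(0, w - 1, cols).astype(int)
    return depth_m[np.ix_(ys, xs)]

# Synthetic frame in place of a live camera stream: everything 2.5 m away.
frame = np.ones((480, 640)) * 2.5
grid = sample_grid(frame, 3, 4)  # 3x4 grid of distances in meters
```

On the real stream, `frame` would be refreshed every iteration and the grid re-sampled, giving the constant stream of distances described above.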