Team Status Report 4/12

What are the most significant risks that could jeopardize the success of the
project? How are these risks being managed? What contingency plans are ready?

The most significant risk that could jeopardize the success of the project is the battery not working with the Jetson Orin Nano and possibly frying the Jetson. We performed extensive research to make sure this won't happen, but for the small chance that it does, we have backed up all of our code on GitHub and have all of the individual components working separately.

• Were any changes made to the existing design of the system (requirements,
block diagram, system spec, etc)? Why was this change necessary, what costs
does the change incur, and how will these costs be mitigated going forward?
• Provide an updated schedule if changes have occurred.
• This is also the place to put some photos of your progress or to brag about a
component you got working.

There have been no changes to the existing design. There have been no changes to the schedule.

We have finished assembling our cane!

 

Now that you have some portions of your project built, and entering into the verification and validation phase of your project, provide a comprehensive update on what tests you have run or are planning to run. In particular, how will you analyze the anticipated measured results to verify your contribution to the project meets the engineering design requirements or the use case requirements?

Validation

We plan on testing all of the main features individually and then together. This means testing the object detection on 50 different objects, testing the wall detection on 50 different walls, and testing the FSRs on 25 different surfaces. For the object and wall tests, the haptic feedback should trigger on at least 48 of the 50 tests in each category (96%). For the 25 different surfaces, we want the FSRs to detect the floor on 24 out of 25 of them (96%).

Additionally, we want the system to only have false positives <= 5% of the time.
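As a rough illustration of how we might tally these results, here is a minimal Python sketch; the trial log format and the example numbers are placeholders for illustration, not real data:

```python
# Sketch of the pass/fail analysis for the validation trials.
# Assumes each trial is logged as (test_name, detected, was_real_obstacle);
# the exact logging format is still to be decided.

def summarize(trials):
    """Return (detection_rate, false_positive_rate) for a list of trials."""
    real = [t for t in trials if t[2]]       # trials with a real obstacle/floor contact
    fake = [t for t in trials if not t[2]]   # control trials with nothing present
    detection_rate = sum(1 for t in real if t[1]) / len(real)
    false_positive_rate = sum(1 for t in fake if t[1]) / len(fake) if fake else 0.0
    return detection_rate, false_positive_rate

# Example (made-up data): 50 object trials with 48 detections -> 96%, above the 95% target
object_trials = [("obj%d" % i, i < 48, True) for i in range(50)]
rate, _ = summarize(object_trials)
print("Object detection rate: %.0f%%" % (rate * 100))
```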

Maya’s Status Report 4/12

Accomplishments this week:
This week, I wrote the code for the FSRs and soldered all of the connections together, which means our cane assembly is mostly complete. This included everything for the haptics, both FSRs, and their connections to the QT Py. My code currently prints whether each pressure pad is pressed or released, and we are working to integrate these responses with the computer vision.
The left image shows the feedback we get when pressing and releasing the FSRs. The right picture shows how all of our wiring is set up on the cane.
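For context, a simplified sketch of the kind of CircuitPython loop running on the QT Py is shown below; the pin assignments and polling delay are placeholders that may differ from our actual wiring, while the 1 V threshold matches the value described in the verification section:

```python
# CircuitPython sketch (QT Py): read both FSRs and print their state.
# Pin assignments and the polling delay are illustrative, not the final values.
import time
import board
import analogio

fsr_1 = analogio.AnalogIn(board.A0)   # first FSR (placeholder pin)
fsr_2 = analogio.AnalogIn(board.A1)   # second FSR (placeholder pin)

THRESHOLD_V = 1.0                      # ~1 V separates cane contact from no contact

def volts(pin):
    # AnalogIn.value is a 16-bit reading scaled to the 3.3 V reference
    return pin.value * 3.3 / 65535

while True:
    for name, pin in (("FSR1", fsr_1), ("FSR2", fsr_2)):
        state = "pressed" if volts(pin) > THRESHOLD_V else "released"
        print(name, state)
    time.sleep(0.1)
```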

Reflection on schedule:
We made a lot of important progress this week, and everything other than our power supply has finally come together. According to our schedule, our entire cane should be complete by Wednesday and we should begin usability testing, so I think we are on track as long as the barrel jack converter we ordered fixes our power issues.

Plans for next week:
Over the next week, I will be working on the power supply, and we will begin our usability testing. We will also begin working on our final presentation and report!

Verification:
FSRs:

• To verify the FSR system, I applied various pressures to the cane while logging voltage readings and observing whether they responded correctly to pressing and lifting.
• A threshold voltage of 1 V was chosen to distinguish between cane contact and non-contact, based on real-world walking pressure tests.
• When the voltage exceeds the threshold, the QT Py sends a serial signal “ON” to the Jetson to indicate ground contact and trigger the computer vision script.
• When the cane is lifted and pressure is removed, the QT Py sends “OFF”, and the Jetson pauses the object detection process to conserve resources (this handshake is sketched after this list).
• I will also be testing the accuracy and responsiveness of this signal transition by walking with the cane and confirming that the system correctly activates only when the cane is placed on the ground.
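A minimal sketch of the Jetson side of this handshake is shown below; the serial port path, baud rate, and the run_detection_frame() stand-in are placeholders for our actual CV code:

```python
# Sketch of the Jetson-side gating loop (pyserial).
# '/dev/ttyACM0', the baud rate, and run_detection_frame() are placeholders.
import serial

ser = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)
cane_on_ground = False

def run_detection_frame():
    pass  # stand-in for one iteration of our object/wall/stair detection

while True:
    line = ser.readline().decode(errors="ignore").strip()
    if line == "ON":
        cane_on_ground = True    # FSR voltage crossed the 1 V threshold
    elif line == "OFF":
        cane_on_ground = False   # cane lifted, pause detection to save power

    if cane_on_ground:
        run_detection_frame()
```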

Haptics:

• The haptics send the proper feedback based on which obstacle type is sent to them. We verified this by manually creating each obstacle type and confirming the correct response was output.

Overall:

• The QT Py subsystem is considered verified because it correctly detects cane contact, communicates with the Jetson, and produces haptic feedback reliably. We determined this by confirming that the haptic feedback matches the print statements shown on screen for object, wall, and stair locations.

 

 

Maya’s Status Report 3/29

Accomplishments this week:
This week, I integrated the haptics with the Jetson, cut the wood and materials for our cane, and helped Cynthia work out the specific cases for when objects should be detected and when to trigger the proper haptic signal.
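As a rough illustration of this case logic, here is a sketch of how the Jetson could map a detection to a haptic trigger code for the QT Py; the obstacle categories, distance cutoff, and single-character codes are assumptions for illustration only:

```python
# Illustrative sketch of the detection-to-haptics case logic on the Jetson.
# The categories, distance cutoff, and serial codes are placeholders.
import serial

HAPTIC_CODES = {"object": b"O", "wall": b"W", "stairs": b"S"}
ALERT_DISTANCE_M = 1.5   # only alert when the hazard is within this range (assumed value)

def maybe_alert(ser, obstacle_type, distance_m):
    """Send a haptic trigger code when a detected hazard is close enough."""
    if obstacle_type in HAPTIC_CODES and distance_m <= ALERT_DISTANCE_M:
        ser.write(HAPTIC_CODES[obstacle_type])

# Example usage:
# ser = serial.Serial("/dev/ttyACM0", 115200)
# maybe_alert(ser, "stairs", 1.2)
```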

Reflection on schedule:
We made a lot of progress as a group this week and were able to get many big steps put together. We are in a great spot for the interim demo and in good shape in terms of our schedule.

Plans for next week:
Over the next week, I will be working on testing each of the elements for power and current to make sure they are all safe before plugging them into the portable charger. I will also work to set up the pressure pads if time allows this week, but that is less important than the power consumption work.

Maya’s Status Report 3/22

Accomplishments this week:
This week, I set up the haptics and created a few sample patterns that we will be using with our Jetson. The haptic patterns for each obstacle type are demonstrated here.
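For context, a simplified sketch of how these per-obstacle patterns could be defined on the QT Py is below; the motor pin and pulse timings are placeholders, and the real patterns are the ones demonstrated above:

```python
# CircuitPython sketch of per-obstacle haptic patterns on the QT Py.
# The motor pin and the exact pulse timings are placeholders.
import time
import board
import digitalio

motor = digitalio.DigitalInOut(board.D2)   # vibration motor driver input (placeholder pin)
motor.direction = digitalio.Direction.OUTPUT

# Each pattern is a list of (on_seconds, off_seconds) pulses.
PATTERNS = {
    "object": [(0.2, 0.2)] * 2,   # two short buzzes
    "wall":   [(0.6, 0.2)],       # one long buzz
    "stairs": [(0.1, 0.1)] * 4,   # four quick pulses
}

def play(pattern_name):
    for on_s, off_s in PATTERNS[pattern_name]:
        motor.value = True
        time.sleep(on_s)
        motor.value = False
        time.sleep(off_s)
```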

Reflection on schedule:
We were a bit behind schedule at the beginning of the week because we had some compatibility problems between the Jetson and the L515, but we put in a lot of hours this week to recover. Personally, that included helping Kaya with the Jetson and L515 connections, and I also set up the haptics and created the case statements for when each haptic pattern is set off.

Plans for next week:
Over the next week, I will be working to run the haptics through the Jetson and hopefully begin to connect the haptic responses to the CV code, and I will begin testing the overall power consumption to make sure it stays under 30 W at 5 V.

Maya’s Status Report 3/15

Accomplishments this week:
This week we linked the Jetson and the L515 using the SDK viewer, which is pictured below, and we are continuing to set up the L515 on the Jetson with our current code instead of the viewer code.
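For reference, a minimal sketch of what opening the L515 depth stream from our own code looks like with pyrealsense2 is below; the resolution and frame rate shown are placeholders rather than our final settings:

```python
# Sketch of opening the L515 depth stream from our own code via pyrealsense2,
# instead of the SDK viewer. Resolution and frame rate are placeholders.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    # Distance (in meters) at the center pixel, as a quick sanity check
    print("Center depth:", depth.get_distance(320, 240))
finally:
    pipeline.stop()
```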

Reflection on schedule:
We are on schedule for the most part, but we have a heavy workday on Sunday to finish our goals for this week, which include starting to integrate the haptics with the Jetson.

Plans for next week:
Over the next week, we will be working on the haptic logic and making sure it integrates with the Jetson.

Team Status Report 3/8

Risks:

We have yet to attempt connecting the Jetson and L515, so that is a potential risk we may face, but we will be trying to do that this week so that we have ample time to problem solve if it does not work initially.

Changes:

The only change we have made is a new power supply due to our new power calculations. We did not realize that our computer vision would require the Jetson to be in Super Mode, which needs an additional 10 W beyond what we had originally planned for. We have found a new power source that supplies the required 5 V, 6 A.
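For reference, that supply works out to 5 V × 6 A = 30 W, which lines up with the 30 W ceiling we are targeting for the system's overall power consumption.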

A was written by Maya, B was written by Kaya and C was written by Cynthia.

Part A: Our cane addresses a global need for increased accessibility and independence for individuals with visual impairments. Around the world, millions of visually impaired people face mobility challenges that hinder their ability to safely navigate unfamiliar environments. The need for better mobility tools spans urban areas, rural villages, and developing areas, meaning it is not limited to any one country or region. Our design considers adaptability to different terrains and cultures, ensuring the cane can be valuable in settings from crowded malls to personal homes. By enhancing mobility and safety for people with visual impairments on a global scale, the product contributes to broader goals of accessibility, inclusivity, and equal opportunity.

Part B: Our cane addresses the fact that different cultures have varying perceptions of disability, independence, and accessibility. In communities with strong traditions of communal living, a single, technologically advanced cane encourages seamless integration by drawing less attention and allowing users to maintain their independence. Additionally, the haptic feedback system produces no noise, so users receive guidance discreetly without disturbing those around them. By considering these cultural factors, our solution will allow for greater acceptance and integration into various societies.

Part C: We designed CurbAlert with environmental factors in mind, particularly how the device interacts with and affects the environment around the user. Specifically, the haptic feedback mechanism was chosen to notify only the user, without creating extra noise or light or disturbing the surrounding environment or people. Our object detection algorithm detects hazards without physically interacting with the user's environment; the cane only needs to be in contact with the ground and the user's hand. Finally, our prototype will be robust and rechargeable, so the product generates no additional waste and a user will only ever need one of them. By being considerate of the surrounding environment, CurbAlert is eco-friendly.

Maya’s Status Report 3/8

Accomplishments:

This week, we found out that our Jetson would consume more power than we had initially planned for, so I spent a lot of time researching portable chargers that meet our 5 V, 6 A power requirement. Kaya and I also worked to set up the Jetson Nano. Lastly, I did a lot of the final documentation and diagrams for our Design Review.

Progress:

We are on schedule now that we have finished the Design Report, our Jetson initialization, and L515 camera set up.

Future deliverables:

Our Adafruit order was delivered, so I will be able to start working on the haptic vibration motor and begin creating the logic for the different feedback patterns. Since Kaya and Cynthia will be working together on the software behind the computer vision, I plan to focus more on the Jetson and the haptics.

Team Status Report 2/22

Risks:

Our most prominent risks right now are setting up the RealSense SDK and troubleshooting any connection or compatibility issues we may have with the Jetson.

Changes:

We initially had Maya and Kaya working together on most of the hardware components, but Kaya and Cynthia will now work together on more of the computer vision since we changed our implementation to use a more challenging algorithm, while Maya continues working on the hardware components. We are in a better spot with the Jetson than with the RealSense, and we are realizing that object detection will likely take longer than we planned, so we are giving that aspect more attention because many other parts of the project depend on the object detection being completed.

 

Maya’s Status Report 2/22

Accomplishments:

This week, we mostly completed our design report; we only have a few small details and tables to add and change before it is due on 2/28. My specific role was the use-case requirements, part of the design requirements, and a large part of the implementation plan. I also worked with Kaya to flash our Jetson for the first time.

Progress:

We are ahead of schedule in terms of our design report, and we are technically still on schedule for our technical work, but it feels like we need to make a lot of progress with the Jetson this week to stay on schedule.

Future deliverables:

In this upcoming week, we will be finishing our design presentation and finalizing all of the tables and diagrams that go with it. We will also be working a lot on the Jetson and hopefully beginning to look at the FSRs and haptic motors if they come in this week. This week needs to be much more tech-focused than documentation-focused so that we are in a good place after spring break.

Team Status Report 2/15

Risks:

The only risk would be if the software libraries we choose are not compatible with all of our devices.

Changes:

No changes in the plan yet. We plan on starting the initialization of our technical devices this week.

 

A was written by Cynthia, B was written by Maya, and C was written by Kaya.

A: With regards to public health, CurbAlert can improve the mental well-being of our users by increasing their confidence and independence with navigation, along with reducing the stress related to mobility challenges when both support and a white cane are needed. The safety of our disabled users will also be improved with CurbAlert, since its main purpose is to prevent falls and collisions by detecting obstacles and providing feedback with enough time to take action. So, the product provides both support, to help keep injured or elderly individuals from falling, and warning feedback, to prevent collisions and other injuries involving objects, walls, or stairs. By being a practical, everyday tool that provides independence and accessibility, our project improves overall welfare, ensuring that individuals with mobility impairments can navigate more freely and safely.

B: Our inspiration for this project was a family friend of mine who was recently injured and now has to walk with both a walking cane and a blind cane. She was expressing to me how difficult it is for her to live a normal life because she can't get around very well at all. After talking to my group about this project, I decided to call and get her opinion on possible design choices and what the most important features were for her. She said she wants the ability to navigate safely in unfamiliar environments without feeling overwhelmed or dependent on others for guidance. We believe our cane will be a form of advocacy for individuals who feel that they have lost their independence or ability to have any form of social life because of the limitations that two canes impose. We also believe that our cane can provide blind people with more workplace accessibility and more opportunities for employment.

C: Economically, most of the devices we are using are recycled from last semester, including the L515 camera and the Jetson Nano. Beyond these devices, we chose our cane based on affordability and usability, picking the cheapest cane that matches our needs. This will allow us to sell the cane at an affordable price. For production and distribution, we would plan on making these canes in bulk so that we could lower their price. Our main goal is to have our cane be accessible to people of any income.