by Baek, Michael, Rishi, and Yassine



Perception provides information about the user's 3D surroundings through intuitive haptic feedback.

A wearable device that aims to let the visually impaired detect a wider range of obstacles and receive more accurate information about their surroundings.


Perception is a passive obstacle detection device for visually impaired people. Currently, visually impaired people primarily rely on a white cane to actively search for and avoid obstacles in their path. This approach has significant downsides: an obstacle is missed entirely if the cane never makes contact with it, and obstacles that are not connected to the ground (e.g. overhanging ceilings, open cabinets) cannot be detected at all. Perception aims to let the visually impaired detect a wider range of obstacles and receive more accurate information about their surroundings without forcing them to actively seek out obstacles in their environment. It does this by using the Structure depth sensing camera connected to an iPhone or iPad.

Another main goal is to make Perception as easy to use as possible. To that end, Perception gives haptic and audio feedback to alert the user to possible obstacles around them. This feedback is designed to be intuitive and to avoid interfering with the senses that the visually impaired have come to rely on. In addition, Perception capitalizes on the fact that many visually impaired people already own an iPhone or iPad; few additional parts are needed to start using Perception.

Competitive Analysis


UltraCane

UltraCane is a smart cane that uses ultrasound to notify the user of obstacles around the cane. With the UltraCane, the user must actively move and point the cane to find obstacles; with Perception this is unnecessary, because the user is notified of obstacles across their direction of navigation (left, forward, and right).


Handisco

Handisco augments a cane with a system that lets the user navigate to a destination, using a Bluetooth earpiece for feedback. While Perception does not provide a navigation system, it allows the user to be more passive in detecting obstacles while navigating.

Sunu Band

Sunu Band is a smartwatch that uses ultrasound to notify the user of obstacles in their path. The Sunu Band only gives feedback in the single direction the ultrasound sensor is pointing, whereas with Perception the user can navigate more passively without having to point the device directly at obstacles.


Functional Requirements

User Interface

  1. The user shall interact with the system via Perception’s iOS application.
  2. The user shall receive feedback regarding obstacles from a haptic feedback module.


Hardware

  1. The Structure depth sensing camera shall be mounted to the iOS device via either the iPad bracket or the custom Structure iPhone case, and connected through the iOS device's Lightning port.
  2. The haptic feedback modules shall be designed so as not to impede the Structure sensor's functionality.
  3. The iOS application shall provide obstacle feedback to the haptic feedback module via Bluetooth Low Energy (BLE).
  4. The Microcontroller Unit (MCU) shall receive communication from the iOS application, process it, and appropriately drive the vibration motors.
  5. The haptic feedback module shall be powered by a battery.


Software

  1. The iOS application shall provide the user with a way to start and stop obstacle detection.
  2. Obstacle detection shall cover a minimum of three directions (left, forward, and right).
  3. The iOS application shall provide a way to teach the user how to use the system.

Non-Functional Requirements

  1. The depth measurements and obstacle detection shall be communicated to the user in pseudo-real time.
  2. The system shall be wearable or holdable by the user with no external attachments, so as to be portable.


System Architecture

  1. The architecture is divided into three phases: perception, detection/computation, and feedback.
  2. In the perception phase, the Structure camera actively scans an indoor space and delivers depth data to the iOS device. In the detection/computation phase, detected obstacle points are categorized by direction: based on the data from Structure, the iOS application computes which zone each nearby obstacle point falls into. Finally, in the feedback phase, haptic feedback is delivered through the vibration motor matching the obstacle's zone, with vibration intensity varying with the user's proximity to the object.

Technical Specifications


Hardware

  1. Atmel SAM B11 MCU+BLE Module
    1. Vendor: Atmel
    2. Processor Speed: 26 MHz Max
    3. Memory: 128KB
    4. Storage: 256KB
  2. Structure Depth Sensing Camera
    1. Vendor: Occipital
    2. Camera Resolution: 640x480
    3. Min Sensing Distance: 25cm
    4. Max Sensing Distance: 3.5m+
    5. Frame rate: 30fps @ 480p
    6. Field of View: 58 degrees Horizontal, 45 degrees Vertical
  3. iPhone/iPad


Software

  1. iOS application for connecting to the Structure camera
  2. Structure SDK development
  3. Processing for haptic feedback communication


Communication

  1. Bluetooth Low Energy

Meet the Team

Baek Kyoum Kim

Michael Solomon

Rishi Ved

Yassine Mouline