Re-envisioning any surface as a point of interaction
18-549 Embedded Systems Capstone Project


Interacting with computers has evolved over decades of research and innovation. We started with physical switches, moved to two-dimensional input with the mouse, and today touchscreens are the primary method of interaction. Yet even with this increasing freedom, we are still bound by the location and size of the input device.

SlapScreen aims to remove that boundary.


Since we want to transform any surface into an interactive one, our concept device incorporates the basic modalities needed to sense input and convey information to the user. We use a depth-sensing camera to track the user's finger coordinates and a projector to display a screen on the surface. Competitors offer similar functionality; however, our device provides context-aware applications that suit the user's needs, based on information obtained from additional sensors. The device is packed into a relatively small form factor so users can carry it and place it anywhere.

Requirements and Specifications

    Technical Requirements

  • SlapScreen will project a graphical user interface onto a surface.
  • SlapScreen will detect the user’s finger location using a Leap Motion.
  • SlapScreen will support Bluetooth Low Energy and can be controlled from a mobile application.
  • SlapScreen will contain a surface learning board, which uses an RGB color sensor.

    Physical Requirements

  • SlapScreen will contain a Pico Pro projector, Leap Motion, Intel NUC, and a surface learning PCB.
  • SlapScreen can hook onto a vertical surface.
  • SlapScreen will be enclosed in a 3D-printed case.

    Interactive Requirements

  • Users can interact with the projected screen using a finger. We will start by implementing a single-finger interface (click & drag).
  • Users can also interact with SlapScreen through a companion mobile application.
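The single-finger click & drag interaction above can be sketched as a small state machine. This is an illustrative sketch only: it assumes the Leap Motion reports the fingertip's distance from the surface in millimeters, and the `TOUCH_MM` threshold is a made-up value, not one we measured.

```javascript
// Sketch of a single-finger click & drag state machine. Assumes each
// frame supplies the fingertip's distance from the surface in mm;
// TOUCH_MM is an illustrative threshold, not a calibrated value.
const TOUCH_MM = 10;

function createTouchTracker() {
  let touching = false;
  // Returns an event per frame: 'down', 'drag', 'up', or 'hover'.
  return function update(distanceMm) {
    if (distanceMm <= TOUCH_MM) {
      const event = touching ? 'drag' : 'down';
      touching = true;
      return event;
    }
    const event = touching ? 'up' : 'hover';
    touching = false;
    return event;
  };
}

// Example frame sequence: approach, touch, drag, lift.
const track = createTouchTracker();
console.log([50, 8, 5, 40].map(track)); // → [ 'hover', 'down', 'drag', 'up' ]
```

In practice the distance would come from the Leap's per-frame fingertip position after calibration against the surface plane; the state machine itself stays the same.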


  1. Pico Pro Projector: Display
  2. Leap Motion: Touch Detection (angled towards the surface)
  3. Intel NUC: Main Computing System
  4. Color Sensor: Track Surface
  5. Power Jack: TBD; device may be wall-powered or battery-powered
  6. Power Button

Interaction Diagram

The Intel NUC is the main computer of our system. A Leap Motion provides user interaction input over USB. Display output is sent over HDMI to a Pico Pro projector. A color sensor attached to a microcontroller collects data on the current surface; this data is then fed over a UART FTDI adapter to the NUC, which uses it to display context-specific applications to the user.
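The wire format between the surface-learning board and the NUC is not specified here. As an illustration, assume the microcontroller emits one comma-separated `R,G,B` line per reading over the FTDI UART; the NUC-side parsing might look like the following (the line format and `parseColorLine` helper are assumptions for this sketch):

```javascript
// Parse one "R,G,B" line from the surface-learning board (hypothetical
// wire format) into a color object, rejecting malformed or
// out-of-range input rather than letting serial noise through.
function parseColorLine(line) {
  const parts = line.trim().split(',').map(Number);
  if (parts.length !== 3 ||
      parts.some(v => !Number.isInteger(v) || v < 0 || v > 255)) {
    return null; // ignore corrupted serial data
  }
  const [r, g, b] = parts;
  return { r, g, b };
}

console.log(parseColorLine('255,255,255')); // → { r: 255, g: 255, b: 255 }
console.log(parseColorLine('garbage'));     // → null
```

On the real device, lines like these would arrive through a serial-port library's line-delimited reader; only the parsing step is shown here.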


Here are the major components required to create SlapScreen.

  • Intel NUC - Main Computing System
  • Pico Pro Projector - Display
  • Leap Motion - Depth Sensing Camera
  • Lithium Ion Battery - Rechargeable Battery*
  • Gyroscope - Orientation Detection
  • Speaker and Mic*
*Component not finalized.

Use Cases

The SlapScreen will provide an experience tailored to whatever surface it is currently attached to. This opens up countless possibilities: users can turn any surface into an interactive screen that meets their current needs. Example use cases include the following:

  • Smart Whiteboard

    Draw on any surface by simply using your finger.

  • Interactive Photo Frame

    Display your favorite photos in your office.

  • Clock

    Add a beautiful clock to your room with the ability to set it by simply touching the clock hands.

  • Media

    Select and play media directly from the device.

  • Presentations

    Present slides and interact with them directly on your wall.


A great deal of work has been done in this field, yet it remains a technology that is hard to get right. Here are a few related works we've come across.


InfoBulb

InfoBulb, a project by Chris Harrison, is a context-aware lightbulb that displays information about the objects below it.

Xperia Projector

Xperia Projector is a similar device that sits on a table and projects onto a wall and the table itself.

Laser Keyboards

Commonly seen at Brookstone, these laser keyboards are a more elementary version of our device.


Another project by Chris Harrison rescopes the target surface to the user's arm.

Our Work

Software The primary software running on the SlapScreen is written in JavaScript on Node.js, selected for its portability: we were easily able to test prototypes on several different devices, including a Raspberry Pi, using the same software stack. The primary applications ran as an Express.js server, accessible from any web browser. Bleno was used to receive commands over BLE from an iOS device and to update the application in real time. Serial communication, perspective transformation, and business logic were also written in a combination of Node.js and front-end JavaScript.
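The context-aware part of the business logic boils down to mapping a sensed surface to an application. The sketch below is a simplified illustration of that idea; the surface categories, thresholds, and `classifySurface` helper are hypothetical, not the values or names we shipped.

```javascript
// Pick a context-specific application from a sensed surface color.
// Classifies by overall brightness; the categories and thresholds
// here are illustrative placeholders.
function classifySurface({ r, g, b }) {
  const brightness = (r + g + b) / 3;
  if (brightness > 200) return 'whiteboard'; // bright surface: drawing app
  if (brightness < 60) return 'media';       // dark surface: media player
  return 'photoframe';                       // anything else: photo frame
}

console.log(classifySurface({ r: 250, g: 250, b: 245 })); // → 'whiteboard'
console.log(classifySurface({ r: 30, g: 30, b: 40 }));    // → 'media'
```

In the real system this decision would also consult the per-surface preferences the user sets from the mobile app.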

Mobile A mobile application was written in Swift 3 for iOS. It included the ability to set per-surface application preferences, change the active application, and modify the perspective transformation angle. All configuration of the SlapScreen was done over BLE using CoreBluetooth.

Perspective Transformation In order to minimize the height of the setup, we angled the projector so that it was not perpendicular to the surface. As a result, the displayed image suffered from the keystone effect: it appeared trapezoidal rather than rectangular due to the misalignment of the projector. To correct this, we applied a perspective transform to the view using a matrix transformation. The same correction applied to the input coordinates, which we had to transform from the absolute values the Leap Motion provides into the corrected screen coordinates.
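Concretely, a planar perspective transform can be expressed as a 3×3 homography matrix applied in homogeneous coordinates; the same kind of matrix (or its inverse) maps raw input coordinates into corrected screen coordinates. A minimal sketch, with an illustrative keystone-style matrix rather than our actual calibration values:

```javascript
// Apply a 3x3 homography H (row-major) to a 2D point using
// homogeneous coordinates: [x', y', w'] = H * [x, y, 1],
// then divide through by w'.
function applyHomography(H, [x, y]) {
  const xp = H[0][0] * x + H[0][1] * y + H[0][2];
  const yp = H[1][0] * x + H[1][1] * y + H[1][2];
  const w  = H[2][0] * x + H[2][1] * y + H[2][2];
  return [xp / w, yp / w];
}

// A keystone-style matrix: the bottom row's y-term makes the scale
// depend on y, which is what turns a rectangle into a trapezoid.
const H = [
  [1, 0, 0],
  [0, 1, 0],
  [0, 0.001, 1],
];
console.log(applyHomography(H, [100, 0]));   // → [ 100, 0 ]
console.log(applyHomography(H, [100, 500])); // w' = 1.5 → approx [ 66.7, 333.3 ]
```

On the display side, the same correction can be expressed as a CSS matrix transform on the rendered view, so the math only has to be computed once per calibration.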

3D Modeling & Casing A non-ECE aspect of our project is the design of the case. When designing it, we focused on housing all the components in as little space and height as possible, while keeping them accessible for modifications and providing adequate ventilation.


One of the biggest lessons we learned during development of the SlapScreen was the importance of extensive user testing for features such as touch. Because we initially tested touch only with our own hands before the final demo, touch performance suffered for users with different hand types. Once additional testers revealed these problems, we were able to quickly find their root causes and fix them for our public demonstration.

One area where our team suffered was a lack of diversity in skill sets. Most of us specialized in software, with little focus on hardware. That let us successfully create a perspective transformation algorithm, develop an iOS companion app, and build several feature-rich applications. However, while we excelled in software development, deliverables such as the hardware PCB were harder to meet. To maximize results, our team should have included one or two people specializing almost entirely in hardware.

Regarding our PCB design, we learned that to demo a minimum viable product with surface-learning capabilities, we were better off focusing on something simple like color sensing rather than attempting something like frequency resonance detection, which could be an entire research project in itself.

If we continue development of the SlapScreen, our next step would be to create a framework and software stack that lets anyone pair a Leap Motion with a pico projector and quickly build their own DIY touch surface. Due to the relatively high cost of pico projectors, this is not currently a device we could see targeting a mainstream market. However, there would be high value in enabling users who already own both a pico projector and a Leap Motion to quickly bootstrap their own touch devices.

Our Team

Alyssa Chang

Brandon Schmuck

Mark McElwaine

Nikhita Vanwari

Steven Connors