We wanted to explore ways of interacting with data using mid-air hand gestures. A large screen is a good medium for broadcasting information, but it lacks interactivity: there's no mouse or keyboard available. Still, viewers may want to control and change what is being displayed.

We created a prototype of air gesture control with a 3D depth camera, so that users can interact with the content using mid-air hand gestures.

We demonstrated the technology at Slush 2014 and had fun watching people try to figure out how the interaction works.

The setup consists of a 42" screen on a leg stand with an Asus Xtion Pro depth camera mounted on top of the screen. A mini PC attached to the back of the screen takes the input from the camera and sends an event to the UI whenever a gesture is recognised.
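To give an idea of the data flow, here is a sketch of what such a gesture event could look like as it is pushed to the UI. The message format below is purely illustrative, not the prototype's actual protocol; for that, see the technical documentation linked further down.

```cpp
#include <iostream>
#include <sstream>
#include <string>

// Hypothetical message format: the field names here are illustrative
// and not taken from the actual prototype.
std::string gestureEventJson(const std::string& gestureType,
                             float x, float y, float z) {
    std::ostringstream out;
    out << "{\"event\":\"gesture\","
        << "\"type\":\"" << gestureType << "\","
        << "\"position\":[" << x << "," << y << "," << z << "]}";
    return out.str();
}

int main() {
    // Prints e.g. {"event":"gesture","type":"wave","position":[120.5,-34.2,890]}
    std::cout << gestureEventJson("wave", 120.5f, -34.2f, 890.0f) << std::endl;
    return 0;
}
```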

We used OpenNI 2.2 and NiTE 2.2 for 64-bit Linux for skeleton tracking and gesture detection, and WebSockets for client-server communication. The complete technical documentation is available here.
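For the curious, here is a minimal sketch of a NiTE 2 gesture detection loop, modelled on the library's sample code; the prototype's actual implementation lives in the source repository linked below.

```cpp
#include <NiTE.h>
#include <cstdio>

int main() {
    // Initialise NiTE and create a hand tracker on the default device
    // (the Xtion Pro, when it is the only depth camera attached).
    if (nite::NiTE::initialize() != nite::STATUS_OK) return 1;

    nite::HandTracker tracker;
    if (tracker.create() != nite::STATUS_OK) return 1;

    // Ask NiTE to watch for its built-in wave and click gestures.
    tracker.startGestureDetection(nite::GESTURE_WAVE);
    tracker.startGestureDetection(nite::GESTURE_CLICK);

    // Runs until the process is killed, like the screen installation.
    while (true) {
        nite::HandTrackerFrameRef frame;
        if (tracker.readFrame(&frame) != nite::STATUS_OK) continue;

        const nite::Array<nite::GestureData>& gestures = frame.getGestures();
        for (int i = 0; i < gestures.getSize(); ++i) {
            if (!gestures[i].isComplete()) continue;
            const nite::Point3f& pos = gestures[i].getCurrentPosition();
            // This is where the prototype would push an event to the UI
            // over the WebSocket connection.
            printf("Gesture %d at (%.0f, %.0f, %.0f)\n",
                   (int)gestures[i].getType(), pos.x, pos.y, pos.z);
        }
    }
    return 0;
}
```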

The prototype was created in co-operation with the Department of Computer Science at the University of Helsinki.

We're sharing the source code under the MIT license. Go ahead and fork it!

We think it's pretty cool. Let us know at hello@screenful.me if you find it useful!