Leap sent this engineer a dev unit, so we all got to play with it, see what we thought of it, and see if we had any touchless UI app ideas that could be built. This unit was pre-production: it had no case or housing for the electronics, and it was clearly an early prototype. Nevertheless, it was exciting.
As with a 3D printer, your first encounter with this new type of technology, a touchless gesture-controlled UI, is memorable.
Here are screenshots I took on my iPad 3. They show the early pre-production Leap Motion hardware as well as the visualizer IDE environment they provide. The software screenshots give you a sense of how well the tracking works.
The amazing part: the tracking resolution on a Leap Motion is remarkable. Unlike a Microsoft Kinect (one of the only other touchless gesture-based UIs I've used), which works great at room scale, the Leap Motion controller excels at tracking minuscule points and movements, such as a pencil tip or the tips of your fingers.
Furthermore, from what I understand of the hardware that makes up the Leap, it is a breakthrough precisely because it doesn't rely on any super-special hardware. It simply uses two separate cameras in a stereoscopic arrangement and does all of the depth analysis and motion tracking in software. The Kinect, by contrast, uses clever special-purpose hardware to pull off its touchless UI trick: an infrared projector covers the entire room in a 3D grid of infrared dots (invisible to the naked eye), and when one of them is occluded, the Kinect knows which position in the grid is covered and infers movement trajectories and velocities that way.
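Leap hasn't published exactly how its software works, but the core geometric idea behind recovering depth from two cameras is simple: a point's apparent shift (disparity) between the two images is inversely proportional to its distance. Here is a minimal sketch of that math using a rectified pinhole-camera model; the focal length and baseline values are illustrative assumptions, not the Leap's actual specs.

```python
# Depth from stereo disparity for a rectified two-camera pair
# (simplified pinhole model; numbers below are made up for illustration).

def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Return depth z = f * b / d for a rectified stereo pair.

    focal_px     -- camera focal length, in pixels
    baseline_mm  -- distance between the two camera centers, in mm
    disparity_px -- horizontal shift of the same point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_mm / disparity_px

# A fingertip that appears 40 px apart between the two images, with an
# assumed 700 px focal length and a 40 mm baseline, works out to 700 mm away:
print(depth_from_disparity(700.0, 40.0, 40.0))  # 700.0
```

Note how nearby points produce large disparities, which is one intuition for why a short-baseline device like the Leap is so precise close up: small changes in fingertip depth translate into easily measurable pixel shifts.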
Room for improvement: figuring out how to track gestures that lie in a single plane. For example, tracking a thumbs-up or thumbs-down gesture is hard to pull off with the Leap.
One last thought: check out the Leap Motion intro video to see what all the hype is about. You can find it on their site here: https://www.leapmotion.com/