Clay VR has launched an SDK for VR/AR game developers that tracks a user’s hands in real time and recognizes a library of gestures. Users can now interact with VR experiences like never before, without a controller or any other special hardware.
Virtual reality is accessible to anyone with a smartphone and a mobile VR headset. But actually touching the virtual world has required bulky, expensive gear that still doesn’t feel very natural. Until now.
Starting today, any VR/AR app built with Clay VR’s SDK lets users see their hands and use them to interact with virtual worlds using only the phone’s built-in camera. For the first time, the billions of smartphones already in use can track a hand’s position in 3D and recognize gestures in real time with the hardware they already have, letting game developers bring a far more immersive experience to their VR and AR titles.
Most mobile motion-tracking systems that don’t rely on bulky hardware have failed, simply because making such software efficient enough to run in real time is hard. Clay’s patented Z Buffering technology combines complex machine vision techniques with AI refined over a decade of development, allowing gesture recognition to run on any existing iPhone or mobile device, with nothing else required.
The SDK ships with a library of gestures, and developers can add new gestures themselves on a game-by-game basis. Latency between a gesture and its recognition is just 4 ms, and the AI-based system is tuned to conserve CPU, battery, and memory.
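To illustrate what a per-game custom-gesture registry could look like, here is a minimal Swift sketch. Clay’s actual API is not shown in this release, so every name below (`GestureLibrary`, `Gesture`, `register`, `recognize`) is hypothetical, and the nearest-template matching is a simple stand-in for the real recognition pipeline.

```swift
import Foundation

// Hypothetical sketch only: all type and method names are invented for
// illustration and do not reflect Clay VR's actual SDK.

struct Gesture {
    let name: String
    // Simplified template: an ordered list of normalized fingertip coordinates.
    let template: [Double]
}

final class GestureLibrary {
    private var gestures: [String: Gesture] = [:]

    // Built-in gestures ship with the SDK; a game can register its own.
    func register(_ gesture: Gesture) {
        gestures[gesture.name] = gesture
    }

    // Return the registered gesture whose template is closest to the input,
    // using plain Euclidean distance as a stand-in for real recognition.
    func recognize(_ input: [Double]) -> String? {
        gestures.values
            .min(by: { distance($0.template, input) < distance($1.template, input) })?
            .name
    }

    private func distance(_ a: [Double], _ b: [Double]) -> Double {
        zip(a, b).map { ($0 - $1) * ($0 - $1) }.reduce(0, +).squareRoot()
    }
}

let library = GestureLibrary()
library.register(Gesture(name: "pinch", template: [0.1, 0.1, 0.0]))
library.register(Gesture(name: "wave",  template: [0.9, 0.2, 0.5]))
print(library.recognize([0.12, 0.08, 0.02]) ?? "none")
```

In this toy design, a new gesture is just a named template added at load time, which mirrors the release’s claim that gestures can be extended per game without extra hardware or tooling.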
“As newborns, it takes months before we can see properly, but we’re grasping and waving from the moment we’re born,” said Clay’s CEO Thomas Amilien. “Our hands are fundamental to how we experience reality. Adding this element to mobile VR, making it feel natural and most of all, accessible for everyone, changes everything about what’s possible in the industry.”
Clay’s multi-patented AI gesture recognition SDK is accurate up to 88 inches from the camera, with a CPU overhead of just 9% on an iPhone 7. It is currently available on iOS, with an Android beta running until Q3/Q4, when it will become compatible with all Android devices. E3 attendees can try Clay VR for themselves at Booth #2850 in the South Pavilion.