
ClayReality Brings Your Hands into VR with Nothing but the Phone’s Camera

Review

Clay AIR

ClayReality launches an SDK for VR/AR game developers that tracks a user’s hands in real time and recognizes gestures from a built-in library. Users can now interact with VR experiences like never before, without a controller or any other special hardware.

Virtual reality is accessible to anyone with a smartphone and a mobile VR headset. But actually touching the virtual world has required buying bulky, expensive gear that still doesn’t feel very natural. Until now.

Starting today, any VR/AR app built with ClayReality’s SDK lets users see their hands and interact with virtual worlds using only the phone’s built-in camera. The billions of smartphones already in use can track a hand’s position in 3D and recognize gestures in real time with the hardware they already have, which means developers can bring a far more immersive experience to their VR and AR games.
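
As a rough sketch of what such an integration could look like on iOS, a developer might feed camera frames from AVFoundation into a hand tracker and react to recognized gestures. The article does not show ClayReality’s actual API, so the `HandTracking` protocol, `HandPose` struct, and `GestureController` class below are hypothetical stand-ins, not the real SDK.

```swift
import AVFoundation

// Hypothetical stand-ins for the ClayReality SDK types (illustrative only).
struct HandPose {
    var position: SIMD3<Float>   // estimated 3D position of the hand
    var gesture: String?         // name of a recognized gesture, if any
}

protocol HandTracking {
    // Estimates a hand pose from a single camera frame.
    func process(_ frame: CVPixelBuffer) -> HandPose?
}

final class GestureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let tracker: HandTracking

    init(tracker: HandTracking) {
        self.tracker = tracker
    }

    // Called for every frame the phone's built-in camera delivers.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let frame = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // The tracker estimates the hand's 3D position and checks the frame
        // against the gesture library; the game reacts to whatever it finds.
        if let pose = tracker.process(frame), let gesture = pose.gesture {
            print("Recognized gesture: \(gesture) at \(pose.position)")
        }
    }
}
```

The key point is that the only input is the camera frame itself; no depth sensor, controller, or external tracker appears anywhere in the loop.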

Most mobile motion-tracking systems that don’t rely on bulky hardware have failed, simply because it is hard to make such software efficient enough to run in real time. Clay’s patented Z Buffering technology, built on machine vision techniques and AI refined over a decade of development, is optimized to the point that gesture recognition works on any existing iPhone or mobile device, with nothing else required.

The SDK ships with a library of gestures, and developers can add new gestures themselves on a game-by-game basis. Latency between a gesture and the software’s recognition is only 4 ms, and the AI-based system keeps Clay’s software efficient in order to conserve CPU, battery, and memory.
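
To illustrate what a per-game gesture extension might look like, the sketch below registers a custom gesture against a window of recent 3D hand positions. `GestureLibrary`, `GestureDefinition`, and the closure-based matcher are assumptions for the sake of the example; a production SDK would more likely match against a trained model than a hand-written closure.

```swift
// Illustrative only: these types are not Clay's actual API.
struct GestureDefinition {
    let name: String
    // Matches over a short window of recent hand positions (meters).
    let matcher: ([SIMD3<Float>]) -> Bool
}

final class GestureLibrary {
    private var gestures: [GestureDefinition] = []

    // Developers can extend the built-in library on a game-by-game basis.
    func register(_ gesture: GestureDefinition) {
        gestures.append(gesture)
    }

    // Returns the first gesture whose matcher fires on the recent hand path.
    func recognize(path: [SIMD3<Float>]) -> String? {
        gestures.first { $0.matcher(path) }?.name
    }
}

// Example: a custom "push" gesture that fires when the hand moves toward
// the camera (decreasing z) by more than 10 cm over the window.
let library = GestureLibrary()
library.register(GestureDefinition(name: "push") { path in
    guard let first = path.first, let last = path.last else { return false }
    return first.z - last.z > 0.10
})
```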