Clay AIR’s latest Hand Tracking SDK now includes 3D hand models powered by inverse kinematics for a more immersive experience in virtual reality (VR).
The hand model update will enable Clay AIR’s partners to give users a greater sense of immersion, as users will be able to choose between multiple virtual hand avatar models.
In robotics and animation, the inverse kinematics (IK) algorithm enables a natural and realistic rendering of a leg, arm, or hand, by predicting accurate joint angles based on where an object is in space.
Inverse kinematics algorithms calculate the angles of the hand’s finger joints based on the position of the wrist. When applied to hand tracking, they let 3D and XR developers move the model’s hand to a target position and get a natural-looking posture, because the solver assigns each joint the angle appropriate for that hand position.
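To make the idea concrete, here is a minimal sketch of the classic analytic two-bone IK solve in 2D, the kind of calculation an IK system performs for each pair of finger segments. This is an illustrative textbook solver, not Clay AIR's implementation; the function name and segment lengths are assumptions.

```python
import math

def two_bone_ik(target_x, target_y, len1, len2):
    """Analytic two-bone IK in 2D (e.g., two finger segments).

    Given a target position for the fingertip, return the base and
    middle joint angles (in radians) that place the tip at the target.
    """
    dist = math.hypot(target_x, target_y)
    # Clamp so the target stays within reach of the two segments.
    dist = min(dist, len1 + len2 - 1e-9)

    # Law of cosines gives the bend at the middle joint...
    cos_mid = (dist**2 - len1**2 - len2**2) / (2 * len1 * len2)
    mid_angle = math.acos(max(-1.0, min(1.0, cos_mid)))

    # ...and the angle between the first segment and the target direction.
    cos_base = (dist**2 + len1**2 - len2**2) / (2 * len1 * dist)
    base_angle = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_base)))
    return base_angle, mid_angle
```

A full hand rig repeats a solve like this (in 3D, with joint limits) for each finger, which is how a single tracked position yields a natural-looking posture.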
Benefits of 3D Hand Models in VR: Greater Immersion For Virtual Reality Users
In augmented and virtual reality, immersion is created by the ability to interact seamlessly within a virtual environment, including the ability to see one’s own hands and body.
For example, if a user picks up a virtual object, they will expect to see their virtual hand close around the object with absolute accuracy. The new hand models allow this interaction to occur with realistic rendering to preserve immersion.
Clay AIR’s software already tracks 23 key points of interest on the knuckles and palm of a user’s hands in real-time, and now also uses inverse kinematics to animate the new virtual hand models.
The fluid and natural hand and finger movements maintain the user’s perception of being physically present in the non-physical worlds of virtual reality and augmented reality.
Hand Tracking and Gesture Recognition Capabilities
Clay AIR Hand Tracking and gesture recognition is used in augmented reality and virtual reality devices, automotive infotainment systems, and other touchless interfaces.
The technology comes with a ready-to-go library of up to 40 pre-determined hand gestures, and users may also add their own.
Users are able to interact with virtual content and navigate workflows in AR and VR as well as interact with touchless interfaces through air gestures such as grab, pinch, swipe, call, point, and more.
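As a rough illustration of how a gesture such as a pinch can be recognized from tracked keypoints, the sketch below checks the distance between the thumb tip and index fingertip. Clay AIR's actual API is not public; the `keypoints` structure, keypoint names, and the threshold value here are all illustrative assumptions.

```python
import math

# Assumed threshold (in metres) for treating thumb and index tips as touching.
PINCH_THRESHOLD = 0.03

def detect_pinch(keypoints):
    """Return True when the thumb tip and index tip are close enough to pinch.

    `keypoints` is a dict mapping joint names to 3D positions, e.g. as
    produced by a hand-tracking pipeline (names here are hypothetical).
    """
    return math.dist(keypoints["thumb_tip"], keypoints["index_tip"]) < PINCH_THRESHOLD
```

A production system would combine many such geometric features (joint angles, inter-point distances, motion over time) to distinguish a full library of gestures.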
“Immersion and interaction are incredibly important for high-stakes applications of AR and VR, such as medical, first-responder, safety officer, automotive, and flight training and simulations. They also strengthen the consumer value proposition with realistic interactions for social VR and gaming,” says Thomas Amilien, CEO at Clay AIR.
Key Industries: Augmented Reality, Virtual Reality, Automotive and Touchless Interactions
Clay AIR has three core products: ClayReality, for AR and VR interaction and navigation; ClayDrive, for in-car gesture controls; and ClayControl, for touchless control applications — a key product as Clay AIR turns interactions with public devices such as kiosks and displays into touch-free interfaces.
Recently, Clay AIR collaborated with Lenovo to bring native gesture recognition to the ThinkReality A6 augmented reality (AR) headset.
Clay AIR also partnered with Renault-Nissan-Mitsubishi to create their prototype in-car air gesture controls to increase safety and improve driving experiences.
The company is also working with Qualcomm to implement Clay AIR’s technology at the chipset level to simplify integrations and bring hand tracking and gesture controls to more AR and VR devices.
About Clay AIR
Clay AIR is a hardware-agnostic, proprietary software solution for hand tracking and gesture recognition with leading-edge performance. Using computer vision and artificial intelligence, it enables realistic interaction with the digital world across a variety of mobile, XR, and other devices.