This repository contains a hand pose detector for Godot that detects poses on XRHandTracker sources.
Official releases are tagged and can be found here.
The following branches are in active development:
| Branch | Description | Godot version |
| --- | --- | --- |
| master | Current development branch | Godot 4.3-beta1+ |
Godot XRHandTracker data is generated by some XR systems such as OpenXR. This project contains assets capable of detecting standard and user-defined hand poses and firing signals when the user poses their hands in those configurations.
This system takes XRHandTracker information provided by the XR System and feeds it to a Pose Detector. The pose detector feeds into a Pose Controller which generates an XRControllerTracker to drive an XRController3D. Game functions can then be bound to the controller.
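For example, the hand-tracking data at the start of this chain can be inspected directly from a script. The sketch below assumes the Godot 4.3 default OpenXR hand-tracker name `/user/hand_tracker/left`; other XR systems may register their hand trackers under different names.

```gdscript
extends Node

# Sketch: fetch the left-hand XRHandTracker from the XRServer and confirm
# joint data is flowing. This is the same data the Pose Detector consumes.
func _process(_delta: float) -> void:
    var tracker := XRServer.get_tracker("/user/hand_tracker/left") as XRHandTracker
    if tracker and tracker.has_tracking_data:
        var palm := tracker.get_hand_joint_transform(XRHandTracker.HAND_JOINT_PALM)
        print("Left palm position: ", palm.origin)
```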
The following steps show how to add the Godot XR Hand Pose Detector to a project.
Ensure the existing project is configured with XR hand tracking. The demo project and main scene show how to do this for OpenXR.
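As a rough sketch, a minimal OpenXR startup in the main scene looks like the following; the hand-tracking extension itself is enabled in the project's XR settings rather than in code.

```gdscript
extends Node3D

# Sketch of a minimal OpenXR startup. Hand tracking must additionally be
# enabled in Project Settings (XR > OpenXR) for XRHandTracker data to appear.
func _ready() -> void:
    var xr_interface := XRServer.find_interface("OpenXR")
    if xr_interface and xr_interface.initialize():
        get_viewport().use_xr = true
```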
The addon files need to be copied to the `/addons/hand_pose_detector` folder of the Godot project.
Add Hand Pose Controller nodes into the scene - one for each hand. These will detect hand poses and drive XRController3D nodes.
Add XRController3D nodes for the virtual hand-driven controllers - one for each hand.
Configure the Hand Pose Controller nodes with:
- The XRControllerTracker name for the virtual controller
- The type of pose to drive (`Aim` is most widely supported)
- The hand pose action map for actions to trigger on the virtual controllers
- The XRHandTracker name for the hand
- The set of hand-poses to detect
Configure the XRController3D Virtual Controller nodes with the name of the XRControllerTracker for each hand.
If needed, connect the hand pose detector signals. The preferred approach is to generate actions using the hand pose action map, and then detect those actions as if generated by a standard XR controller.
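As a sketch of that preferred approach, the script below could be attached to one of the virtual XRController3D nodes and reacts to an action fired by the hand pose action map. The `pinch` action name is hypothetical; it must match an action defined in your action map.

```gdscript
extends XRController3D

# Sketch: treat the virtual controller like any other XR controller and
# respond to the actions generated by the hand pose action map.
func _ready() -> void:
    button_pressed.connect(_on_button_pressed)
    button_released.connect(_on_button_released)

func _on_button_pressed(action: String) -> void:
    if action == "pinch":  # hypothetical action name
        print("Pinch started on ", tracker)

func _on_button_released(action: String) -> void:
    if action == "pinch":
        print("Pinch ended on ", tracker)
```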
This section describes the process of creating custom hand poses. Additionally, the Creating Custom Hand Poses video walks through the process.
New hand poses can be made by creating new Hand Pose Resource instances.
Hand Pose Resources consist of:
- A Pose Name (reported in the pose detector signals)
- A Threshold (the minimum fitness required to report the pose)
- A Hold Time (a debounce time necessary to register the pose)
- A Release Time (a debounce time necessary to release the pose)
- A set of fitness functions to apply to each pose component
| Type | Description |
| --- | --- |
| Flexion | The angle (in degrees) of a finger's proximal joint curving into the palm to make a fist. |
| Curl | The curl (in degrees) of a finger from the proximal to the distal joints. |
| Abduction | The spread (in degrees) between two selected fingers. |
| Tip Distance | The distance (in millimeters) between the tips of two selected fingers. |
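To illustrate what these components measure, the sketch below computes one of them (the thumb-to-index Tip Distance) directly from the standard XRHandTracker joint transforms. The addon's own measurement code may differ in detail.

```gdscript
# Sketch of the Tip Distance measurement: the distance (in millimeters)
# between the thumb tip and index finger tip of a tracked hand.
func thumb_index_tip_distance_mm(hand: XRHandTracker) -> float:
    var thumb := hand.get_hand_joint_transform(XRHandTracker.HAND_JOINT_THUMB_TIP)
    var index := hand.get_hand_joint_transform(XRHandTracker.HAND_JOINT_INDEX_FINGER_TIP)
    # Joint transforms are reported in meters; convert to millimeters.
    return thumb.origin.distance_to(index.origin) * 1000.0
```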
The fitness function converts a measurement (degrees or millimeters) into a fitness value in the range 0..1, with 0 being a bad match and 1 being a perfect match. Two types of fitness function are supported:
- Smoothstep
- Range
The fitness of a Hand Pose is the product of the fitness of all the components.
The Smoothstep function transitions from 0 to 1 over the specified range. The parameters may be reversed to reverse the direction of the transition.
The Range function produces non-zero fitness only within a finite range of measurements.
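The sketch below shows plausible shapes for the two functions and how the component values combine into an overall pose fitness; the exact curves used by the addon may differ, and the parameter names are illustrative.

```gdscript
# Smoothstep: transitions from 0 to 1 over [from, to]. Swapping the
# parameters (from > to) reverses the transition.
func smoothstep_fitness(from: float, to: float, measure: float) -> float:
    var t := clampf((measure - from) / (to - from), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

# Range: non-zero only inside (min_value, max_value), peaking at the
# midpoint and falling back to 0 at the edges.
func range_fitness(min_value: float, max_value: float, measure: float) -> float:
    if measure <= min_value or measure >= max_value:
        return 0.0
    var mid := (min_value + max_value) * 0.5
    var half := (max_value - min_value) * 0.5
    return 1.0 - absf(measure - mid) / half

# The pose fitness is the product of all component fitnesses, so a single
# component with fitness 0 rejects the whole pose.
func pose_fitness(component_fitnesses: Array[float]) -> float:
    var fitness := 1.0
    for f in component_fitnesses:
        fitness *= f
    return fitness
```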
The inspect scene provided in the demo project can be used to view the flexion, curl, abduction, and tip-distance measurements of a hand, and to diagnose the fitness of each component of a selected hand pose.
Code in this repository is licensed under the MIT license.
This repository was created by Malcolm Nixon.
It is primarily maintained by:
For further contributors, please see CONTRIBUTORS.md.