Cacophony is a gesture detection system for Unity. It is designed for extensibility and speed when prototyping new ideas, and to reduce iteration time when building for reliability.
Examples are included which demonstrate hand gesture detection via Ultraleap Hand tracking, though the architecture is agnostic to the data source. With a little imagination you can use it with almost anything...
Cacophony breaks down the process of building gesture detection for applications into three main parts:
- Detecting a gesture to initiate interaction
- Processing the action performed by the user to derive intent
- Facilitating clear reactions to the user input by the application
To learn more about the thinking behind Cacophony you can read our short blog on the subject: A Cacophony of Gestures
Cacophony works as a drop-in package for Unity and has been tested with Unity 6.0. Previous versions of Unity should work, but have not been thoroughly tested.
- Open or create a new Unity Project
- Import Cacophony as a submodule or copy the files to a directory of your choice.
- Navigate to `Assets/Plugins` in your project and run `git submodule add https://github.com/5of12/cacophony`
- Follow the instructions at github.com/ultraleap/UnityPlugin to install the Ultraleap plugin
- IMPORTANT: In `Project Settings/Player`, add `ULTRALEAP` to the `Script Compilation > Scripting Define Symbols` section. This enables the Ultraleap-specific code required to process the data.
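The define works through conditional compilation: Ultraleap-dependent code is stripped from the build unless the symbol is set. A purely illustrative sketch (not the actual plugin source) of how such gating behaves:

```csharp
// Illustrative only: this is the pattern ULTRALEAP-gated code follows.
// Without ULTRALEAP in Scripting Define Symbols, the compiler skips it.
#if ULTRALEAP
using Leap;

public class UltraleapDependentBehaviour : UnityEngine.MonoBehaviour
{
    // Anything here can reference Ultraleap types safely, because it
    // only compiles when the plugin is installed and the symbol is set.
}
#endif
```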
To start out, you can open one of the provided example scenes in `cacophony/Examples/Scenes`.
Alternatively, starting from a new scene:
- Add an Ultraleap hand tracking provider from the Scene hierarchy right-click menu: `Ultraleap > Tracking > Service Provider (*)`
- On the Service Provider GameObject, add the `LeapHandConnector` component. This will take Ultraleap tracking data and convert it into a usable form for Cacophony.
- Add a `GestureManager` prefab to the scene from `cacophony/Prefabs`
- Select the `GestureManager`. In the inspector, there are two components: `Hand Gesture Manager` and `Hand Gesture Detector`.
  - On the `Hand Gesture Manager`, observe the selected `Action Processor`; you can choose a different action asset here
  - On the `Hand Gesture Detector`, observe the two options `Ready Gesture` and `Hand Gesture`; you can choose different gesture assets here
    - The `Ready Gesture` will be detected first; until it is detected, the action cannot trigger. It can be the same as the main gesture you want to detect.
    - The `Hand Gesture` is the main focus of the interaction. When detected, this will trigger action processing.
- Connect an Ultraleap tracking camera and press play. You should be able to observe the confidence changing on the gesture detector and the active state changing when the gesture is detected.
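If you prefer to drive the same setup from code, a minimal sketch might look like this (the `Cacophony` namespace is an assumption; `LeapHandConnector` and the `GestureManager` prefab are the ones named above):

```csharp
using UnityEngine;
using Cacophony; // assumed namespace; adjust to match the package

public class CacophonyBootstrap : MonoBehaviour
{
    public GameObject serviceProvider;      // the Ultraleap Service Provider in the scene
    public GameObject gestureManagerPrefab; // drag in cacophony/Prefabs/GestureManager

    void Start()
    {
        // Converts Ultraleap tracking data into a form Cacophony can use.
        serviceProvider.AddComponent<LeapHandConnector>();

        // Brings the gesture detector and action processor into the scene.
        Instantiate(gestureManagerPrefab);
    }
}
```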
To connect the gesture detection to application behaviour we need a Consumer, an object to receive the action events.
- Exit Play mode to return to Edit mode
- Create a new cube object in the scene and place it in front of the game camera.
- To add behaviour to your scene, create a new `GameObject` and add the `GestureConsumerUnityEvents` component.
- Connect the `OnGestureStart` event to the new cube in the inspector. Select the `GameObject > SetActive` option for the event and make sure the checkbox is checked.
- Connect the `OnGestureEnd` event to the new cube in the inspector. Select the `GameObject > SetActive` option for the event and make sure the checkbox is unchecked.
- Press play. You should now observe the cube being enabled when the gesture is first detected and disabled when the action completes.
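The same wiring can be done from code rather than the inspector. A minimal sketch, assuming `OnGestureStart` and `OnGestureEnd` are exposed as parameterless `UnityEvent` fields (check `GestureConsumerUnityEvents.cs` for the actual declarations):

```csharp
using UnityEngine;
using Cacophony; // assumed namespace

public class CubeToggle : MonoBehaviour
{
    public GestureConsumerUnityEvents consumer; // assign in the inspector
    public GameObject cube;

    void OnEnable()
    {
        // Mirror the inspector setup: show the cube when the gesture
        // starts, hide it again when the action completes.
        consumer.OnGestureStart.AddListener(ShowCube);
        consumer.OnGestureEnd.AddListener(HideCube);
    }

    void OnDisable()
    {
        consumer.OnGestureStart.RemoveListener(ShowCube);
        consumer.OnGestureEnd.RemoveListener(HideCube);
    }

    void ShowCube() => cube.SetActive(true);
    void HideCube() => cube.SetActive(false);
}
```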
Check out our Cacophony Playground for more examples and see how you can set up cacophony for different scenarios.
Cacophony is built around scriptable assets for defining the parts of the system you want to design specifically for your application. This makes iterating on the design of the interactions faster, with less time in code editors and waiting for recompilation. The assets can be created programmatically or in the editor.
- `Poses` are defined by values for a set of input data.
- `Gestures` are defined by a collection of positive and negative poses, and output a set of events.
- `Actions` are defined by constraints (e.g. movements), and output their own set of events.
Gesture Managers bring together an action and a gesture, providing them with data and controlling updates. They are the basic entry point to the system: the Gesture Manager provides the interface between Cacophony gestures and the application.
Consumers hook into the action events that are routed via the Gesture Manager.
These are optional components that demonstrate how to interface with gestures to achieve different effects. Examples of Consumers for Animation, Audio and Unity Events are found in `cacophony/GestureSystem/Consumers`:
- `GestureConsumerAnimator.cs`
- `GestureConsumerAudio.cs`
- `GestureConsumerUnityEvents.cs`
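Writing your own Consumer follows the same pattern as the bundled ones. A minimal sketch, assuming the manager exposes its action's events for subscription (the event names and signatures are assumptions; check the bundled consumers for the real API):

```csharp
using UnityEngine;
using Cacophony; // assumed namespace

// A bare-bones custom consumer in the spirit of the bundled examples:
// subscribe to the action events routed via the Gesture Manager and log them.
public class GestureConsumerLogger : MonoBehaviour
{
    public HandGestureManager manager; // assign in the inspector

    void OnEnable()
    {
        // Assumed event names, matching the action events documented
        // at the end of this README.
        manager.action.OnStart += HandleStart;
        manager.action.OnEnd   += HandleEnd;
    }

    void OnDisable()
    {
        manager.action.OnStart -= HandleStart;
        manager.action.OnEnd   -= HandleEnd;
    }

    void HandleStart(Vector3 position) => Debug.Log($"Action started at {position}");
    void HandleEnd(Vector3 position)   => Debug.Log($"Action completed at {position}");
}
```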
- Open the `Cacophony` menu from the `Project View` context menu
- Create a new `Hand Pose` asset
- Configure the sliders for each finger's Bend (rotation from the knuckles), Curl (rotation at the finger tips) and Splay (the spread of the hand) to define your pose
- If you want the pose to be detected only in a particular orientation, set the direction and normal vectors.
  - Set them to zero if you want the pose to work in any orientation
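For the programmatic route mentioned earlier, a pose asset can be sketched in code. The field names here (`direction`, `normal`, per-finger values) are assumptions inferred from the sliders above, not the confirmed API:

```csharp
using UnityEngine;
using Cacophony; // assumed namespace

public static class PoseFactory
{
    public static HandPose CreatePointingPose()
    {
        var pose = ScriptableObject.CreateInstance<HandPose>();

        // Hypothetical per-finger values: index extended, others curled.
        // pose.fingers[1].bend = 0f; // index straight at the knuckle
        // pose.fingers[2].curl = 1f; // middle finger fully curled
        // ...

        // Zero vectors mean the pose is detected in any orientation.
        pose.direction = Vector3.zero;
        pose.normal    = Vector3.zero;
        return pose;
    }
}
```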
- Open the `Cacophony` menu from the `Project View` context menu
- Create a new `Hand Gesture` asset
- Add the pose (or poses) that you want to detect to the list of Positive Poses
- Add the pose (or poses) you definitely don't want to detect to the list of Negative Poses
- Set the `Confidence Threshold` to an appropriate value.
  - If you have a single Positive Pose, a confidence value close to 1 will work
  - If you have Negative Poses, overall confidence will be much lower (close to zero), so reduce the threshold accordingly. (This might take some testing to find the right value)
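The same gesture can be assembled in code. A sketch, with list and field names assumed from the inspector labels above:

```csharp
using UnityEngine;
using Cacophony; // assumed namespace

public static class GestureFactory
{
    public static HandGesture CreateGesture(HandPose wanted, HandPose unwanted)
    {
        var gesture = ScriptableObject.CreateInstance<HandGesture>();
        gesture.positivePoses.Add(wanted);   // pose we want to detect
        gesture.negativePoses.Add(unwanted); // pose we want to reject

        // With a negative pose present, overall confidence sits nearer
        // zero, so the threshold goes well below 1 (tune by testing).
        gesture.confidenceThreshold = 0.3f;
        return gesture;
    }
}
```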
- Open the `Cacophony` menu from the `Project View` context menu
- Create a new `HoldTimeAction`, `MovementAction` or `PassthroughAction` asset
  - `HoldTimeAction` allows you to set the time a pose must be held and a distance. If the hand moves further than the distance, the timer is cancelled.
  - `MovementAction` allows you to define a distance, direction and angle of movement. While the gesture is held, a hand movement aligned with this will trigger the action. Moving outside the angle will cancel the action.
  - `PassthroughAction` relays the input events directly to matching output events, with an optional distance value for filtering out very small hand movements.
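In code, configuring these actions might look like the following sketch (field names are assumptions drawn from the descriptions above):

```csharp
using UnityEngine;
using Cacophony; // assumed namespace

public static class ActionFactory
{
    public static HoldTimeAction CreateDwellAction()
    {
        var hold = ScriptableObject.CreateInstance<HoldTimeAction>();
        hold.holdTime       = 0.5f;  // seconds the pose must be held
        hold.cancelDistance = 0.05f; // movement beyond this cancels the timer
        return hold;
    }

    public static MovementAction CreateSwipeAction()
    {
        var swipe = ScriptableObject.CreateInstance<MovementAction>();
        swipe.distance  = 0.15f;         // how far the hand must travel
        swipe.direction = Vector3.right; // direction of the movement
        swipe.angle     = 30f;           // tolerance; moving outside cancels
        return swipe;
    }
}
```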
- Add a new `GameObject`
- Add the `HandGestureManager` component
- Set the `Gesture` parameter to a `Hand Gesture` asset of your choice
- Set the `Action` parameter to an asset of your choice
The Gesture Manager will configure the gesture and action on start and trigger updates every frame. It needs to be supplied with hand data, otherwise it will always update with default values.
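A sketch of the same setup from code (field names are assumptions matching the inspector parameters above):

```csharp
using UnityEngine;
using Cacophony; // assumed namespace

public class ManagerBootstrap : MonoBehaviour
{
    public HandGesture gesture; // a Hand Gesture asset
    public HoldTimeAction action;

    void Start()
    {
        var manager = gameObject.AddComponent<HandGestureManager>();
        manager.gesture = gesture; // assumed field names matching the
        manager.action  = action;  // Gesture/Action inspector parameters

        // Without a data source (e.g. LeapHandConnector) feeding the
        // manager hand data, it will update with default values only.
    }
}
```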
Actions output the following events that can be subscribed to by a Consumer:
- `Start` to indicate the gesture has been detected and the action is starting. This is accompanied by the position at which the gesture started
- `Hold` to indicate the gesture is being held and the action is progressing, but has not yet completed. This is accompanied by the current source position
- `End` to indicate the action has been completed successfully. This is accompanied by the position at the time it completed
- `Cancel` to indicate that the gesture was ended before the action completed.
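Interpreted as C# subscriptions, and assuming `Start`, `Hold` and `End` each deliver a `Vector3` position as described (the delegate shapes are assumptions), a handler for the full set might look like:

```csharp
using UnityEngine;
using Cacophony; // assumed namespace

public class ActionEventHandler : MonoBehaviour
{
    public HandGestureManager manager; // assign in the inspector

    void OnEnable()
    {
        var action = manager.action; // assumed accessor for the action asset
        action.OnStart  += p => Debug.Log($"Start: gesture detected at {p}");
        action.OnHold   += p => Debug.Log($"Hold: current source position {p}");
        action.OnEnd    += p => Debug.Log($"End: action completed at {p}");
        action.OnCancel += () => Debug.Log("Cancel: ended before completion");
    }
}
```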

