You can set an animation to play when a specific face movement or gesture is detected. Here we’ll look at two examples:
You’ll use a patch representing the face tracker, and an interaction patch: the face tracker patch detects the movement of the face, and the interaction patch detects a specific facial movement or expression.
You’ll connect these to the Loop Animation and Transition patches to create the animation.
Other interaction patches can be used to create these effects too. You’d just remove the face tracker patch from the beginning of the patch graph.
To try this out, add:
Next, create a patch to represent the property you want to affect. The Position or Rotation properties work well, to either move the object along the X, Y and Z axes or rotate it. In this example, we’ll use Position. To add this patch:
Here’s how your graph will look:
The final step is to edit the values in the Transition patch to set the movement of the object. It’s worth experimenting with different values here. For this example, try setting:
The object will start at a position of 0 on the X, Y and Z axes, and move upwards along the Y axis:
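Conceptually, the Transition patch interpolates between its start and end values as its progress runs from 0 to 1. Here’s a minimal sketch of that idea in plain JavaScript — this is an illustration of the math, not the Spark AR API, and the end position (one unit up the Y axis) is an assumed value for the example:

```javascript
// Sketch of what a Transition patch computes (not Spark AR API):
// a linear interpolation between a start and an end value as
// progress runs from 0 to 1.
function transition(start, end, progress) {
  return start.map((s, i) => s + (end[i] - s) * progress);
}

const start = [0, 0, 0]; // X, Y, Z: the object's starting position
const end = [0, 1, 0];   // one unit up the Y axis (illustrative value)
const halfway = transition(start, end, 0.5); // the object's position mid-animation
```

At a progress of 0.5 the object sits halfway along the path, which is why changing the start and end values in the Transition patch changes the whole movement.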
You could try checking the box next to Mirrored in the Loop Animation patch to mirror the movement of the animation.
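To see what mirroring changes, here’s a rough sketch (again plain JavaScript, not the Spark AR API) of how a looping progress value behaves with and without it:

```javascript
// Sketch of a looping progress value. Without mirroring, progress jumps
// from 1 back to 0 each cycle; with mirroring, it plays forward then
// backward: 0 -> 1 -> 0.
function loopProgress(time, mirrored) {
  if (!mirrored) return time % 1;        // sawtooth: 0 -> 1, snap back to 0
  const cycle = time % 2;                // position inside a there-and-back cycle
  return cycle <= 1 ? cycle : 2 - cycle; // triangle wave: 0 -> 1 -> 0
}
```

With Mirrored checked, the object glides back to its start position instead of jumping there at the end of each loop.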
A patch that outputs a boolean signal tells the other patches in your graph whether or not something is happening. Use the same patches as in the example above to try this out.
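A boolean signal is re-evaluated continuously, so it stays true for every frame the expression is held rather than firing once. The sketch below illustrates that idea in plain JavaScript; the threshold and per-frame values are hypothetical, since in Spark AR the Smile patch does the actual detection:

```javascript
// Sketch of a boolean signal, evaluated once per frame.
// The signal is true for every frame the expression is held,
// not a one-off event.
function smileDetected(smileStrength) {
  return smileStrength > 0.5; // hypothetical threshold
}

const framesOfSmileStrength = [0.1, 0.6, 0.7, 0.2]; // illustrative values
const booleanSignal = framesOfSmileStrength.map(smileDetected);
// booleanSignal is [false, true, true, false]
```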
Find this information in Spark AR Studio:
You’ll see information about what the patch does, including its inputs and outputs:
From the menu in the Patch Editor, select a:
The Pulse patch will transform the continuous boolean signal from the Smile patch into a discrete event that fires at the moment the value changes.
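The idea behind this conversion can be sketched as simple edge detection — this is a conceptual illustration in plain JavaScript, not the Spark AR API:

```javascript
// Sketch of the Pulse idea: watch a boolean signal and fire a one-off
// event only on the frame the value changes, rather than on every
// frame the value is true.
function makePulse() {
  let last = false;
  return (value) => {
    const turnedOn = value && !last;  // fires once on a false -> true edge
    const turnedOff = !value && last; // fires once on a true -> false edge
    last = value;
    return { turnedOn, turnedOff };
  };
}

const pulse = makePulse();
const events = [false, true, true, false].map(pulse);
// turnedOn fires only on the second frame, when the smile begins
```

So even though the smile is held for several frames, downstream patches receive a single event when it starts and another when it ends.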
Disconnect the Smile and Loop Animation patches, and add the Pulse and Switch patches between them. Here’s how your graph should look:
The animation will now switch between starting and stopping each time a smile is detected.
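This start/stop behavior comes from the Switch patch holding an on/off state that each pulse flips. A minimal sketch of that idea (plain JavaScript, not the Spark AR API):

```javascript
// Sketch of the Switch idea: each incoming pulse flips a stored on/off
// state, so the first smile starts the animation and the next smile
// stops it again.
function makeSwitch() {
  let on = false;
  return function flip() {
    on = !on;
    return on;
  };
}

const animationEnabled = makeSwitch();
animationEnabled(); // first smile: true, animation starts
animationEnabled(); // second smile: false, animation stops
```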