Use the hand tracker to create an effect that responds to someone's hand.
In Meta Spark Studio, there are two methods for creating effects that interact with a user’s hands:

- The hand tracker, which lets you track 3D objects to a user’s hand in the Scene panel.
- Hand bounding boxes, built with patches or scripting, which let your effect respond to the position and size of a hand.
In this article, we'll look at both methods of creating hand tracking effects.
Before you get started, it’s a good idea to switch the video in the Simulator to your computer’s webcam. This way, you can hold up a hand to see your effect working. Click Video in the toolbar, then select the name of your camera - for example, FaceTime HD Camera.
This method lets you easily track 3D objects to a user’s hand. To get started, add a Hand Tracker to the Scene panel.
To add a Hand Tracker:
To make a 3D object appear on the hand, make it a child of the hand tracker in the Scene panel. To do this, either:
In the example below, you’ll see that a 3D object titled Sphere is being tracked to the hand:
To learn more, follow along with our hand tracking effect tutorial.
When you've selected the hand tracker in the Scene panel, you'll be able to edit its properties in the Inspector on the right of your screen.
Property | Description |
---|---|
Layer | Assign the hand tracker to a layer, or create a new one. |
Visible | Clear this box to stop the hand tracker and any children from being rendered in the scene. |
Transformations | You can't change the position, scale or rotation of a hand tracker. These values are controlled by the position of the hand, as detected by the hand tracker. |
Interactions | Click Create to insert patches representing the hand tracker into the Patch Editor. |
Enable For | Choose the camera or cameras on a mobile device that you want to render the hand tracker and its children in. |
Instructions
If your effect has a hand tracker or hand tracking patches in it, the automatic instructions capability is added to your project by default. This means an instruction saying Hold up a hand will appear until a hand is detected by the camera. You can add custom instructions to your effect if you prefer.
Hand bounding boxes can be used to create effects that respond to both the position and size of a user’s hand. They plot an invisible rectangle around a selected hand, making the 2D coordinates of all four corners and the center of the rectangle — as well as its height and width — available for you to use in your effect.
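The geometry the bounding box exposes can be sketched in plain JavaScript. This is an illustration only, not Spark's reactive API; the `boundingBoxPoints` helper is hypothetical, assuming a top-left origin and normalized (0–1) coordinates as described above.

```javascript
// Illustrative sketch (plain JavaScript, not Spark's reactive API).
// Given a bounding box in normalized coordinates, derive the four
// corners and the center that the bounding box makes available.
function boundingBoxPoints(box) {
  const { x, y, width, height } = box; // top-left origin, values in 0..1
  return {
    topLeft:     { x: x,         y: y },
    topRight:    { x: x + width, y: y },
    bottomLeft:  { x: x,         y: y + height },
    bottomRight: { x: x + width, y: y + height },
    center:      { x: x + width / 2, y: y + height / 2 }
  };
}

const points = boundingBoxPoints({ x: 0.2, y: 0.3, width: 0.4, height: 0.2 });
console.log(points.center); // center is approximately { x: 0.4, y: 0.4 }
```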
Follow our tutorial to build an effect using this method.
To create effects using a hand bounding box, we’ll need to add a series of patches to our project:
If you are tracking two hands, your patches will look like this:
Each hand detected by the Hand Finder is given a unique index number. This number (0 or 1) is assigned based on the order in which hands appear in the scene.
You’ll notice that by default, both your Hand Select patches will have an index of 0. To track both hands, change the index value on one of your Hand Select patches to 1, as in the example below:
Next, we’ll be adding the Hand Bounding Box patch to plot the bounding box:
If you are tracking two hands, your patches will now look like this:
Finally, we’ll be adding the Bounding Box 2D Unpack patch. This patch turns the coordinates and values generated by the Hand Bounding Box patch into outputs. We can then use those outputs to create our hand tracking effect.
To add the Bounding Box 2D Unpack patch:
If you are tracking two hands, your patches will look like this:
You’ll see that the Bounding Box 2D Unpack patches have outputs for each value calculated by our Hand Bounding Box patches.
In the example below, two rectangles have been added to the scene. Their width, height and position are determined by the outputs of our two Bounding Box 2D Unpack patches.
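The scaling the patches perform can be sketched in plain JavaScript. This is a hedged illustration, not the Patch Editor itself; the `rectFromBoundingBox` helper and the 390×844 screen size are assumptions for the example.

```javascript
// Illustrative sketch (plain JavaScript, not the Patch Editor).
// The Unpack outputs are normalized (0..1), so scale them by the
// screen size to position and size a rectangle in pixels.
function rectFromBoundingBox(box, screenWidth, screenHeight) {
  return {
    x: box.x * screenWidth,
    y: box.y * screenHeight,
    width: box.width * screenWidth,
    height: box.height * screenHeight
  };
}

// A hand covering the left quarter of a 390x844 screen:
const rect = rectFromBoundingBox({ x: 0, y: 0.25, width: 0.25, height: 0.5 }, 390, 844);
console.log(rect); // { x: 0, y: 211, width: 97.5, height: 422 }
```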
You can use the HandTracking API to create hand tracking effects via script.
The example below shows the implementation of the Hand Movement template. Download the project file here.
```javascript
// Load in the required modules
const Scene = require('Scene');
const Reactive = require('Reactive');
const CameraInfo = require('CameraInfo');
const HandTracking = require('HandTracking');

// Get the screen size dimensions
const previewSizeWidth = CameraInfo.previewSize.width.div(CameraInfo.previewScreenScale);
const previewSizeHeight = CameraInfo.previewSize.height.div(CameraInfo.previewScreenScale);

// Create a reference to the detected hands
const hand0 = HandTracking.hand(0);
const hand1 = HandTracking.hand(1);

// Create a reference to the number of tracked hands
const handCount = HandTracking.count;

const TEXTURE_SCALE = 1.2;

(async function() { // Enable async/await in JS [part 1]

  // Locate the rectangles in the scene
  const [rect_hand0, rect_hand1, rect_betweenhands] = await Promise.all([
    Scene.root.findFirst('rect_hand0'),
    Scene.root.findFirst('rect_hand1'),
    Scene.root.findFirst('rect_betweenhands')
  ]);

  // Create references to the hand bounding boxes
  const bb0 = hand0.boundingBox;
  const bb1 = hand1.boundingBox;

  // Find the top left of the bounding boxes relative to the screen size
  const scaledX0 = bb0.x.mul(previewSizeWidth);
  const scaledY0 = bb0.y.mul(previewSizeHeight);
  const scaledX1 = bb1.x.mul(previewSizeWidth);
  const scaledY1 = bb1.y.mul(previewSizeHeight);

  // Bind the position of the rectangles to the bounding box positions
  rect_hand0.transform.x = scaledX0;
  rect_hand0.transform.y = scaledY0;
  rect_hand1.transform.x = scaledX1;
  rect_hand1.transform.y = scaledY1;

  // Bind the rectangle dimensions to the bounding box dimensions
  // relative to the screen size
  rect_hand0.width = bb0.width.mul(previewSizeWidth);
  rect_hand0.height = bb0.height.mul(previewSizeHeight);
  rect_hand1.width = bb1.width.mul(previewSizeWidth);
  rect_hand1.height = bb1.height.mul(previewSizeHeight);

  // Show each rectangle only when its respective hand is tracked
  // To test this, preview on a device
  rect_hand0.hidden = hand0.isTracked.not();
  rect_hand1.hidden = hand1.isTracked.not();

  // Find the midpoint between the hands
  const scaleRect = rect_betweenhands.width.div(2.0);
  const midPosX = bb0.center.x.add(bb1.center.x).div(2.0).mul(previewSizeWidth).sub(scaleRect);
  const midPosY = bb0.center.y.add(bb1.center.y).div(2.0).mul(previewSizeHeight).sub(scaleRect);

  // Find the distance between the hands, and allow for a
  // scale factor depending on the texture
  const scaled0 = Reactive.pack2(scaledX0, scaledY0);
  const scaled1 = Reactive.pack2(scaledX1, scaledY1);
  const handDistance = scaled0.distance(scaled1);
  const scaledHandDistance = handDistance.mul(TEXTURE_SCALE);

  // Set position and scale of the rectangle between the hands
  rect_betweenhands.transform.x = midPosX;
  rect_betweenhands.transform.y = midPosY;
  rect_betweenhands.transform.scaleX = rect_betweenhands.transform.scaleY = scaledHandDistance;

  // Show the rectangle between hands only when both hands are tracked
  // To test this, preview on a device
  rect_betweenhands.hidden = handCount.lt(2);

  // Determine left and right hands
  const leftHandCenterX = Reactive.min(bb0.center.x, bb1.center.x);
  const rightHandCenterX = Reactive.max(bb0.center.x, bb1.center.x);
  const leftHandCenterY = Reactive.lt(bb0.center.x, bb1.center.x).ifThenElse(bb0.center.y, bb1.center.y);
  const rightHandCenterY = Reactive.gt(bb0.center.x, bb1.center.x).ifThenElse(bb0.center.y, bb1.center.y);

  // Calculate and set angle between hands
  const angleBetweenHands = Reactive.atan2(
    leftHandCenterX.sub(rightHandCenterX),
    leftHandCenterY.sub(rightHandCenterY)
  );
  rect_betweenhands.transform.rotationZ = angleBetweenHands.add(Math.PI / 2);

  // Mirror left hand image
  rect_hand0.transform.scaleX = bb0.center.x.eq(leftHandCenterX).ifThenElse(1, -1);
  rect_hand1.transform.scaleX = bb1.center.x.eq(leftHandCenterX).ifThenElse(1, -1);
})(); // Enable async/await in JS [part 2]
```
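The midpoint, distance and angle logic in the script can be sketched with plain numbers. This is an illustration under assumptions: the `betweenHands` helper is hypothetical, and in the real effect these values are reactive signals, not plain numbers.

```javascript
// Plain-JavaScript sketch of the geometry used in the script above:
// midpoint, distance and angle between two hand centers.
function betweenHands(center0, center1) {
  const midX = (center0.x + center1.x) / 2;
  const midY = (center0.y + center1.y) / 2;
  const distance = Math.hypot(center1.x - center0.x, center1.y - center0.y);
  // Same argument order as the script: atan2(dx, dy)
  const angle = Math.atan2(center0.x - center1.x, center0.y - center1.y);
  return { midX, midY, distance, angle };
}

// Two hands 200px apart, level with each other:
const result = betweenHands({ x: 100, y: 200 }, { x: 300, y: 200 });
console.log(result.midX, result.midY, result.distance); // 200 200 200
```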