Everything you need to know to create an interactive augmented reality experience.
Use segmentation in Spark AR Studio to create an AR effect that separates someone from their background. You can then transform the environment behind them using textures and objects.
Segmentation can only identify people from the chest upwards, in the immediate foreground.
Segmentation only works on newer devices. iOS devices must be iPhone 6S or later. Android devices must be Samsung Galaxy S6, Sony Xperia Z2 or equivalent, or later.
You'll need to add 2 rectangles to your scene. One will be used to render the user in the foreground of your scene, and the other will create the background.
First, add a canvas:
The canvas is always the same size as the device that's using the effect. You'll insert the 2 rectangles as children of the canvas, so they'll always be the right size for the device. To do this:
Repeat these steps, so you have two rectangles in your scene. It's worth renaming the rectangles, to help keep track of your project. To rename an object in Spark AR Studio, just right-click on the object and select Rename. Call the first rectangle in the list user, and the second background.
Your Scene Panel should look like this:
Next, change the size of the rectangles, so they're the same size as the canvas:
Your project should look like this:
The background rectangle needs to be added to another layer, so it renders after the foreground rectangle. To do this:
In the Layers tab, you can adjust the layer order to change the order they render in. The new layer will be set to render last by default, so you won't need to make any changes here.
You'll apply a segmentation texture to the material on the user rectangle. You can add a color or your own textures to the material on the background rectangle.
To create the materials:
Repeat these steps for the background rectangle. This time, rename the material background_material.
The segmentation texture separates what the camera can see in the foreground, from the background. To create it:
Extracting the camera texture allows you to use the video captured by the camera as the effect plays, and use it as a texture in the effect. For segmentation effects, you'll add this texture to the user material, to render the user in the scene. To create it:
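Under the hood, this setup behaves like an alpha composite: the segmentation texture acts as a mask that keeps the camera texture where a person is detected and reveals the background material everywhere else. The sketch below is a conceptual illustration in plain Python, not Spark AR code; the pixel values and the `composite` helper are illustrative assumptions, using grayscale values in [0, 1].

```python
# Conceptual sketch (not Spark AR code): how a segmentation mask
# blends the camera texture over a replacement background.
# mask = 1.0 where a person is detected, 0.0 elsewhere.

def composite(camera, background, mask):
    """Per-pixel blend: mask=1 keeps the camera (the person),
    mask=0 shows the background material instead."""
    return [m * c + (1 - m) * b for c, b, m in zip(camera, background, mask)]

camera     = [0.9, 0.8, 0.7, 0.6]   # what the camera sees
background = [0.1, 0.1, 0.1, 0.1]   # your background texture or color
mask       = [1.0, 1.0, 0.0, 0.0]   # segmentation: person on the left

print(composite(camera, background, mask))  # [0.9, 0.8, 0.1, 0.1]
```

The left pixels (mask = 1.0) keep the camera values, while the right pixels fall back to the background, which is why the user material needs both the camera texture and the segmentation texture to render the user cleanly over the new scene.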
Select the user material in the assets panel. In the Inspector:
You should see the user in your scene:
Select background_material. You can make any edits you want to this material - for example, import your own texture or change the color. Below, we've edited the color under Diffuse:
When the segmentation texture is selected in the Assets panel, you can change how it works in your effect in the Inspector.
Segmentation creates an outline around the person identified in the scene. Use Edge Softness to blur or soften this outline, adjusting how the edge of the person appears in the scene.
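Edge Softness behaves much like blurring the segmentation mask: instead of the mask cutting from 1 to 0 at the person's outline, the values ramp down gradually, so the person fades into the background. This is a conceptual illustration in plain Python, not Spark AR's actual implementation; the `soften` helper and sample values are assumptions for demonstration.

```python
# Conceptual sketch (not Spark AR code): softening an edge by
# blurring the segmentation mask with a simple 1D box blur.

def soften(mask, radius=1):
    """Average each mask value with its neighbours, turning a hard
    1-to-0 step into a gradual ramp."""
    out = []
    for i in range(len(mask)):
        window = mask[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

hard_edge = [1.0, 1.0, 1.0, 0.0, 0.0, 0.0]  # sharp outline
print(soften(hard_edge))  # middle values now sit between 0 and 1
```

A larger radius widens the ramp, which is the same visual effect as increasing Edge Softness: the outline becomes a gradient rather than a hard cut.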
We've found that effects with both segmentation and the face tracker don't perform well. It's best to avoid using these capabilities in the same effect.
If there are no 3D objects in your segmentation effect, you can remove the ambient and directional light that are included in Spark AR Studio projects by default.