The Device is an object that's automatically listed in the Scene panel in every Spark AR Studio project. It represents the device displaying the effect, which is why it can't be removed from a project.
When the Device is selected in the Scene panel, you'll see its properties in the Inspector.
Size
This shows the screen size of the device playing your effect. It can't be edited.
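If your effect also uses scripting, you can read the size of the effect's preview at runtime. This is a minimal sketch, assuming the CameraInfo module's previewSize property exposes x and y signals (older versions of the scripting API expose width and height instead):

```
// Minimal sketch: watching the preview size from a script.
// Assumes CameraInfo.previewSize exposes x/y ScalarSignals; older API
// versions name these width/height instead.
const CameraInfo = require('CameraInfo');
const Diagnostics = require('Diagnostics');

Diagnostics.watch('Preview width', CameraInfo.previewSize.x);
Diagnostics.watch('Preview height', CameraInfo.previewSize.y);
```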
Instructions
Custom Instruction
Click the arrow to the left of Custom Instruction to create a patch that can be used to add custom instructions to your effect.
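If you're scripting your effect instead of, or alongside, using patches, the Instruction scripting module can show instructions too. A minimal sketch, assuming a custom instruction whose token is 'tap_to_change_color' has been added under the project's Instructions capability (the token name here is just a placeholder):

```
// Minimal sketch: showing a custom instruction from a script rather than a patch.
// The token string must match an instruction defined under the project's
// Instructions capability; 'tap_to_change_color' is a placeholder name.
const Instruction = require('Instruction');
const Reactive = require('Reactive');

// The instruction is shown while the bound BoolSignal is true.
Instruction.bind(Reactive.val(true), 'tap_to_change_color');
```

Because bind takes a BoolSignal, you can toggle the instruction on and off reactively, for example in response to a tap.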
Render Output
Render Pass
Click the arrow to the left of Render Pass to create a patch representing this property. This will enable you to use Render Pass patches in the Patch Editor to bypass Spark AR Studio’s default render pipeline.
Default Pipeline
Creates a series of patches that replicates the default render pipeline. You can then use other render pass patches to customize it.
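The replica pipeline is built entirely from patches, so there's nothing to script here, but it can help to think of the passes as functions composed from the camera image to the screen. The sketch below is purely a conceptual model; the names are illustrative, not Spark AR APIs:

```
// Conceptual model only - not Spark AR code. The default pipeline is
// essentially: render the scene over the camera image, then send the
// result to the device screen.
type Texture = { label: string };

const cameraTexture: Texture = { label: 'camera' };

// Stand-in for the Scene Render Pass patch: draws the scene's objects
// on top of a background texture.
function sceneRenderPass(background: Texture): Texture {
  return { label: `scene over ${background.label}` };
}

// Stand-in for the Screen Output patch: whatever texture reaches it is
// what the device displays.
function screenOutput(result: Texture): void {
  console.log(`displaying: ${result.label}`);
}

// Default pipeline: camera -> scene render -> screen.
// Custom render pass patches insert extra steps into this chain.
screenOutput(sceneRenderPass(cameraTexture));
```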
Anti-Aliasing
Creates a replica of the default render pipeline plus an imageBasedAntiAliasingShader patch. This patch reduces the appearance of aliasing artifacts by smoothing jagged edges in the image displayed on the device screen.
If you already have a default or custom render pipeline set up, this option will add the imageBasedAntiAliasingShader patch to your existing pipeline, plus a shader render pass patch if necessary.
To apply anti-aliasing:
Editing the inputs
The imageBasedAntiAliasingShader patch's inputs are set to ideal anti-aliasing values by default. However, you can edit these values to alter the intensity of the anti-aliasing effect, making it more or less visible.
EdgeSharpness: Controls the intensity of the anti-aliasing effect. Increasing the value increases the smoothness of the transition between an edge and its neighboring pixels.
EdgeThreshold: Controls which edges are detected and therefore have the anti-aliasing effect applied. Higher values detect only very clear edges, with high contrast between the edge and its neighboring pixels. Lower values also detect less visible edges with lower contrast. If your effect is very dark or blurry, with less visible edges, try entering a lower value than the default. The sketch below shows how the two inputs interact.
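To make the two inputs concrete, here is an illustrative sketch of the kind of per-pixel logic a threshold-based anti-aliasing filter applies. It's a simplified model, not the actual shader inside the patch:

```
// Illustrative per-pixel logic for threshold-based anti-aliasing.
// Simplified model - not the shader used by imageBasedAntiAliasingShader.
function antiAlias(
  pixel: number,          // luminance of the current pixel (0..1)
  neighbors: number[],    // luminance of the surrounding pixels
  edgeThreshold: number,  // minimum contrast that counts as an edge
  edgeSharpness: number   // 0..1, how strongly detected edges are smoothed
): number {
  const average = neighbors.reduce((sum, v) => sum + v, 0) / neighbors.length;
  const contrast = Math.abs(pixel - average);

  // Low-contrast areas are left untouched: only pixels whose contrast with
  // their neighborhood exceeds EdgeThreshold are treated as edges.
  if (contrast < edgeThreshold) {
    return pixel;
  }

  // Detected edges are blended towards the neighborhood average;
  // EdgeSharpness controls how far the blend goes.
  return pixel + (average - pixel) * edgeSharpness;
}

// Example: a bright pixel next to darker neighbors gets softened.
console.log(antiAlias(0.9, [0.2, 0.25, 0.3, 0.2], 0.1, 0.5)); // ≈ 0.57
```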
Face Retouching
Creates the series of patches you need to add face retouching to projects made with Render Pass patches. The render pipeline will look like this:
If another render pass patch is already connected to the Screen Output patch, you’ll need to disconnect it and connect the Face Retouching patch instead.
Face Distortion
Creates the series of patches you need to add face distortion to projects made with Render Pass patches.
Before the pipeline can be created, you'll be prompted to add a 3D object. This is because you need a 3D object called a blendshape to change the shape of the face.
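As a rough mental model of what the blendshape does: each vertex of the face mesh is moved towards a stored target shape by a weight. The sketch below is illustrative only, not the Face Distortion patches themselves:

```
// Illustrative model of blendshape deformation - not Spark AR code.
// deformed vertex = base vertex + weight * (target vertex - base vertex),
// so a weight of 0 leaves the face unchanged and 1 fully applies the distortion.
type Vec3 = [number, number, number];

function applyBlendshape(base: Vec3[], target: Vec3[], weight: number): Vec3[] {
  return base.map((v, i): Vec3 => [
    v[0] + weight * (target[i][0] - v[0]),
    v[1] + weight * (target[i][1] - v[1]),
    v[2] + weight * (target[i][2] - v[2]),
  ]);
}

// Example: half-way between the neutral face and the distorted target.
const neutral: Vec3[] = [[0, 0, 0], [1, 0, 0]];
const distorted: Vec3[] = [[0, 0.2, 0], [1.3, 0, 0]];
console.log(applyBlendshape(neutral, distorted, 0.5));
// ≈ [[0, 0.1, 0], [1.15, 0, 0]]
```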
The render pipeline will look like this:
Segmentation
Use this option to apply segmentation to your scene immediately. After clicking Create, select Person, Hair or Skin.
You'll see that the user's hair, skin or background is now separated from the rest of the scene and turned blue. For example, below we selected Hair:
The project now contains all the necessary scene objects, assets, patches and layers in the correct order. To change the color of the background, hair or skin:
You can also edit the existing textures, materials, the Segmentation processor patch group and other properties, to create all kinds of different effects. Learn how to build a segmentation effect from the beginning.
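Conceptually, the segmentation texture is a mask: for each pixel it says how strongly that pixel belongs to the selected region, and the pipeline blends between the camera image and a replacement color using that value. An illustrative sketch of the idea, not the actual patch group:

```
// Illustrative per-pixel compositing with a segmentation mask - not Spark AR code.
// mask = 1 means "this pixel is in the segmented region", mask = 0 means it isn't.
type RGB = [number, number, number];

function composite(cameraPixel: RGB, replacement: RGB, mask: number): RGB {
  // Linear blend: show the replacement color where the mask is 1 (e.g. hair),
  // keep the camera color where the mask is 0.
  return [
    cameraPixel[0] + (replacement[0] - cameraPixel[0]) * mask,
    cameraPixel[1] + (replacement[1] - cameraPixel[1]) * mask,
    cameraPixel[2] + (replacement[2] - cameraPixel[2]) * mask,
  ];
}

// Example: a hair pixel (mask 1) is turned blue.
console.log(composite([0.8, 0.6, 0.4], [0.0, 0.0, 1.0], 1)); // [0, 0, 1]
```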
Color LUT
Click Create to select a color LUT file from your computer and automatically build the render pipeline that applies this LUT. The pipeline will look like this:
You can’t use this method to apply a LUT you’ve already added to your project. If you want to apply a color LUT from the Assets panel, use the method described in this tutorial.
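A color LUT (lookup table) is a precomputed mapping from every input color to a graded output color; applying it means running each rendered pixel through that table. A minimal illustrative sketch of the idea, not the actual patch:

```
// Illustrative idea of a color LUT - not Spark AR code.
// A real LUT is stored as a texture; here it's just a function acting as the table.
type RGB = [number, number, number];

// Toy "LUT": a warming grade that lifts red and lowers blue slightly.
function lookUp(color: RGB): RGB {
  return [Math.min(1, color[0] * 1.1), color[1], color[2] * 0.9];
}

// Applying the LUT means mapping every rendered pixel through the table.
function applyLUT(pixels: RGB[]): RGB[] {
  return pixels.map(lookUp);
}

console.log(applyLUT([[0.5, 0.5, 0.5]])); // ≈ [[0.55, 0.5, 0.45]]
```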
Occlusion
Use this option to render the scene with occlusion. Adding occlusion means that virtual objects in your scene will appear behind real-world objects.
First make sure that any objects you want to occlude are listed under the Device in the Scene panel. In the example below, we added a virtual BirdA 3D object as the child of a plane tracker and added both under the Device:
Once your object is in place:
The following series of patches will be added in the Patch Editor:
In the above pipeline, note that the Camera texture is connected to the Background input in the Scene Render Pass. This means that if no objects are rendered, a live video of what the camera sees will play instead.
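Under the hood, occlusion is a per-pixel depth comparison: wherever the real world (from the camera depth texture) is closer to the camera than the virtual object, the camera image wins. The sketch below is a conceptual model only, not the patches Spark AR Studio creates:

```
// Conceptual per-pixel occlusion test - not Spark AR code.
// realDepth comes from the camera depth texture, virtualDepth from the
// rendered 3D object; smaller values are closer to the camera.
type RGB = [number, number, number];

function occlude(
  cameraPixel: RGB,
  virtualPixel: RGB,
  realDepth: number,
  virtualDepth: number
): RGB {
  // If something in the real world is in front of the virtual object,
  // show the camera image so the object appears to pass behind it.
  return realDepth < virtualDepth ? cameraPixel : virtualPixel;
}

// Example: a hand at 0.4 m hides a virtual bird placed at 0.7 m.
console.log(occlude([0.9, 0.7, 0.6], [0.1, 0.5, 0.9], 0.4, 0.7));
// -> [0.9, 0.7, 0.6] (the camera pixel)
```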
Previewing the scene
The scene with occlusion won’t be visible in the Simulator. To see the result, you’ll need to preview the effect in the Spark AR Player app.
Availability
Occlusion only works on devices supporting camera depth. Learn how to configure camera depth texture availability.