The Device

The Device is an object that's automatically listed in the Scene panel in every Spark AR Studio project. It represents the device showing the effect, which means it can’t be removed from a project.

Select the Device to:

  • Add instructions to your effect, to tell people how to use it.
  • Create a custom render pipeline.

Device properties

When the Device is selected in the Scene panel, you'll see its properties in the Inspector.

Size

This shows the size of the screen of the device playing your effect. It can’t be edited.
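
There's no patch to create for this property, but if you need the dimensions in a script, the CameraInfo module exposes the size of the camera preview in pixels. A minimal sketch, assuming previewSize corresponds to the device output size shown in the Inspector:

  // Watch the preview width and height (in pixels) in the console.
  const CameraInfo = require('CameraInfo');
  const Diagnostics = require('Diagnostics');

  Diagnostics.watch('preview width', CameraInfo.previewSize.x);
  Diagnostics.watch('preview height', CameraInfo.previewSize.y);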

Instructions

Timed

Click Create to add a patch that displays custom instructions in your effect for a set amount of time.

Conditional

Click Create to add a patch that displays custom instructions in your effect when a condition you specify is met.
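
Both patches are configured in the Patch Editor, but instructions can also be shown from a script with the Instruction module. The sketch below is a conditional example; it assumes the project has the Instructions capability enabled, a face tracker in the scene, and that 'find_face' is one of the available instruction tokens.

  // A minimal sketch of a conditional instruction driven from script rather
  // than from the Conditional patch.
  const Instruction = require('Instruction');
  const FaceTracking = require('FaceTracking');

  // Show the instruction only while no face is being tracked; a timed
  // variant could toggle the binding with Time.setTimeout instead.
  Instruction.bind(FaceTracking.count.lt(1), 'find_face');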

Render Output

Render Pass

Click the arrow to the left of Render Pass to create a patch representing this property. This will enable you to use Render Pass patches in the Patch Editor to bypass Spark AR Studio’s default render pipeline.

Default Pipeline

Create a series of patches that replicate the default render pipeline. You can then use other render pass patches to customize it.

Anti-aliasing

Creates the replica default render pipeline plus an imageBasedAntiAliasingShader patch. This patch reduces the appearance of aliasing artefacts by smoothing jagged edges in the image displayed on the device screen.

If you already have a default or custom render pipeline set up, this option will add the imageBasedAntiAliasingShader patch to your existing pipeline plus a shader render pass patch if necessary.

To apply anti-aliasing:

  1. Select Device in the Scene panel and go to the Inspector.
  2. To the right of Anti-Aliasing, click Create.

Anti-Aliasing is highlighted in the Inspector.

Editing the inputs

The ideal anti-aliasing settings are set by default in the imageBasedAntiAliasingShader patch. However, you can edit these values to alter the intensity of the anti-aliasing effect, making it more or less visible.

EdgeSharpness: Controls the intensity of the anti-aliasing effect. Increasing the value increases the smoothness of the transition between an edge and its neighbouring pixels.

EdgeThreshold: Controls which edges are detected and therefore have the anti-aliasing effect applied. Higher values detect only very clear edges with high contrast between edge and neighbouring pixels. Lower values detect less visible edges, with less contrast. If your effect is very dark or blurry with less visible edges, try entering a lower value than the default.
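
These values are normally edited directly on the patch, but they can also be driven from a script through script-to-patch variables. The sketch below assumes you've created script-to-patch variables named edgeSharpness and edgeThreshold on the script asset and connected them to the patch's EdgeSharpness and EdgeThreshold inputs; the numbers are example values, not recommended settings.

  // Push example values into the patch graph. This only takes effect if the
  // script-to-patch variables described above exist and are wired to the
  // imageBasedAntiAliasingShader patch's inputs.
  const Patches = require('Patches');

  Patches.inputs.setScalar('edgeSharpness', 8.0);
  Patches.inputs.setScalar('edgeThreshold', 0.125);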

Retouching

Creates the series of patches you need to add face retouching to projects made with Render Pass patches. The render pipeline will look like this:

Patch graph that applies face retouching.

If another render pass patch is already connected to the Screen Output patch, you’ll need to disconnect it and connect the Face Retouching patch instead.

Face Distortion

Creates the series of patches you need to add Face Distortion to projects made with Render Pass patches.

Before the pipeline can be created you’ll be prompted to add a 3D object. This is because a 3D object called a blendshape is needed to change the shape of the face. Learn more.

The render pipeline will look like this:

Patch graph that applies face distortion.

Segmentation

Use this option to apply segmentation to your scene immediately. After clicking Create, select either Person, Hair or Skin.

You’ll see that the user’s hair, skin or background is now separated from the rest of the scene and turned blue. For example, below we selected Hair:

The project now contains all the necessary scene objects, assets, patches and layers in the correct order. To change the color of the background, hair or skin:

  1. Select the background_mat in the Assets panel.
  2. Change the Color listed under Diffuse in the Inspector.

You can also edit the existing textures, materials, the Segmentation processor patch group and other properties, to create all kinds of different effects. Learn how to build a segmentation effect from the beginning.
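
The same color change can also be made from a script. The sketch below assumes the material keeps its default name, background_mat, and that your version of the scripting API exposes a writable diffuseColorFactor property on materials; if it doesn't, drive the color through a script-to-patch variable instead.

  // A minimal sketch: tinting the segmentation background red from script.
  const Materials = require('Materials');
  const Reactive = require('Reactive');

  (async function () {
    // Find the material created by the Segmentation option (default name assumed).
    const background = await Materials.findFirst('background_mat');

    // diffuseColorFactor is assumed to map to the Color value listed under
    // Diffuse in the Inspector; check the MaterialBase reference for your version.
    background.diffuseColorFactor = Reactive.RGBA(1, 0, 0, 1);
  })();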

Color LUT

Click Create to select a color LUT file from your computer and automatically build the render pipeline that applies this LUT. The pipeline will look like this:

You can’t use this method to apply a LUT you’ve already added to your project. If you want to apply a color LUT from the Assets panel, use the method described in this tutorial.

Occlusion

Use this option to render the scene with occlusion. Adding occlusion means that virtual objects in your scene will appear behind real world objects.

First make sure that any objects you want to occlude are listed under the Device in the Scene panel. In the example below, we added a virtual BirdA 3D object as the child of a plane tracker and added both under the Device:

Once your object is in place:

  1. Select Device in the Scene panel and go to the Inspector.
  2. To the right of Occlusion, click Create.

The following series of patches will be added in the Patch Editor:

In the above pipeline:

  • The Camera Info and Camera Depth Texture patches expose information about the device’s camera and provide the raw depth data that is needed to occlude virtual objects.
  • The NormalizeDepthShader patch and its associated Shader Render Pass convert this depth data into the format expected by the Depth Texture input in the Scene Render Pass patch.
  • The Scene Render Pass uses the reformatted input Depth Texture to realistically occlude all scene objects listed under the Device. This is because the Device patch is connected to the Scene Object input. Any scene objects you want to render with occlusion must be connected to the Scene Object input of the Scene Render Pass patch.

Note that the Camera texture is connected to the Background input in the Scene Render Pass. This means that if no objects are rendered, a live video of what the camera can see will play instead.
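
Conceptually, the occlusion performed by the Scene Render Pass comes down to a per-pixel depth comparison: where the real-world surface seen by the camera is closer than the virtual object, the camera pixel is kept and the virtual one is hidden. The sketch below is not Spark AR API, just an illustration of that test.

  // Illustration only, not Spark AR API: the per-pixel test behind depth-based
  // occlusion. Both depths are distances from the camera for the same pixel.
  function occludePixel(cameraColor, virtualColor, cameraDepth, virtualDepth) {
    // A closer real-world surface hides the virtual object at this pixel.
    return cameraDepth < virtualDepth ? cameraColor : virtualColor;
  }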

Previewing the scene

The scene with occlusion won’t be visible in the Simulator. To see the result, you’ll need to preview the effect in the Spark AR Player app.

Availability

Occlusion only works on devices supporting camera depth. Learn how to configure camera depth texture availability.