Scripting API Overview

The following is a description of all scripting modules.

AnimationModule

The AnimationModule class is used to implement object animation in your projects. For more info, see this guide on animation playback controllers.
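As a sketch of the typical flow, the example below animates a scene object's position using a time driver and a sampler. The object name 'plane0' is an assumption for illustration:

```javascript
// Sketch only: assumes the project contains a scene object named 'plane0'.
const Animation = require('Animation');
const Scene = require('Scene');

(async function () {
  const plane = await Scene.root.findFirst('plane0');

  // Drive the animation with time: 1 second per cycle, looping and mirroring.
  const driver = Animation.timeDriver({
    durationMilliseconds: 1000,
    loopCount: Infinity,
    mirror: true,
  });

  // Sample eased values between 0 and 0.1 scene units.
  const sampler = Animation.samplers.easeInOutSine(0, 0.1);

  // animate() combines the driver and sampler into a ScalarSignal.
  plane.transform.y = Animation.animate(driver, sampler);
  driver.start();
})();
```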

Classes

AnimationClip: The AnimationClip class gives you access to the name, duration and identifier of each animation clip.

AnimationClips: The AnimationClips class allows access to animation clips.

AnimationPlaybackController: The AnimationPlaybackController class describes an animation playback controller asset.

AnimationPlaybackControllers: The AnimationPlaybackControllers class allows access to animation playback controllers.

ArrayOfScalarSamplers: The ArrayOfScalarSamplers class describes an array of scalar samplers. It extends the implementation of the Array<ScalarSampler> type in JavaScript, and adds a single additional method, get(), for retrieving the sampler at a particular index.

ArrayOfScalarSignals: The ArrayOfScalarSignals class describes an array of scalar signals. It extends the implementation of the Array<ScalarSignal> type in JavaScript, and adds a single additional method, get(), for retrieving the signal at a particular index.

ColorSampler: The ColorSampler class encapsulates a color sampler.

Driver: The Driver class represents an animation driver: a class that can drive an animation using a sampler and time, values or other means. All animation drivers extend this base class, and it is used to represent any animation driver in all APIs.

RotationSampler: The RotationSampler class is an animation sampler for object rotation. It does not expose its own methods or properties, but inherits from the Animation module.

SamplerFactory: The SamplerFactory class creates different types of animation samplers.

ScalarSampler: The ScalarSampler class encapsulates a scalar value sampler.

TimeDriver: The TimeDriver class allows driving an animation sampler using time.

ValueDriver: The ValueDriver class allows driving an animation sampler using raw values.

AudioModule

The AudioModule class enables sound effects.

Classes

PlaybackController: Enables control of audio playback. Ensure that the Audio Playback Controller in the Spark AR Studio project has a supported audio file assigned.

BlocksModule

The BlocksModule class provides methods for interacting with the Blocks in your effect.

Classes

BlockAsset: The BlockAsset class represents a single Block asset.

BlockAssets: This class allows access to the Block assets in the project.

BlockInstanceInputs: Represents an object encapsulating all inputs for a Block.

BlockInstanceOutputs: Represents an object encapsulating all outputs for a Block.

BlockModulesConfig: The BlockModulesConfig class contains key-value pairs used to define the behavior of certain features and JavaScript modules inside a dynamically instantiated Block. All properties below are optional, and you may specify them to determine how these JS modules and features behave on dynamically instantiated Blocks.

BodyTrackingModule

The BodyTrackingModule class allows you to track the body and access details of the tracked body.

Classes

Body: The Body class exposes details of a tracked body.

Body2DArm: Represents a tracked arm in the body.

Body2DLeg: Represents a tracked leg in the body.

Body2DPose: The Body2DPose class exposes details of a tracked body's 2D key points.

Body2DTorso: Represents a tracked torso in the body.

KeyPoint2D: Represents a single tracked 2D key point in the scene.

CameraInfoModule

The CameraInfoModule class provides access to details about the device camera.

Enums

CameraPosition: The CameraPosition enum describes the direction the camera is facing.

DeepLinkModule

The DeepLinkModule class exposes methods and properties to read the values that an external app sent to an effect.

DeviceMotionModule

The DeviceMotionModule class enables device movement detection.

DiagnosticsModule

The DiagnosticsModule class enables diagnostic logging.

Classes

TypeSystemMetadata: The TypeSystemMetadata class contains type system metadata.

FaceGesturesModule

Enables detection of various facial gestures for a given Face object.
Use of the FaceGestures module also requires the FaceTracking module to be imported.

FaceTracking2DModule

Enables the tracking of faces in two-dimensional space.
Importing this module automatically enables the Face Tracking capability within the project's *Properties*.

For three-dimensional face tracking, see the FaceTracking module.

Classes

Face2D: Exposes details of a two-dimensionally tracked face. For three-dimensional tracking of key points on a detected face, see the Face class instead.

FaceTrackingModule

Enables the tracking of faces in three-dimensional space and exposes classes that describe key points of a detected face.
Importing this module automatically enables the Face Tracking capability within the project's *Properties*.

For two-dimensional face tracking, see the FaceTracking2D module.
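As a minimal sketch, face key points and expressions are exposed as signals on a Face object; here the mouth openness of the first tracked face is watched in the console:

```javascript
// Minimal sketch: watches the first tracked face's mouth openness.
const FaceTracking = require('FaceTracking');
const Diagnostics = require('Diagnostics');

const face = FaceTracking.face(0);

// mouth.openness is a ScalarSignal ranging from 0 (closed) to 1 (open).
Diagnostics.watch('Mouth openness: ', face.mouth.openness);
```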

Classes

Cheek: Exposes key points of the cheek of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system.

Chin: Exposes key points of the chin of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system.

Eye: Exposes details and key points of the eye of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system.

Eyebrow: Exposes key points of the eyebrow of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system.

Face: Exposes details and key points of a three-dimensionally tracked face.

Forehead: Exposes key points of the forehead of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system.

Mouth: Exposes details and key points of the mouth of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system.

Nose: Exposes key points of the nose of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system.

FontsModule

The FontsModule class is used for working with custom fonts in effects.

Classes

FontId: The FontId class identifies a font in an effect.

HandTrackingModule

Enables the tracking of hands. Up to two hands can be tracked in the camera view.
References to detected Hand objects are not persistent. The same index argument passed to HandTrackingModule.hand() may refer to different Hand objects if a hand that was previously tracked has lost tracking.

Importing this module automatically enables the Hand Tracking capability within the project's *Properties*.

Classes

Hand: Exposes details of a tracked hand.

HapticFeedbackModule

The HapticFeedback class allows you to trigger device vibration.

InstructionModule

Allows instructions to be displayed to the user within an effect.

IrisTrackingModule

The IrisTrackingModule class allows you to track the location of people's irises in your effect, to create effects such as eye color replacement.

Classes

Eyeball: The Eyeball class gives you the ability to interact with a tracked eyeball. This class inherits from the IrisTracking module and exposes its own properties.

LayersModule

The LayersModule class provides access to layers. It allows dynamic creation
and destruction of layers, modifying layer properties such as render order,
as well as changing which layer a scene object is in.

Classes

Layer

LightingEstimationModule

The LightingEstimation module encapsulates access to estimations of lighting in the scene.

LiveStreamingModule

The LiveStreamingModule class enables the retrieval of information, such as reactions and comments, from a live stream within the effect.

Classes

LiveStreamingComments: The LiveStreamingComments class provides access to the Facebook Live comments stream. Note that you must remove Instagram as a platform before you can use this class. To disable Instagram as a platform, go to Project > Edit Properties > Experiences and deselect Instagram.

LiveStreamingReactions

Enums

State: The LiveStreamingModule.State enum describes the state of a live stream.

LocaleModule

The LocaleModule class encapsulates access to the locale identifier of the device.

MaterialsModule

The Materials module provides access to the materials in an effect.

Classes

BlendedMaterial: The BlendedMaterial class encapsulates materials blended from multiple textures.

BlendShapeToWarpMapMaterial: The BlendShapeToWarpMapMaterial class is the JS-side representation of the "Face Warp Material" in Spark AR Studio, used to create face warp effects.

ColorPaintMaterial: The ColorPaintMaterial class encapsulates a face-paint material.

ComposedMaterial: The ComposedMaterial class encapsulates patch asset materials.

DefaultMaterial: The DefaultMaterial class encapsulates an image-based material.

MaterialBase: The MaterialBase class exposes properties common to all material types.

MetallicRoughnessPbrMaterial: The MetallicRoughnessPbrMaterial class encapsulates physically based materials.

RetouchingMaterial: The RetouchingMaterial class encapsulates parameters which define the extent of certain beautification techniques.

TextureTransform: The TextureTransform class encapsulates scaling and translation transforms along a texture's UV axes.

Enums

BlendMode: The BlendMode enum describes how a material is blended.

CullMode: The CullMode enum describes how a material is culled.

MultipeerModule

Allows an effect running in a video call to communicate with other instances of the same effect within the video call.
Multipeer communication is based around the broadcasting and receiving of JSON formatted messages on message channels.


Messages are broadcast to all peers except the instance that the message was broadcast from - an effect can’t broadcast a message to itself.


You can use the multipeer debugging tool to simulate message streams and debug multipeer effects.
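A short sketch of the channel flow follows; the channel name 'ScoreChannel' and the message fields are illustrative assumptions:

```javascript
// Illustrative sketch: the channel name and message fields are assumptions.
const Multipeer = require('Multipeer');
const Diagnostics = require('Diagnostics');

// Message channels are created on demand by name.
const channel = Multipeer.getMessageChannel('ScoreChannel');

// Receive JSON messages broadcast by other peers.
channel.onMessage.subscribe((message) => {
  Diagnostics.log('Peer score: ' + message.score);
});

// Broadcast a JSON message to all other peers (never to this instance).
channel.sendMessage({score: 42}).catch((error) => {
  Diagnostics.log('Send failed: ' + error);
});
```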

Classes

BinaryMessageChannel

MessageChannel: Represents a named bidirectional communication channel. Message channels are created on demand and persist for the duration of the effect's lifetime. Channels are available to all participants active within the same instance of an effect.

NativeUIModule

Exposes the ability to edit a device's native UI elements including editable text, pickers and sliders.
Each of these requires the relevant capability to be enabled within the project's *Properties*.

Use of text in your effect, including editable text, is subject to approval policies. See the Editable Text article for more information.

Classes

Picker: Describes an object which controls the behavior of the NativeUI's picker. Requires the relevant capability to be enabled within the project's *Properties*.

Slider: Describes an object which controls the behavior of the NativeUI's slider. Requires the relevant capability to be enabled within the project's *Properties*.

Enums

SliderType: The SliderType enum describes the Native UI slider types.

ParticipantsModule

Exposes the ability to retrieve the participants in a video call effect. Each user on the call is considered a participant, including the host.
When a new user joins the call a new Participant object is added to the array returned by getAllOtherParticipants() and otherParticipantCount is increased.


Participants are not removed from the array if they leave while the call is still active. This allows individual participants to retain the same unique ID that can be referenced for the duration of the video call, even after a dropout. Similarly, otherParticipantCount is not decreased when a participant leaves. However, if the video call ends then the participant array and count are both reset.


Importing this module automatically enables the Participants capability within the project's Properties.

Classes

Participant: Exposes details of an individual participant in a video call effect.

PatchesModule

The PatchesModule class allows interop between JS scripting and the Spark AR Studio visual patch scripting system.
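A short sketch of the interop in both directions; the input and output names below are assumptions that would need to match patches defined in the Patch Editor:

```javascript
// Sketch only: 'scriptToPatch' and 'patchToScript' are assumed patch names.
const Patches = require('Patches');
const Diagnostics = require('Diagnostics');

(async function () {
  // Send a scalar value from script to the Patch Editor.
  await Patches.inputs.setScalar('scriptToPatch', 0.5);

  // Read a ScalarSignal produced by the Patch Editor.
  const fromPatch = await Patches.outputs.getScalar('patchToScript');
  Diagnostics.watch('From patch: ', fromPatch);
})();
```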

Classes

PatchesInputs: The PatchesInputs class encapsulates methods for setting inputs to the Patch Editor.

PatchesOutputs: The PatchesOutputs class encapsulates methods for getting outputs of the Patch Editor.

PersistenceModule

The Persistence class encapsulates persistent objects.
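A minimal sketch of storing and retrieving data in the user scope; note that the key used must be whitelisted in the project's Persistence capability, and 'userData' here is an assumption:

```javascript
// Sketch only: assumes 'userData' is whitelisted in the Persistence capability.
const Persistence = require('Persistence');
const Diagnostics = require('Diagnostics');

(async function () {
  const userScope = Persistence.userScope;

  // Store a JSON-serializable object across sessions.
  await userScope.set('userData', {launches: 1});

  // Retrieve it on a later run of the effect.
  const stored = await userScope.get('userData');
  Diagnostics.log('Launches: ' + stored.launches);
})();
```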

Classes

BlockStorage: This class represents a storage location for a Block's persistent data. Note that it gives no access to the actual data stored within.

StorageLocation: The StorageLocation class encapsulates different methods of storage for persistent objects.

RandomModule

The RandomModule class enables random number generation.

ReactiveModule

Exposes functionality for performing mathematical and logical operations with signals.
As the Spark AR API uses a reactive model to propagate data and values, regular JavaScript mathematical operations are not valid. Instead, the methods exposed by the Reactive module should be used; for example, Reactive.add(scalarSignalX, scalarSignalY) adds two ScalarSignals together.


The module also exposes classes that describe signal equivalents for basic data types such as booleans and strings, as well as signal types specific to Spark AR.
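As a minimal sketch, reactive arithmetic replaces plain JavaScript operators; Diagnostics.watch is used here to display the resulting signal:

```javascript
// Minimal sketch: reactive arithmetic instead of plain JavaScript operators.
const Reactive = require('Reactive');
const Diagnostics = require('Diagnostics');

const x = Reactive.val(2); // ScalarSignal holding 2
const y = Reactive.val(3); // ScalarSignal holding 3

// sum is itself a ScalarSignal; it updates whenever x or y changes.
const sum = Reactive.add(x, y);

Diagnostics.watch('sum: ', sum);
```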

Classes

AnimationBlend: The AnimationBlend class combines an array of AnimationBlendInputs to drive animation blending. Applying it to a target ThreeDObject allows playing a blended animation from a collection of blend inputs.

AnimationBlendInput: The AnimationBlendInput class describes an input used for animation blending. Each input is composed of an AnimationClip, a progress ScalarSignal and a weight ScalarSignal.
'clip' - the source animation clip from which to sample animation data.
'progress' - the time driver controlling at which point in time to sample the above clip. Ranges from 0 to 1.
'weight' - the amount this input affects the final blend. Weights for all blend inputs should ideally add up to 1.

AnimationClipSignal: The AnimationClipSignal class describes the MultiChannelSampler loaded from an animation in a .glb asset file. Transmitting this signal through Blocks I/O allows playing animations from externally downloaded Blocks.

AudioSignal

BoolSignal: Monitors a boolean value and exposes functionality for performing logical operations with a given signal.

BoolSignalSource: Represents a source used to get and set the value of a BoolSignal. Typically, changing the value that a signal contains requires a total reassignment:

TouchGestures.onTap().subscribe((gesture) => {
  someSignal = Reactive.val(true);
});

In the example above, someSignal is bound to a completely new signal which itself contains the desired value. The BoolSignalSource API provides the ability to change the value of the original signal without reassignment, with behavior similar to that of non-reactive programming models.

Box2D: The Box2D class describes a 2D bounding box value.

Box2DSignal: The Box2DSignal class monitors a 2D bounding box value.

Box3D: The Box3D class describes bounds in 3D space.

Box3DSignal: The Box3DSignal class monitors bounds in 3D space.

Color

ColorSignal: The ColorSignal class monitors a color.

EventEmitter

EventSource: The EventSource class provides methods for monitoring signals.

EventSourceHistory: The EventSourceHistory class encapsulates methods for accessing values of an EventSource from previous frames.

HsvaSignal: The HsvaSignal class monitors an HSVA color value.

ISignal: The ISignal interface. The base class for ScalarSignal, PointSignal, Vec3Signal, BoolSignal and StringSignal.

Mat4: The Mat4 class contains a 4D matrix.

Mat4Signal: The Mat4Signal class monitors a scene transform.

Point2D: The Point2D class contains a 2D coordinate.

Point3D: The Point3D class contains a 3D coordinate.

PointSignal: The PointSignal class monitors a 3D coordinate.

PrimitiveOrShaderSignal: The PrimitiveOrShaderSignal class represents a primitive or shader signal.

QuaternionSignal: The QuaternionSignal class monitors a rotation in quaternion representation.

RgbaSignal: The RgbaSignal class monitors an RGBA color value.

Rotation: The Rotation class encapsulates an object's rotation in quaternion representation.

ScalarSignal: Monitors a numerical value and exposes functionality for performing mathematical operations with the given signal.

ScalarSignalSource: Represents a source used to get and set the value of a ScalarSignal. Typically, changing the value that a signal contains requires a total reassignment:

TouchGestures.onTap().subscribe((gesture) => {
  someSignal = Reactive.val(1);
});

In the example above, someSignal is bound to a completely new signal which itself contains the desired value. The ScalarSignalSource API provides the ability to change the value of the original signal without reassignment, with behavior similar to that of non-reactive programming models.

ShaderSignal: The ShaderSignal class represents a shader signal. Scalar and vector signals can be automatically converted to a ShaderSignal.

SignalHistory: The SignalHistory<T> class encapsulates methods for accessing values from previous frames.

StringSignal: Monitors a string value.

StringSignalSource: Represents a source used to get and set the value of a StringSignal. Typically, changing the value that a signal contains requires a total reassignment:

TouchGestures.onTap().subscribe((gesture) => {
  someSignal = Reactive.val("Hello");
});

In the example above, someSignal is bound to a completely new signal which itself contains the desired value. The StringSignalSource API provides the ability to change the value of the original signal without reassignment, with behavior similar to that of non-reactive programming models.

Subscription: The Subscription class implements object value monitoring.

Vec2: The Vec2 class contains a 2D coordinate.

Vec2Signal: The Vec2Signal class monitors a 2D coordinate.

Vec3: The Vec3 class contains a 3D coordinate.

Vec3Signal: The Vec3Signal class monitors a 3D vector.

Vec4: The Vec4 class contains a 4D coordinate.

Vec4Signal: The Vec4Signal class monitors a 4D coordinate.

Enums

AntiderivativeOverflowBehaviour: The AntiderivativeOverflowBehaviour enum describes the recovery technique used when an antiderivative overflows.

SceneModule

The SceneModule class exposes properties and methods to access the objects in a scene.
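A brief sketch of looking up and modifying a scene object; the object name 'plane0' is an assumption:

```javascript
// Sketch only: assumes the scene contains an object named 'plane0'.
const Scene = require('Scene');

(async function () {
  // findFirst() returns a Promise resolving to the first matching object.
  const plane = await Scene.root.findFirst('plane0');

  // Transform properties accept signals or plain numbers.
  plane.transform.x = 0.05;
  plane.hidden = false;
})();
```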

Classes

AmbientLightSource: The AmbientLightSource class describes an ambient lighting source.

BlendShape: The BlendShape class describes a shape attached to a mesh or face mesh which can be used to change the shape of that mesh.

BlockSceneRoot: The BlockSceneRoot class describes the root scene object of a block.

BlockSceneRootInputs: The BlockSceneRootInputs class encapsulates methods for setting inputs to the block instance.

BlockSceneRootOutputs: The BlockSceneRootOutputs class encapsulates methods for getting outputs of the block instance.

Camera: The Camera class exposes details about the device camera focal area.

CameraVisibility: The CameraVisibility class describes whether or not an object is visible from various camera views.

Canvas: The Canvas class describes a scene canvas.

DirectionalLightSource: The DirectionalLightSource class describes a directional light source.

DynamicExtrusion: The DynamicExtrusion class provides functionality for creating extruded 3D objects using a brush.

EnvironmentLightSource: The EnvironmentLightSource class describes an environment lighting source.

FaceMesh: The FaceMesh class describes a face mesh.

FaceTracker: The FaceTracker class propagates details of detected faces to the scene.

FocalDistance: The FocalDistance class describes a focal distance in the scene. Children of FocalDistance are automatically positioned in 3D space based on the distance between the camera image and the camera scene object itself.

FocalPlane: The FocalPlane class exposes details about the focal plane of the device camera.

HandTracker: This class represents the hand tracker scene object, used to track the position of hands in the scene.

Joint: The Joint class describes a joint of a given Skeleton object.

Mesh: The Mesh class describes a scene mesh.

MeshSurface: The MeshSurface class describes a surface in a mesh.

OutputVisibility: The OutputVisibility class describes whether or not an object is visible from various outputs.

ParticleSystem: The ParticleSystem class implements the particle management system for the scene.

ParticleTypeDescription: The ParticleTypeDescription class provides functionality for setting particle sprite densities in the scene.

PlanarImage: The PlanarImage class describes an image rendered on a plane.

PlanarObject: The PlanarObject class describes an object on a plane.

PlanarStack: The PlanarStack class describes a stack of 2D scene elements.

PlanarText: The PlanarText class describes text on a plane.

Plane: The Plane class describes a plane.

PlaneTracker: The PlaneTracker class provides functionality for locating a 3D plane based on 2D screen coordinates. When accessing the realScaleActive and realScaleSupported properties, ensure that Real World Scale is enabled in the PlaneTracker object's Inspector panel. See the Real Scale for World article for more information.

PointLightSource: The PointLightSource class describes a point light source.

Scene: The Scene class implements properties and methods to access the objects in a scene.

SceneObject: The SceneObject class describes an object in a scene.

SceneObjectBase: The base class for scene objects.

Skeleton: The Skeleton class describes a skeleton scene object. All Joint scene object children of a given skeleton are considered part of this skeleton's hierarchy.

Speaker: The Speaker class encapsulates a speaker for a scene. This class was previously named AudioSource.

SpotLightSource: The SpotLightSource class describes a spot light source.

SvgImage: The SvgImage class describes an SVG asset for a scene.

TargetTracker: The TargetTracker class encapsulates a tracker for some target.

TextAlignmentWrapper: The TextAlignmentWrapper class contains text alignment details.

TextExtrusion: The TextExtrusion class describes a 3D text scene object.

ThreeDObject: The ThreeDObject class describes a 3D object in a scene.

Transform: The Transform class describes an object transform for a scene.

WorldTransform: The WorldTransform class describes an object transform for a SceneObject in world space.

Enums

Direction: The Direction enum describes the stack layout's direction.

BrushType: The BrushType enum describes what kind of brush is used for dynamic extrusion.

HorizontalAlignment: The HorizontalAlignment enum describes how an element is aligned horizontally.

RenderMode

ScalingOption: The ScalingOption enum describes how an element is scaled.

StackAlign: The StackAlign enum describes the stack children's alignment.

StackDistribute: The StackDistribute enum describes the stack children's distribution.

TextAlignment: The TextAlignment enum describes how a text element is aligned horizontally.

TrackingMode: The TrackingMode enum describes how a PlaneTracker is tracking an object.

VerticalAlignment: The VerticalAlignment enum describes how an element is aligned vertically.

VerticalTextAlignment: The VerticalTextAlignment enum describes how a text element is aligned vertically.

SegmentationModule

The SegmentationModule class enables the separation of a person, hair or skin from a scene.

Classes

HairSegmentation: The HairSegmentation class exposes the information about a person's hair.

PersonSegmentation: The PersonSegmentation class exposes the information about a person.

SkinSegmentation: The SkinSegmentation class exposes the information about a person's skin.

ShadersModule

The ShadersModule exposes APIs to create Visual Shaders using JavaScript.
The following is an explanation of the unique types and concepts specific to the ShadersModule.

PrimitiveOrShaderSignal is a union type of Vec2Signal, PointSignal, Vec4Signal, Vec3Signal, Mat4Signal, or ShaderSignal.
ShaderSignal is a graphics shader output that produces one of the types defined in the above union. As ShaderSignal is GPU bound, it can only be used in a GPU context.

ShaderSignal can also be of a function type, used for mapping from one type to another.
For example, a shader with the signature function(Vec2Signal): Vec4Signal is a function that maps a Vec2Signal to a Vec4Signal.

Enums

BlendedMaterialTextures: The BlendedMaterialTextures enum describes the different texture slots for a flat material.

BlendMode: The BlendMode enum describes the blending mode.

BuiltinUniform: The BuiltinUniform enum describes the built-in shader uniforms.

ColorSpace: The ColorSpace enum describes the color space.

DefaultMaterialTextures: The DefaultMaterialTextures enum describes the different texture slots for a default material.

DerivativeType: The DerivativeType enum describes the shader derivative type.

FacePaintMaterialTextures: The FacePaintMaterialTextures enum describes the different texture slots for a face paint material.

GradientType: The GradientType enum describes the type of the shader gradient.

PhysicallyBasedMaterialTextures: The PhysicallyBasedMaterialTextures enum describes the different texture slots for a physically based material.

SdfVariant: The SdfVariant enum describes the SDF variant.

VertexAttribute: The VertexAttribute enum describes the built-in vertex attributes.

SvgsModule

The SvgsModule class enables working with SVGs.

Classes

Svg: The Svg class describes an SVG in an effect.

TexturesModule

The TexturesModule class enables images, animation sequences, videos, colors, and other visual artifacts to be combined to form materials.

Classes

CameraTexture: The CameraTexture class represents a texture type that contains image data coming in from the system camera, or a captured photo/video when the effect is used with the Media Library.

CanvasTexture: The CanvasTexture class enables painting with a brush to a texture.

ColorTexture: The ColorTexture class encapsulates a texture that has a color (including an alpha channel).

DeepLinkTexture: The DeepLinkTexture class represents an image texture passed in via the sharing SDK.

ExternalStreamTexture: The ExternalStreamTexture class represents an image texture passed in via an external stream provider.

ExternalTexture: The ExternalTexture class encapsulates a visual asset that is downloaded over the network.

ExternalTextureMediaBase: ExternalTextureMediaBase is a base class for the different types of media that can be created as ExternalTextureMedia.

ExternalTextureMediaImage: ExternalTextureMediaImage represents image media that was created from a URL.

ExternalTextureMediaVideo: ExternalTextureMediaVideo represents video media that was created from a URL. It exposes a set of APIs that are specifically tailored for controlling video playback.

GalleryTexture: The GalleryTexture class encapsulates a texture that was picked from the gallery.

GalleryTextureMediaBase: GalleryTextureMediaBase is a base class for the different types of media that can be selected from the gallery and used in a gallery texture.

GalleryTextureMediaImage: GalleryTextureMediaImage represents image media picked from the gallery that is being used by a given GalleryTexture.

GalleryTextureMediaVideo: GalleryTextureMediaVideo represents video media picked from the gallery that is being used by a given GalleryTexture. It exposes a set of APIs that are specifically tailored for controlling video playback.

ImageTexture: The ImageTexture class encapsulates an image that may be used to form materials for rendering in the scene.

PeerVideoStreamTexture: The PeerVideoStreamTexture class represents an image texture passed in via an external stream provider.

PlatformTextureMediaBase: PlatformTextureMediaBase is a base class for the different types of media that can be used as platform textures.

PlatformTextureMediaImage: PlatformTextureMediaImage represents texture media that is being used by a given PlatformTexture.

PlatformTextureMediaVideo: PlatformTextureMediaVideo represents video media picked from the platform that is being used by a given PlatformTexture. It exposes a set of APIs that are specifically tailored for controlling video playback.

SceneDepthTexture: Describes a texture which provides the current scene's estimated depth information. Depth is calculated as the relative distance to the camera. You can follow the steps in this article to add a camera depth texture asset to your project.

SegmentationTexture: The SegmentationTexture class encapsulates a texture that will be used for image segmentation.

SequenceTexture: The SequenceTexture class is a collection of still images that form an animation.

SourceImageRegionTexture: The SourceImageRegionTexture class represents a texture type that contains a texture image extracted from a certain region of an object or a set of objects in the scene. For example, a face-extracted texture that contains the region of a camera image with a face, as extracted using a face tracker.

SubTexture: The SubTexture class exposes details of a texture in UV coordinates.

TextureBase: The TextureBase class describes a texture.

Enums

State: The State enum describes the download state of an ExternalTexture.

MediaType: The MediaType enum describes the media types of a GalleryTexture.

State: The State enum describes the state of a GalleryTexture.

TextureColorEncoding: TextureColorEncoding describes the different color encoding formats used for data in a texture.

TextureFilteringMode: TextureFilteringMode describes the different modes for how a texture should address a size mismatch between the actual image data and its footprint when rendered.

TextureWrapMode: TextureWrapMode describes the different ways a given texture should be sampled when a coordinate falls outside of the 0 to 1 range.

TimeModule

The TimeModule class enables time-based events.

TouchGesturesModule

Enables detection of touch gestures and exposes classes that describe various types of touch interaction.
By default touch gestures will be registered on the entire screen unless an object is specified in the gesture method call, for example: TouchGestures.onTap(plane).

Importing this module automatically enables the Touch Gestures capability within the project's *Properties*. Gesture types must be individually enabled within the capability to enable detection.
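A short sketch of subscribing to taps; the object-bound variant assumes a scene object named 'plane0':

```javascript
// Sketch only: logs taps; 'plane0' is an assumed object name.
const TouchGestures = require('TouchGestures');
const Scene = require('Scene');
const Diagnostics = require('Diagnostics');

// Screen-wide tap detection.
TouchGestures.onTap().subscribe(() => {
  Diagnostics.log('Screen tapped');
});

// Object-bound tap detection.
(async function () {
  const plane = await Scene.root.findFirst('plane0');
  TouchGestures.onTap(plane).subscribe(() => {
    Diagnostics.log('Plane tapped');
  });
})();
```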

Classes

Gesture: Exposes details of a detected gesture, common to all touch gesture types.

LongPressGesture: Exposes details of a detected long press gesture. Ensure Long Press Gesture is enabled under the project's Touch Gestures capability.

PanGesture: Exposes details of a detected pan gesture. Ensure Pan Gesture is enabled under the project's Touch Gestures capability.

PinchGesture: Exposes details of a detected pinch gesture. Ensure Pinch Gesture is enabled under the project's Touch Gestures capability.

RotateGesture: Exposes details of a detected rotation gesture. Ensure Rotate Gesture is enabled under the project's Touch Gestures capability.

TapGesture: Exposes details of a detected tap gesture. Ensure Tap Gesture is enabled under the project's Touch Gestures capability.

Enums

State: The State enum describes the state of a Gesture.

GestureType: The GestureType enum describes the type of a given Gesture.

UnitsModule

The UnitsModule class provides functionality for converting values into world-space units.

WeatherModule

The WeatherModule class provides information about the current weather.

WorldTrackingModule

The WorldTrackingModule class provides functionality to track multiple surfaces in your
environment and interact with them by placing virtual content on the detected surface(s).
This class can be used to author 'World AR' effects that allow users to place and move AR
objects in the scene, as is commonly seen in AR Commerce use-cases in which users can
position couches or tables, for example. Creators are also able to visualize planes detected
in the scene.

Classes

ARPointTrackable: The ARPointTrackable class describes a point trackable, also referred to as an anchor, at a fixed location and orientation in the real world.

ARTrackable: The ARTrackable class describes surface planes and feature points, known collectively as 'trackables', that are detected in the view.

HitTestResult: The HitTestResult class describes a single result of the hitTest() method.

Enums

ARTrackableState: The ARTrackableState enum describes the states that a trackable can be in. Used by ARTrackable.state.

ARTrackableType: The ARTrackableType enum describes the types of trackable that can be detected and tracked. Used by ARTrackable.type.

TrackingState: The TrackingState enum describes the states that the world tracker can be in. Used by WorldTrackingModule.state.

TrackingStateReason: The TrackingStateReason enum describes the possible reasons why the world tracker may be experiencing limited tracking quality. It is a unified definition of TrackingStateReason regardless of the platform, e.g. ARCore, ARKit or first frame (FF). FF does not provide a tracking failure reason and defaults to None.