The following is a description of all scripting modules.
The AnimationModule class is used to implement object animation in your projects. For more info, see this guide on animation playback controllers.
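As a minimal sketch of the Animation module in use, the following drives a scene object's x-position with a looping, mirrored time-based animation. The scene object name 'plane0' is a hypothetical example; this code runs inside a Spark AR effect script.

```javascript
// Sketch: animate a plane's x-position between -0.1 and 0.1, looping forever.
const Scene = require('Scene');
const Animation = require('Animation');

(async function () {
  // 'plane0' is a hypothetical scene object name
  const plane = await Scene.root.findFirst('plane0');

  // A TimeDriver that runs for 1 second, loops forever, and mirrors each pass
  const driver = Animation.timeDriver({
    durationMilliseconds: 1000,
    loopCount: Infinity,
    mirror: true,
  });

  // A ScalarSampler mapping driver progress [0..1] onto the range [-0.1, 0.1]
  const sampler = Animation.samplers.linear(-0.1, 0.1);

  // Bind the animated ScalarSignal to the object's transform and start
  plane.transform.x = Animation.animate(driver, sampler);
  driver.start();
})();
```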
Class | Description |
---|---|
AnimationClip | The AnimationClip class gives you access to the name, duration and identifier of each animation clip. |
AnimationClips | The AnimationClips class allows access to animation clips. |
AnimationPlaybackController | The AnimationPlaybackController class describes an animation playback controller asset. |
AnimationPlaybackControllers | The AnimationPlaybackControllers class allows access to animation playback controllers. |
ArrayOfScalarSamplers | The ArrayOfScalarSamplers class describes an array of scalar samplers. It extends the implementation of the Array<ScalarSampler> type in JavaScript, and adds a single additional method, get(), for retrieving a sampler at a particular index. |
ArrayOfScalarSignals | The ArrayOfScalarSignals class describes an array of scalar signals. It extends the implementation of the Array<ScalarSignal> type in JavaScript, and adds a single additional method, get(), for retrieving a signal at a particular index. |
ColorSampler | The ColorSampler class encapsulates a color sampler. |
Driver | The Driver class represents an animation driver, a class that can drive an animation using a sampler and time, value or other means. All animation drivers extend this base class, and it is used to represent any animation driver in all APIs. |
RotationSampler | The RotationSampler class is an animation sampler for object rotation. It does not expose its own methods or properties, but inherits from the Animation module. |
SamplerFactory | The SamplerFactory class creates different types of animation samplers. |
ScalarSampler | The ScalarSampler class encapsulates a scalar value sampler. |
TimeDriver | The TimeDriver class allows driving an animation sampler using time. |
ValueDriver | The ValueDriver class allows driving an animation sampler using raw values. |
The AudioModule class enables sound effects.
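A minimal sketch of audio playback, assuming a project with an Audio Playback Controller asset named 'audioPlaybackController0' (a hypothetical name) that has a supported audio file assigned:

```javascript
// Sketch: restart and play an audio clip whenever the screen is tapped.
const Audio = require('Audio');
const TouchGestures = require('TouchGestures');

(async function () {
  // 'audioPlaybackController0' is a hypothetical asset name
  const controller = await Audio.getAudioPlaybackController('audioPlaybackController0');

  TouchGestures.onTap().subscribe(() => {
    controller.reset();        // rewind to the beginning
    controller.setPlaying(true); // start playback
  });
})();
```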
Class | Description |
---|---|
PlaybackController | Enables control of audio playback. Ensure that the Audio Playback Controller in the Spark AR Studio project has a supported audio file assigned. |
The BlocksModule class provides methods for interacting with the Blocks in your effect.
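As a sketch of dynamic Block instantiation, assuming a Block asset named 'myBlock' (hypothetical) and that the relevant Blocks capability is enabled in the project:

```javascript
// Sketch: instantiate a Block asset at runtime and add it to the scene.
const Blocks = require('Blocks');
const Scene = require('Scene');

(async function () {
  // 'myBlock' is a hypothetical Block asset name
  const instance = await Blocks.instantiate('myBlock');

  // Dynamically instantiated Blocks must be parented into the scene tree
  await Scene.root.addChild(instance);
})();
```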
Class | Description |
---|---|
BlockAsset | The BlockAsset class represents a single Block asset. |
BlockAssets | This class allows access to the Block Assets. |
BlockInstanceInputs | Represents an object encapsulating all inputs for a Block. |
BlockInstanceOutputs | Represents an object encapsulating all outputs for a Block. |
BlockModulesConfig | The BlockModulesConfig class contains key-value pairs used to define the behavior of certain features and JavaScript modules inside a dynamically instantiated Block. All properties are optional; specify them to determine how these JS modules and features behave in dynamically instantiated Blocks. |
The BodyTrackingModule class allows you to track the body and access Body details.
Class | Description |
---|---|
Body | The Body class exposes details of a tracked body. |
Body2DArm | Represents a tracked arm in the body. |
Body2DLeg | Represents a tracked leg in the body. |
Body2DPose | The Body2DPose class exposes details of a tracked body's 2d key points. |
Body2DTorso | Represents a tracked torso in the body. |
KeyPoint2D | Represents a single tracked 2d key point in the scene. |
The CameraInfoModule class provides access to details about the device camera.
Enum | Description |
---|---|
CameraPosition | The CameraPosition enum describes the direction the camera is facing. |
The DeepLinkModule class exposes methods and properties to read the values that an external app sent to an effect.
The DeviceMotionModule class enables device movement detection.
The DiagnosticsModule class enables diagnostic logging.
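A minimal sketch of diagnostic logging inside an effect script:

```javascript
// Sketch: log a one-off message and continuously watch a signal in the console.
const Diagnostics = require('Diagnostics');
const Time = require('Time');

Diagnostics.log('Effect started');        // one-off log entry
Diagnostics.watch('elapsed ms', Time.ms); // live watcher for a ScalarSignal
```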
Class | Description |
---|---|
TypeSystemMetadata | The TypeSystemMetadata class contains type system metadata. |
Enables detection of various facial gestures for a given Face object. Use of the FaceGestures module also requires the FaceTracking module to be imported.
Enables the tracking of faces in two-dimensional space. Importing this module automatically enables the relevant capability within the project's *Properties*.
For three-dimensional face tracking, see the FaceTracking module.
Enables the tracking of faces in three-dimensional space and exposes classes that describe key points of a detected face. Importing this module automatically enables the relevant capability within the project's *Properties*.
For two-dimensional face tracking, see the FaceTracking2D module.
Class | Description |
---|---|
Cheek | Exposes key points of the cheek of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system. |
Chin | Exposes key points of the chin of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system. |
Eye | Exposes details and key points of the eye of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system. |
Eyebrow | Exposes key points of the eyebrow of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system. |
Face | Exposes details and key points of a three-dimensionally tracked face. |
Forehead | Exposes key points of the forehead of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system. |
Mouth | Exposes details and key points of the mouth of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system. |
Nose | Exposes key points of the nose of a detected Face object. Key points are returned in the detected face's local coordinate system. Use Face.cameraTransform.applyToPoint() to convert the point to the camera's coordinate system. |
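As a sketch of reading face key points, the following watches the nose tip of the first tracked face, converted from the face's local coordinate system into the camera's coordinate system as described above:

```javascript
// Sketch: track the nose tip of face 0 in camera-space coordinates.
const FaceTracking = require('FaceTracking');
const Diagnostics = require('Diagnostics');

const face = FaceTracking.face(0);

// Key points are local to the face; convert to the camera's coordinate system
const noseInCamera = face.cameraTransform.applyToPoint(face.nose.tip);

Diagnostics.watch('nose x', noseInCamera.x);
Diagnostics.watch('nose y', noseInCamera.y);
```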
The FontsModule class is used for working with custom fonts in effects.
Class | Description |
---|---|
FontId | The FontId class identifies a font in an effect. |
Enables the tracking of hands. Up to two hands can be tracked in the camera view.
References to detected Hand objects are not persistent. The same index argument passed to HandTrackingModule.hand() may refer to a different Hand object if a hand that was previously tracked has lost tracking.
Importing this module automatically enables the relevant capability within the project's *Properties*.
Class | Description |
---|---|
Hand | Exposes details of a tracked hand. |
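A minimal sketch of hand tracking, keeping in mind the caveat above that a given index may refer to different Hand objects over time:

```javascript
// Sketch: watch the number of tracked hands and follow the first one.
const HandTracking = require('HandTracking');
const Diagnostics = require('Diagnostics');

// hand(0) may refer to different physical hands as tracking is lost/regained
const hand = HandTracking.hand(0);

Diagnostics.watch('hands tracked', HandTracking.count);
Diagnostics.watch('hand x', hand.cameraTransform.x);
```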
The HapticFeedback class allows triggering the device's vibration.
Allows instructions to be displayed to the user within an effect.
The IrisTrackingModule class allows you to track the location of people's irises in your effect, to create effects such as eye color replacement.
Class | Description |
---|---|
Eyeball | The Eyeball class gives you the ability to interact with a tracked eyeball. It inherits from the IrisTrackingModule and exposes its own properties. |
The LayersModule class provides access to layers. It allows the dynamic creation and destruction of layers, modifying layer properties such as render order, as well as changing which layer a scene object is in.
Class | Description |
---|---|
Layer | The Layer class describes a layer in the scene. |
The LightingEstimation module encapsulates access to estimations of lighting in the scene.
The LiveStreamingModule class enables the retrieval of information from a live stream, such as reactions and comments, from within the effect.
Class | Description |
---|---|
LiveStreamingComments | The LiveStreamingComments class provides access to the Facebook Live comments stream. Note that you must remove Instagram as a platform before you can use this class. To disable Instagram as a platform in your Spark AR Studio project, go to Project > Edit Properties > Experiences and deselect Instagram. |
LiveStreamingReactions | The LiveStreamingReactions class provides access to reactions from a live stream. |
Enum | Description |
---|---|
State | The LiveStreamingModule.State enum describes the state of a live stream. |
The LocaleModule class encapsulates access to the locale identifier of the device.
The Materials module provides access to the materials in an effect.
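A minimal sketch of looking up a material by name and animating one of its properties. The material name 'material0' is a hypothetical example:

```javascript
// Sketch: find a material and fade its opacity from 1 to 0 over 2 seconds.
const Materials = require('Materials');
const Animation = require('Animation');

(async function () {
  // 'material0' is a hypothetical material name
  const material = await Materials.findFirst('material0');

  const driver = Animation.timeDriver({durationMilliseconds: 2000});
  material.opacity = Animation.animate(driver, Animation.samplers.linear(1, 0));
  driver.start();
})();
```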
Class | Description |
---|---|
BlendedMaterial | The BlendedMaterial class encapsulates materials blended from multiple textures. |
BlendShapeToWarpMapMaterial | The BlendShapeToWarpMapMaterial class is the JS-side representation of the "Face Warp Material" in Spark AR Studio, used to create face warp effects. |
ColorPaintMaterial | The ColorPaintMaterial class encapsulates a face-paint material. |
ComposedMaterial | The ComposedMaterial class encapsulates patch asset materials. |
DefaultMaterial | The DefaultMaterial class encapsulates an image-based material. |
MaterialBase | The MaterialBase class exposes properties common to all material types. |
MetallicRoughnessPbrMaterial | The MetallicRoughnessPbrMaterial class encapsulates physically based materials. |
RetouchingMaterial | The RetouchingMaterial class encapsulates parameters which define the extent of certain beautification techniques. |
TextureTransform | The TextureTransform class encapsulates scaling and translation transforms about a texture's UV axes. |
Allows an effect running in a video call to communicate with other instances of the same effect within the video call.
Multipeer communication is based around the broadcasting and receiving of JSON formatted messages on message channels.
Messages are broadcast to all peers except the instance that the message was broadcast from - an effect can’t broadcast a message to itself.
You can use the multipeer debugging tool to simulate message streams and debug multipeer effects.
Class | Description |
---|---|
BinaryMessageChannel | Represents a named bidirectional communication channel for binary messages. |
MessageChannel | Represents a named bidirectional communication channel. Message channels are created on demand and persist for the duration of the effect's lifetime. Channels are available to all participants active within the same instance of an effect. |
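A minimal sketch of message-channel usage, assuming a hypothetical channel name 'scoreChannel'. Note that, per the description above, the sender never receives its own broadcast:

```javascript
// Sketch: broadcast a JSON message to all peers and react to incoming ones.
const Multipeer = require('Multipeer');
const TouchGestures = require('TouchGestures');

// Channels are created on demand and persist for the effect's lifetime
const channel = Multipeer.getMessageChannel('scoreChannel');

channel.onMessage.subscribe((msg) => {
  // Runs on every peer except the sender; msg is the parsed JSON payload
});

TouchGestures.onTap().subscribe(() => {
  channel.sendMessage({event: 'tap'});
});
```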
Exposes the ability to edit a device's native UI elements including editable text, pickers and sliders.
Each of these requires the relevant capability to be enabled within the project's *Properties*.
Use of text in your effect, including editable text, is subject to approval policies. See the Editable Text article for more information.
Enum | Description |
---|---|
SliderType | The SliderType enum describes the Native UI slider types. |
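A sketch of configuring and showing the native UI picker. The texture name 'texture0' is hypothetical, and the Picker capability must be enabled as noted above:

```javascript
// Sketch: show a one-item native picker and react to selection changes.
const NativeUI = require('NativeUI');
const Textures = require('Textures');

(async function () {
  // 'texture0' is a hypothetical texture name used as the picker icon
  const tex = await Textures.findFirst('texture0');

  const picker = NativeUI.picker;
  picker.configure({
    selectedIndex: 0,
    items: [{image_texture: tex}],
  });
  picker.visible = true;

  picker.selectedIndex.monitor().subscribe((event) => {
    // event.newValue holds the newly selected index
  });
})();
```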
Exposes the ability to retrieve the participants in a video call effect. Each user on the call is considered a participant, including the host.
When a new user joins the call, a new Participant object is added to the array returned by getAllOtherParticipants() and otherParticipantCount is increased.
Participants are not removed from the array if they leave while the call is still active. This allows individual participants to retain the same unique ID, which can be referenced for the duration of the video call, even after a dropout. Similarly, otherParticipantCount is not decreased when a participant leaves. However, if the video call ends then the participant array and count are both reset.
Importing this module automatically enables the Participants capability within the project's Properties.
Class | Description |
---|---|
Participant | Exposes details of an individual participant in a video call effect. |
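A minimal sketch of reading call participants, in line with the behavior described above (the array only ever grows during a call):

```javascript
// Sketch: log the initial participant count and watch it as users join.
const Participants = require('Participants');
const Diagnostics = require('Diagnostics');

(async function () {
  const others = await Participants.getAllOtherParticipants();
  Diagnostics.log('others at start: ' + others.length);

  // Increases as users join; never decreases while the call is active
  Diagnostics.watch('other participants', Participants.otherParticipantCount);
})();
```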
The PatchesModule allows interop between JS scripting and the Spark AR Studio Patch Editor visual scripting system.
Class | Description |
---|---|
PatchesInputs | The PatchesInputs class encapsulates methods for setting inputs to the Patch Editor. |
PatchesOutputs | The PatchesOutputs class encapsulates methods for getting outputs of the Patch Editor. |
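A minimal sketch of the script-to-patch bridge. The names 'jsScore' and 'patchReady' are hypothetical and must match variables defined in the project's patch bridging settings:

```javascript
// Sketch: send a scalar into the Patch Editor and read a boolean back out.
const Patches = require('Patches');
const Diagnostics = require('Diagnostics');

(async function () {
  // Push a value from script into the patch graph
  await Patches.inputs.setScalar('jsScore', 42);

  // Pull a signal produced by the patch graph back into script
  const ready = await Patches.outputs.getBoolean('patchReady');
  Diagnostics.watch('patch ready', ready);
})();
```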
The Persistence class encapsulates persistent objects.
Class | Description |
---|---|
BlockStorage | This class represents a storage location for a Block's persistent data. Note that it gives no access to the actual data stored within. |
StorageLocation | The StorageLocation class encapsulates different methods of storage for persistent objects. |
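A minimal sketch of persistent storage, assuming the Persistence capability is enabled and the key 'score' (hypothetical) is whitelisted in the project's properties:

```javascript
// Sketch: store and retrieve a small JSON object across effect sessions.
const Persistence = require('Persistence');

(async function () {
  const storage = Persistence.userScope;
  try {
    await storage.set('score', {best: 120});
    const data = await storage.get('score'); // resolves to the stored object
  } catch (err) {
    // Storage can fail, e.g. if the key is not whitelisted for persistence
  }
})();
```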
The RandomModule class enables random number generation.
Exposes functionality for performing mathematical and logical operations with signals.
As the Spark AR API uses a reactive model to propagate data and values, regular JavaScript mathematical operations are not valid. Instead, methods exposed by the Reactive module should be used; for example, Reactive.add(scalarSignalX, scalarSignalY) adds two ScalarSignals together.
The module also exposes classes that describe signal equivalents for basic data types such as booleans and strings, as well as signal types specific to Spark AR.
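A minimal sketch of the reactive style described above, combining signals with Reactive operators rather than plain JavaScript arithmetic:

```javascript
// Sketch: build derived signals from Time.ms using Reactive operators.
const Reactive = require('Reactive');
const Time = require('Time');
const Diagnostics = require('Diagnostics');

const seconds = Time.ms.div(1000);            // ScalarSignal, updated每 frame
const wave = Reactive.sin(seconds).mul(0.5);  // operators can be chained
const isPositive = wave.gt(0);                // comparisons yield a BoolSignal

Diagnostics.watch('wave', wave);
Diagnostics.watch('positive', isPositive);
```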
Class | Description |
---|---|
AnimationBlend | The AnimationBlend class combines an array of AnimationBlendInputs to drive animation blending. Applying it to a target ThreeDObject allows playing a blended animation from a collection of blend inputs. |
AnimationBlendInput | The AnimationBlendInput class describes an input used for animation blending. Each input is composed of an AnimationClip, a progress ScalarSignal and a weight ScalarSignal: 'clip' is the source animation clip from which to sample animation data; 'progress' is the time driver controlling at which point in time to sample the above clip, ranging from 0 to 1; 'weight' is the amount this input affects the final blend. Weights for all blend inputs should ideally add up to 1. |
AnimationClipSignal | The AnimationClipSignal class describes the MultiChannelSampler loaded from an animation in a .glb asset file. Transmitting a signal to this clip through Block I/O allows playing animations from externally downloaded Blocks. |
AudioSignal | |
BoolSignal | Monitors a boolean value and exposes functionality for performing logical operations with a given signal. |
BoolSignalSource | Represents a source used to get and set the value of a BoolSignal. Typically, changing the value that a signal contains requires a total reassignment, binding the signal to a completely new signal which itself contains the desired value. The BoolSignalSource API provides the ability to change the value of the original signal without reassignment, with behavior similar to that of non-reactive programming models. |
Box2D | The Box2D class describes a 2D bounding box value. |
Box2DSignal | The Box2DSignal class monitors a 2D bounding box value. |
Box3D | The Box3D class describes the bounds in 3D space. |
Box3DSignal | The Box3DSignal class describes the bounds in 3D space. |
Color | |
ColorSignal | The ColorSignal class monitors a color. |
EventEmitter | |
EventSource | The EventSource class provides methods for monitoring signals. |
EventSourceHistory | The EventSourceHistory encapsulates methods for accessing values of EventSource from previous frames. |
HsvaSignal | The HsvaSignal class monitors a HSVA color value. |
ISignal | The ISignal interface. The base class for ScalarSignal, PointSignal, Vec3Signal, BoolSignal and StringSignal. |
Mat4 | The Mat4 class contains a 4D Matrix. |
Mat4Signal | The Mat4Signal class monitors a scene transform. |
Point2D | The Point2D class contains a 2D coordinate. |
Point3D | The Point3D class contains a 3D coordinate. |
PointSignal | The PointSignal class monitors a 3D coordinate. |
PrimitiveOrShaderSignal | The PrimitiveOrShaderSignal class represents a primitive or shader signal. |
QuaternionSignal | The QuaternionSignal class monitors rotation in a quaternion representation. |
RgbaSignal | The RgbaSignal class monitors a RGBA color value. |
Rotation | The Rotation class encapsulates an object's rotation in a quaternion representation. |
ScalarSignal | Monitors a numerical value and exposes functionality for performing mathematical operations with the given signal. |
ScalarSignalSource | Represents a source used to get and set the value of a ScalarSignal. Typically, changing the value that a signal contains requires a total reassignment, binding the signal to a completely new signal which itself contains the desired value. The ScalarSignalSource API provides the ability to change the value of the original signal without reassignment, with behavior similar to that of non-reactive programming models. |
ShaderSignal | The ShaderSignal represents a shader signal. Scalar and Vector signals can be automatically converted to a ShaderSignal. |
SignalHistory | The SignalHistory<T> class encapsulates methods for accessing values from previous frames. |
StringSignal | Monitors a string value. |
StringSignalSource | Represents a source used to get and set the value of a StringSignal. Typically, changing the value that a signal contains requires a total reassignment, binding the signal to a completely new signal which itself contains the desired value. The StringSignalSource API provides the ability to change the value of the original signal without reassignment, with behavior similar to that of non-reactive programming models. |
Subscription | The Subscription class implements object value monitoring. |
Vec2 | The Vec2 class contains a 2D coordinate. |
Vec2Signal | The Vec2Signal class monitors a 2D coordinate. |
Vec3 | The Vec3 class contains a 3D coordinate. |
Vec3Signal | The Vec3Signal class monitors a vector. |
Vec4 | The Vec4 class contains a 4D coordinate. |
Vec4Signal | The Vec4Signal class monitors a 4D coordinate. |
Enum | Description |
---|---|
AntiderivativeOverflowBehaviour | The AntiderivativeOverflowBehaviour enum describes the recovery technique used when an antiderivative overflows. |
The SceneModule class exposes properties and methods to access the objects in a scene.
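A minimal sketch of scene access. The scene object name 'plane0' and the path pattern are hypothetical examples:

```javascript
// Sketch: look up scene objects by name and by path, then set a property.
const Scene = require('Scene');

(async function () {
  // Find a single object by name ('plane0' is hypothetical)
  const plane = await Scene.root.findFirst('plane0');
  plane.hidden = false;

  // Find several objects at once with a path query
  const allPlanes = await Scene.root.findByPath('**/plane*');
})();
```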
Class | Description |
---|---|
AmbientLightSource | The AmbientLightSource class describes an ambient lighting source. |
BlendShape | The BlendShape class describes a shape attached to a mesh or face mesh which can be used to change the shape of that mesh. |
BlockSceneRoot | The BlockSceneRoot class describes the root scene object of a block. |
BlockSceneRootInputs | The BlockSceneRootInputs class encapsulates methods for setting inputs to the block instance. |
BlockSceneRootOutputs | The BlockSceneRootOutputs class encapsulates methods for getting outputs of the block instance. |
Camera | The Camera class exposes details about the device camera focal area. |
CameraVisibility | The CameraVisibility class describes whether or not an object is visible from various camera views. |
Canvas | The Canvas class describes a scene canvas. |
DirectionalLightSource | The DirectionalLightSource class describes a directional light source. |
DynamicExtrusion | The DynamicExtrusion class provides functionality for creating extruded 3D objects using a brush. |
EnvironmentLightSource | The EnvironmentLightSource class describes an environment lighting source. |
FaceMesh | The FaceMesh class describes a face mesh. |
FaceTracker | The FaceTracker class propagates details of detected faces to the scene. |
FocalDistance | The FocalDistance class describes a focal distance in the scene. Children of FocalDistance are automatically positioned in 3D space based on the distance between the camera image and the camera scene object itself. |
FocalPlane | The FocalPlane class exposes details about the focal plane of the device camera. |
HandTracker | This class represents the hand tracker scene object, used to track the position of hands in the scene. |
Joint | The Joint class describes a joint of a given Skeleton object. |
Mesh | The Mesh class describes a scene mesh. |
MeshSurface | The MeshSurface class describes a surface in a mesh. |
OutputVisibility | The OutputVisibility class describes whether or not an object is visible from various outputs. |
ParticleSystem | The ParticleSystem class implements the particle management system for the scene. |
ParticleTypeDescription | The ParticleTypeDescription class provides functionality for setting particle sprite densities in the scene. |
PlanarImage | The PlanarImage class describes an image rendered on a plane. |
PlanarObject | The PlanarObject class describes an object on a plane. |
PlanarStack | The PlanarStack class describes a stack of 2D Scene Elements. |
PlanarText | The PlanarText class describes text on a plane. |
Plane | The Plane class describes a plane. |
PlaneTracker | The PlaneTracker class provides functionality for locating a 3D plane based on 2D screen coordinates. When accessing the realScaleActive and realScaleSupported properties, ensure that Real World Scale is enabled in the PlaneTracker object's Inspector panel. See the Real Scale for World article for more information. |
PointLightSource | The PointLightSource class describes a point light source. |
Scene | The Scene class implements properties and methods to access the objects in a scene. |
SceneObject | The SceneObject class describes an object in a scene. |
SceneObjectBase | The base class for scene objects. |
Skeleton | The Skeleton class describes a skeleton scene object. All Joint scene object children of a given skeleton are considered part of this skeleton hierarchy. |
Speaker | The Speaker class encapsulates a speaker for a scene. The old class name is AudioSource. |
SpotLightSource | The SpotLightSource class describes a spot light source. |
SvgImage | The SvgImage class describes an SVG asset for a scene. |
TargetTracker | The TargetTracker encapsulates a tracker for some target. |
TextAlignmentWrapper | The TextAlignmentWrapper class contains text alignment details. |
TextExtrusion | The TextExtrusion class describes a 3D text scene object. |
ThreeDObject | The ThreeDObject class describes a 3D object in a scene. |
Transform | The Transform class describes an object transform for a scene. |
WorldTransform | The WorldTransform class describes an object transform for a sceneObject in world space. |
Enum | Description |
---|---|
Direction | The Direction enum describes the stack layout's direction. |
BrushType | The BrushType enum describes what kind of brush is used for dynamic extrusion. |
HorizontalAlignment | The HorizontalAlignment enum describes how an element is aligned horizontally. |
RenderMode | The RenderMode enum describes how an element is rendered. |
ScalingOption | The ScalingOption enum describes how an element is scaled. |
StackAlign | The StackAlign enum describes the stack children's alignment. |
StackDistribute | The StackDistribute enum describes the stack children's distribution. |
TextAlignment | The TextAlignment enum describes how a text element is aligned horizontally. |
TrackingMode | The TrackingMode enum describes how a PlaneTracker is tracking an object. |
VerticalAlignment | The VerticalAlignment enum describes how an element is aligned vertically. |
VerticalTextAlignment | The VerticalTextAlignment enum describes how a text element is aligned vertically. |
The SegmentationModule class enables the separation of a person, hair or skin from a scene.
Class | Description |
---|---|
HairSegmentation | The HairSegmentation class exposes the information about a person's hair. |
PersonSegmentation | The PersonSegmentation class exposes the information about a person. |
SkinSegmentation | The SkinSegmentation class exposes the information about a person's skin. |
The ShadersModule exposes APIs to create visual shaders using JavaScript.
The following is an explanation of the unique types and concepts specific to the ShadersModule.
PrimitiveOrShaderSignal is a union type of Vec2Signal, PointSignal, Vec4Signal, Vec3Signal, Mat4Signal or ShaderSignal.
ShaderSignal is a graphics shader output that produces one of the types defined in the above union. As ShaderSignal is GPU bound, it can only be used in a GPU context. ShaderSignal can also be of a function type, used for function mapping from one type to another.
For example, a shader with the signature function(Vec2Signal): Vec4Signal is a type of function that maps a Vec2Signal to a Vec4Signal.
Enum | Description |
---|---|
BlendedMaterialTextures | The BlendedMaterialTextures enum describes the different texture slots for a flat material. |
BlendMode | The BlendMode enum describes the blending mode. |
BuiltinUniform | The BuiltinUniform enum describes the builtin shader uniforms. |
ColorSpace | The ColorSpace enum describes the color space. |
DefaultMaterialTextures | The DefaultMaterialTextures enum describes the different texture slots for a default material. |
DerivativeType | The DerivativeType enum describes the shader derivative type. |
FacePaintMaterialTextures | The FacePaintMaterialTextures enum describes the different texture slots for a face paint material. |
GradientType | The GradientType enum describes the type of the shader gradient. |
PhysicallyBasedMaterialTextures | The PhysicallyBasedMaterialTextures enum describes the different texture slots for a physically based material. |
SdfVariant | The SdfVariant enum describes the SDF variant. |
VertexAttribute | The VertexAttribute enum describes the builtin vertex attributes. |
The SvgsModule enables working with SVGs.
Class | Description |
---|---|
Svg | The Svg class describes an SVG in an effect. |
The TexturesModule class enables images, animation sequences, videos, colors, and other visual artifacts to be combined to form materials.
Class | Description |
---|---|
CameraTexture | The CameraTexture class represents a texture type that contains image data coming in from the system camera, or from a captured photo/video when the effect is used with the "Media Library". |
CanvasTexture | The CanvasTexture class enables painting with a brush to a texture. |
ColorTexture | The ColorTexture class encapsulates a texture that has a color (including alpha channel). |
DeepLinkTexture | The DeepLinkTexture class represents an image texture passed in via the sharing SDK. |
ExternalStreamTexture | The ExternalStreamTexture class represents an image texture passed in via external stream provider. |
ExternalTexture | The ExternalTexture class encapsulates a visual asset that is downloaded over the network. |
ExternalTextureMediaBase | ExternalTextureMediaBase is a base class for different types of media that can be created as ExternalTextureMedia. |
ExternalTextureMediaImage | ExternalTextureMediaImage represents image media that was created from the URL. |
ExternalTextureMediaVideo | ExternalTextureMediaVideo represents "video" media that was created from the URL. It exposes a set of APIs that are specifically tailored for controlling video playback. |
GalleryTexture | The GalleryTexture class encapsulates a texture that was picked from the gallery. |
GalleryTextureMediaBase | GalleryTextureMediaBase is a base class for different types of media that can be selected from gallery and used in a gallery texture. |
GalleryTextureMediaImage | GalleryTextureMediaImage represents image media that was picked from the gallery that is being used by a given GalleryTexture. |
GalleryTextureMediaVideo | GalleryTextureMediaVideo represents "video" media that was picked from the gallery that is being used by a given GalleryTexture.It exposes a set of APIs that are specifically tailored for controlling video playback. |
ImageTexture | The ImageTexture class encapsulates an image that may be used to form materials for rendering in the scene. |
PeerVideoStreamTexture | The PeerVideoStreamTexture class represents an image texture passed in via external stream provider. |
PlatformTextureMediaBase | PlatformTextureMediaBase is a base class for different types of media that can be used as platform textures. |
PlatformTextureMediaImage | PlatformTextureMediaImage represents texture media that is being used by a given PlatformTexture. |
PlatformTextureMediaVideo | PlatformTextureMediaVideo represents "video" media that was picked from the Platform that is being used by a given PlatformTexture. It exposes a set of APIs that are specifically tailored for controlling video playback. |
SceneDepthTexture | Describes a texture which provides the current scene's estimated depth information. Depth is calculated as the relative distance to the camera. You can follow the steps in this article to add a camera depth texture asset to your project. |
SegmentationTexture | The SegmentationTexture class encapsulates a texture that will be used for image segmentation. |
SequenceTexture | The SequenceTexture class is a collection of still images that form an animation. |
SourceImageRegionTexture | The SourceImageRegionTexture class represents a texture type that contains a texture image extracted from a certain region of an object, or set of objects, in the scene. For example, a face-extracted texture contains the region of a camera image with a face, as extracted using a face tracker. |
SubTexture | The SubTexture class exposes details of a texture in UV coordinates. |
TextureBase | The TextureBase class describes a texture. |
Enum | Description |
---|---|
State | The State enum describes the download state of an ExternalTexture. |
MediaType | The MediaType enum describes the media types of a GalleryTexture. |
State | The State enum describes the state of a GalleryTexture. |
TextureColorEncoding | TextureColorEncoding describes different color encoding formats used for data in a texture. |
TextureFilteringMode | TextureFilteringMode describes different modes for how a texture should address a size mismatch between the actual image data and its footprint when rendered. |
TextureWrapMode | TextureWrapMode describes different ways a given texture should be sampled when a coordinate falls outside of the 0->1 range. |
The TimeModule class enables time-based events.
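A minimal sketch of time-based events using the module's interval timer:

```javascript
// Sketch: run a callback every second and stop it after five runs.
const Time = require('Time');
const Diagnostics = require('Diagnostics');

let runs = 0;
const interval = Time.setInterval(() => {
  runs += 1;
  Diagnostics.log('tick ' + runs);
  if (runs >= 5) {
    Time.clearInterval(interval); // cancel the timer
  }
}, 1000);
```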
Enables detection of touch gestures and exposes classes that describe various types of touch interaction.
By default touch gestures will be registered on the entire screen unless an object is specified in the gesture method call, for example: TouchGestures.onTap(plane).
Importing this module automatically enables the relevant capability within the project's *Properties*. Gesture types must be individually enabled within the capability to enable detection.
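A minimal sketch of whole-screen and per-object gestures. The scene object name 'plane0' is hypothetical, and both gesture types must be enabled in the Touch Gestures capability:

```javascript
// Sketch: react to taps anywhere on screen and to pinches on one object.
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

(async function () {
  const plane = await Scene.root.findFirst('plane0');

  TouchGestures.onTap().subscribe(() => {
    // No object argument: registered on the entire screen
  });

  TouchGestures.onPinch(plane).subscribe((gesture) => {
    // gesture.scale is a ScalarSignal while the pinch is active
  });
})();
```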
Class | Description |
---|---|
Gesture | Exposes details of a detected gesture, common to all touch gesture types. |
LongPressGesture | Exposes details of a detected long press gesture. Ensure Long Press Gesture is enabled under the project's Touch Gestures capability. |
PanGesture | Exposes details of a detected pan gesture. Ensure Pan Gesture is enabled under the project's Touch Gestures capability. |
PinchGesture | Exposes details of a detected pinch gesture. Ensure Pinch Gesture is enabled under the project's Touch Gestures capability. |
RotateGesture | Exposes details of a detected rotation gesture. Ensure Rotate Gesture is enabled under the project's Touch Gestures capability. |
TapGesture | Exposes details of a detected tap gesture. Ensure Tap Gesture is enabled under the project's Touch Gestures capability. |
Enum | Description |
---|---|
State | The State enum describes the state of a Gesture. |
GestureType | The GestureType enum describes the type of a given Gesture . |
The UnitsModule class provides functionality for converting values into world-space units.
The WeatherModule class provides information about the current weather.
The WorldTrackingModule class provides functionality to track multiple surfaces in your
environment and interact with them by placing virtual content on the detected surface(s).
This class can be used to author 'World AR' effects that allow users to place and move AR
objects in the scene, as is commonly seen in AR Commerce use-cases in which users can
position couches or tables, for example. Creators are also able to visualize planes detected
in the scene.
Class | Description |
---|---|
ARPointTrackable | The ARPointTrackable class describes a point trackable, also referred to as an anchor, at a fixed location and orientation in the real world. |
ARTrackable | The ARTrackable class describes surface planes and feature points, known collectively as 'trackables', that are detected in the view. |
HitTestResult | The HitTestResult class describes a single result of the hitTest() method. |
Enum | Description |
---|---|
ARTrackableState | The ARTrackableState enum describes the states that the trackable can be in. Used by ARTrackable.state. |
ARTrackableType | The ARTrackableType enum describes the types of trackable that can be detected and tracked. Used by ARTrackable.type. |
TrackingState | The TrackingState enum describes the states that the world tracker can be in. Used by WorldTrackingModule.state. |
TrackingStateReason | The enum describes the possible reasons why the world tracker may be experiencing limited tracking quality. It is a unified definition of TrackingStateReason regardless of the platform, e.g. ARCore, ARKit or first frame (FF). FF does not provide a tracking failure reason and will default to None. |