
Creating an Effect for Video Calling

In Spark AR Studio you can create an effect that multiple participants can interact with on a video call in Messenger or Instagram. Your project first has to be configured for a video calling experience.

This article will explain how to configure and test your projects for video calling. We’ll also include additional considerations to be aware of when building an effect for this experience.

Getting started with video calling effects

There are three ways to configure an effect for video calling. You can either:

  • Configure a new project.
  • Use a pre-configured template.
  • Convert an existing project.

Configuring a new project

If you’re starting a new project, select Video Calling Experience in the Spark AR Studio welcome screen. When you select this option, your project is automatically set up with the capabilities needed for a video calling experience.

Spark AR welcome screen with video calling experience selected

Using a pre-configured template

Some templates in the welcome screen are already pre-configured to support a video calling experience. Hover over a template to see whether it’s suitable for video calling.

3 templates in the welcome screen showing they're suitable for video calling

Converting an existing project

To convert an existing effect for a video calling experience, first open the effect and then:

  • In the toolbar, click the gear icon.
  • Select Change Project Properties.

This opens the Properties - Experiences window. In this window:

  1. Click Add Experience.
  2. Select Video Calling Experience.
  3. Click Insert.

Properties - Experiences window with the video calling and sharing experience options enabled

If you want your effect to be available on Facebook and Instagram, as well as a supported video calling platform, don’t remove the Sharing Experience option. Your effect is then enabled for both experiences:

Project properties tab with video calling and sharing experience enabled

Checking supported capabilities

Keep in mind that you’ll have to disable any of your effect’s existing capabilities that can’t be used in a video calling experience. For example, if your effect uses Face Tracking and Segmentation, you’ll have to disable one of these capabilities or the effect can’t be published for video calling.

If you see a warning like the one below, update your project properties to either remove the capability or change the experience type.

Capability warning stating that face tracking and segmentation can't be used in a video call

Capabilities for Instagram video calling

Similarly, some of the Spark AR capabilities usually supported by Instagram can’t be used in Instagram video calling. For example, you can create and publish an effect with the native UI slider for an Instagram sharing experience, but this capability isn’t compatible with Instagram video calling.

Optimizing your effect for video calling

There are a few adjustments you can make when building your effect to control how it appears. These include showing different objects to different participants and making sure your textures appear the right way around to other participants.

Showing different visual elements to the caller and participant

When you’re building your effect for a video call, you can configure it to show different visual elements to the person making the call (the caller) and the other people on the call (the participants). For example, you could show an object with text in the video feed that participants see of you, but hide that object in your own self view.

The image on the left shows the caller’s self view during the video call. The image on the right shows what the caller looks like to other participants during the call.

The caller's self view and the view of the caller as seen by other participants

To control caller and participant views:

  1. In the Scene panel, select the object.
  2. In the Inspector, under Enable For, select or deselect the following options:
  • User — when this option is selected, the object is only visible to the caller in their self view.
  • Participants — when this option is selected, the object is only visible to the other call participants.
  • Both User and Participants — this is the default option; the object is visible to everyone on the call.

In the example below, the 2D text object is only visible to the person making the call because only the User option is selected.

User option selected and highlighted in the Inspector and text object showing in Viewport and Simulator

Rotating textures the correct way around

When you make a video call, your camera’s video stream is automatically mirrored horizontally when it’s displayed to other participants on the call. This means that when you build an effect for video calling, you’ll need to horizontally mirror any objects containing 2D or 3D text to make sure they appear the right way around to other participants.

To mirror a text object:

  1. Select the text object.
  2. In the Inspector, under Transformations and to the right of Scale, select Per Axis from the dropdown.
  3. Set the x axis to -1.

Transformations section of the Inspector with scale set to Per Axis and x axis set to -1

In the Viewport the text object will appear backwards but will display correctly to participants in preview and in the live effect.

Viewport with text displayed backwards
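
You can also apply the same flip from a script instead of the Inspector. The sketch below is a minimal example using the Scene module; it assumes your project has a script asset and that the Scene panel contains a 2D text object named text0, so swap in your own object name.

    // Minimal sketch: mirror a text object horizontally from a script.
    // Assumes a text object named 'text0' exists in the Scene panel.
    const Scene = require('Scene');

    (async function () {
      // Find the object by the name it has in the Scene panel.
      const text = await Scene.root.findFirst('text0');

      // Setting the x scale to -1 flips the object horizontally, the same as
      // choosing Per Axis and setting the x axis to -1 in the Inspector.
      text.transform.scaleX = -1;
    })();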

Try mirroring objects and the entire camera feed by following this article, which includes an example project so you can follow along.

Using dynamic layout to avoid video call layout issues

When you make a video call, each participant's Viewport size changes based on the number of participants and the layout of the video call. If you're using any on-screen elements, such as Rectangles or 2D Text, you need to make sure they're laid out dynamically to fit into the different Viewport sizes.

Here are some dynamic layout features you can use in Spark AR Studio:

  • If you are using a Canvas with screen space, the canvas size will be set to correspond to the Viewport size. See this tutorial to learn how to use pinning and set relative width and height properties, so your elements move and scale relative to the screen. You can also use 2D Stack for a more advanced dynamic layout.
  • Use the CameraInfoModule to get current screen information, as in the sketch below.
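
If you’re using scripting, here is a minimal sketch of reading the current screen information from the CameraInfoModule and watching it in the console. It assumes previewSize exposes x and y components; double-check the property names in the scripting reference for your version of Spark AR Studio.

    // Minimal sketch: watch the current preview (Viewport) size at runtime.
    const CameraInfo = require('CameraInfo');
    const Diagnostics = require('Diagnostics');

    // previewSize reflects the size of the camera preview for the current layout,
    // so watching it shows how the Viewport changes as the call layout changes.
    Diagnostics.watch('Preview width', CameraInfo.previewSize.x);
    Diagnostics.watch('Preview height', CameraInfo.previewSize.y);

You could bind these signals to the size or position of your on-screen elements, although pinning and relative sizing on a Canvas usually covers the common cases.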

To test whether your on-screen elements fit the different Viewport sizes, select Resizable Window in the Simulator and resize it to make sure your effect fits differing screen resolutions.

Previewing your effect in Spark AR Studio

To get a better idea of how your effect will appear on a video call you can switch between different views in the Simulator. To do this, click the Simulator dropdown menu icon and select one of the following options:

  • User — what you will see on the video call.
  • Participants — what the other participants on the call will see.
  • Capture — what you would see when recording a video, on Instagram, for example.

Simulator with dropdown menu options highlighted

Testing your effect

Once you’ve prepared your effect you can test it in Spark AR Desktop Player or in Messenger or Instagram. To test your effect in the app:

  1. Start a video call in the Messenger or Instagram app.
  2. In the Spark AR Studio toolbar, select the Test on device icon.
  3. Under the Video Calling Experience section, to the right of Messenger or Instagram, click Send.

Your effect will be uploaded to Spark AR Hub and applied to the active video call in Messenger or Instagram.

Testing popover highlighted with all options for video calling and sharing experience displayed

Publishing your effect

When you’re happy with your effect, you can export it from Spark AR Studio and submit it to be published in Spark AR Hub. When you submit it in Spark AR Hub, make sure to select All Platforms or Video Calling as your chosen platform.

Performance limitations

Video calling is power intensive even when AR effects are not running. AR effects intended for video calls need to be more efficient than those used for a typical sharing use case such as stories or reels. Keep the following in mind when building your effect:

  • Video calling effects generally run significantly longer than sharing effects. Use GPU and CPU sparingly.
  • You can't use face tracking and segmentation at the same time, and you should be mindful about using multiple capabilities in your effect.
  • Take note of Spark AR Studio warnings about unused capabilities — even unused capabilities can cause the runtime to load unnecessary components and impact performance and battery life.