Meta Spark

Expanding abilities
All-new possibilities

Create more immersive AR effects using new audio, depth and occlusion capabilities

By: Meta Spark
April 25, 2022

Today we’re excited to roll out several new capabilities with the release of version 136 of Spark AR Studio, which includes more audio options, new depth mapping features and improved access and controls for occlusion.

These latest capabilities, in combination with recent updates like hand and body tracking, are all engineered toward a single purpose — to help you build the most creative, complex and immersive AR experiences possible using Spark AR.


New audio capabilities arrive

As part of today’s update, we’re rolling out a new audio engine that enables more advanced audio integrations in your effects. This includes improvements in audio processing that will make it easier for you to blend multiple audio sources together to create fun and richly layered audio effects. For example, you can now more easily combine voice effects, sound effects and music tracks to create engaging and entertaining dance or singing effects for Reels. These audio capabilities are available today only for AR effects on Instagram.
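To give a sense of how layered audio can be controlled from a script, here’s a minimal sketch. It assumes two Playback Controllers named 'musicController' and 'voiceController' have been added to the project’s Assets panel (both names are placeholders); the actual blending still happens in the patch editor.

```javascript
// Minimal sketch: start two audio layers from a script.
// Assumes Playback Controllers named 'musicController' and 'voiceController'
// exist in the Assets panel and are wired to audio sources in the patch editor.
const Audio = require('Audio');
const Diagnostics = require('Diagnostics');

(async function () {
  const music = await Audio.getAudioPlaybackController('musicController');
  const voice = await Audio.getAudioPlaybackController('voiceController');

  music.setLooping(true);  // keep the backing track running
  music.setPlaying(true);  // start the music layer
  voice.setPlaying(true);  // start the voice layer on top

  Diagnostics.log('Both audio layers are playing');
})();
```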

Additionally, we’re introducing six new patches (Mixer, Gain, Oscillator, Vocoder, Filter and Compressor), plus a collection of new asset patches in the Spark AR Library, including Audio Fade, Pulse Limiter, Loop Player and many more — all designed to help you customize and control a wide range of audio elements in your AR effects. Here’s a quick overview of what’s possible now with the new audio capabilities.

Audio capability video
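The new patches are wired up in the patch editor, but you can also drive their inputs from a script. The sketch below is illustrative only: it assumes a 'From Script' scalar named 'voiceGain' has been created in the patch editor and connected to a Gain patch’s level input (the variable name and the wiring are assumptions).

```javascript
// Minimal sketch: drive a Gain patch's level from a script.
// Assumes a 'From Script' scalar named 'voiceGain' exists in the patch editor
// and is connected to the new Gain patch (placeholder name).
const Patches = require('Patches');

(async function () {
  // Duck the voice layer to 60% of its original level.
  await Patches.inputs.setScalar('voiceGain', 0.6);
})();
```

Keeping the audio routing in the patch graph and exposing just a few parameters to the script makes the mix easy to tweak visually while still letting your script react to user interaction.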

Lastly, we’re releasing two new templates today called Piano Project and Audio Visualizer. Use these new templates to help jumpstart your next effect, or to simply test and explore your own audio ideas.

We think the new audio engine, patches, patch assets and templates are a powerful combination that will open up some exciting new possibilities, enabling you to build multisensory effects that will help people use sight and sound to feel more immersed in your AR experience.

Create depth-responsive effects

Machine perception is a fundamental part of modern-day camera tech, especially for smartphones. It’s what helps your phone detect and interpret a scene to take a better photo. And now, with the addition of LiDAR sensors, mobile cameras are increasingly capable of capturing depth and distance data too.

That’s why we’re excited to be releasing a new depth mapping capability today called Camera Depth Texture. This capability allows you to detect the relative distance of surfaces and objects from the camera, and extract this data as a texture. You can then use the data contained in this texture to create a variety of effects that respond to depth, such as post-processing and lighting effects.
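As a rough sketch of how a script might interact with this capability, the example below assumes the depth texture has been added to the project under the name 'cameraDepthTexture' and that the depth-driven visuals themselves are built in the patch editor; the script simply confirms the texture is available and passes a focus distance into the patch graph (both names are placeholders).

```javascript
// Minimal sketch: confirm the depth texture is available and drive a
// depth-based effect parameter. Assumes a texture asset named
// 'cameraDepthTexture' and a 'From Script' scalar named 'focusDistance'
// that the patch graph compares against the depth data (placeholder names).
const Textures = require('Textures');
const Patches = require('Patches');
const Diagnostics = require('Diagnostics');

(async function () {
  const depthTexture = await Textures.findFirst('cameraDepthTexture');
  if (depthTexture == null) {
    Diagnostics.log('Depth texture not found - check the Assets panel');
    return;
  }

  // Tell the patch graph to focus the post-processing effect
  // on surfaces roughly half a meter from the camera.
  await Patches.inputs.setScalar('focusDistance', 0.5);
})();
```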

Simplified occlusion controls

Additionally, we’re making it easier to access and customize Spark AR Studio’s occlusion feature, which allows you to blend virtual objects into a real-world space. With occlusion, you can give virtual objects a more believable sense of space by partially obscuring them with other objects or by completely hiding them from a user’s field of view. This feature was previously available only to select creators, and we’re looking forward to all creators being able to explore it. Here are just a few quick examples of what’s possible now with depth mapping and occlusion.

Depth mapping/occlusion capability video
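For a sense of how an occlusion decision can reach a script, here’s a hedged sketch. It assumes the patch graph compares the camera depth against a virtual plane’s distance and exposes the result as a 'To Script' boolean named 'isOccluded', which the script then binds to the plane’s visibility (the object and variable names are placeholders).

```javascript
// Minimal sketch: hide a virtual object while a real-world surface is
// closer to the camera. Assumes the patch graph outputs a 'To Script'
// boolean named 'isOccluded' and the scene contains a plane named
// 'virtualPlane' (placeholder names).
const Scene = require('Scene');
const Patches = require('Patches');

(async function () {
  const [plane, isOccluded] = await Promise.all([
    Scene.root.findFirst('virtualPlane'),
    Patches.outputs.getBoolean('isOccluded'),
  ]);

  // Bind visibility to the occlusion signal computed in the patch editor.
  plane.hidden = isOccluded;
})();
```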

As part of today’s update, we’re also releasing a new template called Depth Color Overlay, which you can use to create a colorful sonar-like pulse effect.
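If you’d like to experiment with a pulse like this from scratch, the sketch below shows one possible approach: a looping time driver animates a value from 0 to 1 and feeds it to the patch graph as the pulse’s progress. The 'pulseProgress' variable and the patch-side ring rendering are assumptions, not part of the template itself.

```javascript
// Minimal sketch: animate a 0-to-1 value on a loop and send it to the
// patch graph as the expanding pulse's progress. Assumes a 'From Script'
// scalar named 'pulseProgress' in the patch editor (placeholder name).
const Animation = require('Animation');
const Patches = require('Patches');

(async function () {
  const driver = Animation.timeDriver({
    durationMilliseconds: 2000, // one pulse every two seconds
    loopCount: Infinity,
    mirror: false,
  });
  const progress = Animation.animate(driver, Animation.samplers.linear(0, 1));

  await Patches.inputs.setScalar('pulseProgress', progress);
  driver.start();
})();
```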

This combination of improvements in depth mapping and occlusion control opens up some exciting new possibilities, especially if you’re a creator eager to develop the world AR skills that can help you create experiences for AR glasses in the future.

All of these new capabilities, plus a few additional updates, are available today by updating Spark AR Studio to version 136, or you can download the latest version now to get started.

As always, we encourage you to join the Spark AR Community Facebook Group to find inspiration, share your work and connect with AR creators all over the world. You can also follow us on Facebook and Instagram, and subscribe via email to get all of the latest Spark AR news and updates.
