As we work toward a future where AR blends our digital and physical worlds, we’re investing now in the technology and tooling that will enable new expressions of creativity, with greater technical control and higher fidelity. It’s with this future in mind that we’re excited to roll out two new AR capabilities for Instagram today: multi-class segmentation and an improved target tracker.
Both of these capabilities open up some fun, new AR experiences, and are available now with version 114 of Spark AR Studio. Let’s dig in and show you a little more:
Now you can combine multiple segmentation textures, such as hair, skin, and person, in the same effect. We’re also making it easier to control multiple texture properties, like edge softness, which creates an outline around the person in your scene, and mask size, which adjusts the size of that outline.
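As a rough mental model, combining per-class segmentation masks amounts to a per-pixel union, with a blur standing in for the edge-softness control. The sketch below is illustrative only: Spark AR composes these textures inside Spark AR Studio (e.g., in the Patch Editor), not in Python, and the mask values here are made up.

```python
import numpy as np

def combine_masks(*masks):
    """Per-pixel union of several confidence masks (values in [0, 1])."""
    return np.maximum.reduce(masks)

def soften_edges(mask, radius=1):
    """Naive box blur as a stand-in for an edge-softness control."""
    padded = np.pad(mask, radius, mode="edge")
    out = np.zeros_like(mask, dtype=float)
    h, w = mask.shape
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += padded[radius + dy : radius + dy + h,
                          radius + dx : radius + dx + w]
    return out / (2 * radius + 1) ** 2

# Tiny hypothetical 3x3 masks for hair, skin, and person regions.
hair   = np.array([[1, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
skin   = np.array([[0, 1, 0], [0, 1, 0], [0, 0, 0]], dtype=float)
person = np.array([[0, 0, 0], [0, 0, 0], [1, 1, 1]], dtype=float)

combined = combine_masks(hair, skin, person)
soft = soften_edges(combined, radius=1)
```

In an actual effect you’d wire the hair, skin, and person segmentation textures into your materials inside Spark AR Studio; the sketch only illustrates how per-class confidence masks compose.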
This new capability gives you the flexibility to create more complex AR experiences, ranging from fantastical character effects to high-fidelity clothing effects. Here are just a few examples of what’s possible with the new multi-class segmentation capability:
Spark AR - Multi-Class Segmentation
Sample effects from @piotar_boa and @enuriru
There are some limitations to consider when using segmentation in your effects: it often works better when identifying people from the chest up, low lighting tends to reduce camera accuracy, and these effects usually perform best on newer mobile devices. Be sure to check out today’s Tech@ blog post, which goes into more detail on the camera recognition AI powering this new capability and what our Facebook and Instagram teams are doing to mitigate potential abuse of skin segmentation.
As always, updated documentation is available today for reference, and be sure to also check out our step-by-step tutorial on creating a hair segmentation effect. Lastly, we’re rolling out a new template that uses multi-class segmentation to produce a modern-art-inspired effect. Use it for inspiration, or simply as a shortcut for gathering assets and organizing your own project.
We’re making improvements to the target tracker, which triggers an AR effect when a user points their camera at a fixed 2D image in the real world, like a poster or a sign. As part of today’s update, the target tracker can now detect multiple fixed or moving target images to trigger an AR effect.
When you choose a moving target image, your effect will move as the target image moves, and if the user’s camera isn’t pointing at the target, the effect will simply disappear. You can still choose a fixed target image as well: your effect will appear where the user’s camera first detected the target and stay anchored in that position, even when the user moves their camera. Here are just a few early examples of what’s possible with the updated target tracker:
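The two behaviors above can be sketched in a few lines of Python. This is not the Spark AR API; the class and field names are hypothetical, and in Spark AR Studio you simply configure the target tracker rather than writing this logic yourself.

```python
class TargetTrackedEffect:
    """Illustrative model of fixed vs. moving target-tracking behavior."""

    FIXED, MOVING = "fixed", "moving"

    def __init__(self, mode):
        self.mode = mode
        self.anchor = None     # where the effect is rendered
        self.visible = False

    def on_frame(self, target_position):
        """target_position is None when the camera can't see the target.

        Returns the effect's position for this frame, or None if hidden."""
        if target_position is None:
            # A moving-target effect disappears when the target leaves view;
            # a fixed-target effect stays put once anchored.
            if self.mode == self.MOVING:
                self.visible = False
            return self.anchor if self.visible else None
        if self.mode == self.MOVING:
            self.visible = True
            self.anchor = target_position   # effect follows the target
        elif self.anchor is None:
            self.visible = True
            self.anchor = target_position   # lock to first detection
        return self.anchor if self.visible else None
```

For example, a moving-mode effect returns the target’s latest position each frame and `None` once the target is lost, while a fixed-mode effect keeps returning the first detected position regardless of where the camera points afterward.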
Spark AR - Updated Target Tracker
Sample effects from @afrosquared and @enuriru
The target tracker can now track up to five unique images in a scene; however, keep performance across mobile devices in mind when using multiple targets. Additionally, because today’s release includes performance improvements to the target tracker, we recommend updating previous effects that used it, even if you’re not making any other changes. To do this, open your project file in the latest version of Spark AR Studio and re-publish your effect.
You can learn more about how to optimally configure and use the target tracker in our updated documentation. We also have a new template available today that nicely bundles everything you need to create an animated 3D poster.
We’re excited to see creators get access to both of these new capabilities today, and eager to see what you create! As always, we encourage everyone to share their experience and show their AR effects on the Spark AR Community Facebook Group.