Constructing a cross-platform runtime for AR
By Paul Wu, Nikita Lutsenko
Meta’s augmented reality (AR) platform is one of the largest in the world, helping the billions of people on Meta’s apps experience AR every day and giving hundreds of thousands of creators a means to express themselves. Meta’s AR tools are unique because they can be used on a wide variety of devices — from mixed-reality headsets like Meta Quest Pro to phones, as well as lower-end devices that are much more prevalent in low-connectivity parts of the world.
How it works:
To achieve this, we focus on performance optimization. We give all creators the option to mix and match various AR capabilities as they please, somewhat like LEGO bricks, to create and deliver unique experiences. While creators focus on building amazing experiences, we handle the complexity of optimizing assets and the runtime so those experiences can run everywhere, from low-end mobile devices all the way to advanced hardware and VR. One way we do this is by deliberately splitting our monolithic runtime into smaller plugins. If an app doesn’t require a specific capability, that plugin can be excluded with a quick configuration toggle. We’re continuously looking for new opportunities like this to further expand our platform’s reach and support more use cases with a single AR engine at the core.
Why it matters:
At Meta, our AR engine group works to ensure that our AR services are available for everyone, regardless of the device they’re using. AR and VR experiences shouldn’t be restricted to the most sophisticated devices but should be widely accessible to all.
Take a deeper dive:
Read more about how Meta is bringing augmented reality to everyone.
And watch the Products @Scale talk below on “Building a Cross-platform Runtime for AR Experience.”