VR (Virtual Reality) and AR (Augmented Reality) have been at the development forefront for years, with big technology companies and startups working on building their own XR software and hardware.
However, the problem was the relatively fragmented nature of this development. The space needed a ubiquitous platform that could be used by different companies and developers to build products and apps based on a single ecosystem, much like what we see in the Android world. That’s where Qualcomm Technologies, Inc. steps up to the plate with Snapdragon Spaces™ Technology, offering a simple and convenient way for developers to jump into the world of XR! Here’s what you need to know.
What is XR?
Before jumping into all the exciting features Snapdragon Spaces has to offer, let’s talk about XR (Extended Reality), which is the umbrella term that encompasses VR, AR, and MR (Mixed Reality).
VR and AR are more commonly known. In the case of the former, you find yourself in an entirely computer-generated space, with accompanying hardware allowing you to interact with objects in the virtual world. On the other hand, AR brings the real world into the mix, with a transparent display letting you see everything in the physical space, with augmented projections adding information and details to enhance the experience.
Finally, there’s MR, which combines the best of both worlds: a computer-generated environment that incorporates imagery from the real world, blending the two into a single mixed reality.
While there are three distinct types of computer-based “reality,” they all fall under the overarching umbrella term “XR.”
Qualcomm leads the charge with Snapdragon Spaces
Snapdragon Spaces is brought to you by Qualcomm, a company truly at the forefront of the XR space. We already have products that use Qualcomm’s XR processors, most recently the Meta Quest 3, powered by the Snapdragon® XR2 Gen 2 Platform, and the Ray-Ban Meta smart glasses collection, powered by the Snapdragon AR1 Platform. While the chips are essential — they provide the processing power and power efficiency to run different “reality” environments — Qualcomm wants to do much more than just make processors.
On the hardware side of things, Qualcomm helps to create reference devices, which its partners can use to move quickly through the prototyping phase and build commercially available products. At the same time, developers can take advantage of the hardware development kits to test their apps. These kits offer more than just a powerful CPU and GPU: they also include sensors and other technology for connectivity, hand tracking, plane detection, and everything else needed to build and use XR apps.
And, of course, Qualcomm’s involvement doesn’t end with hardware. Snapdragon Spaces is the software development kit that developers can use to build XR apps for various products, from headworn devices to smartphones. This was initially only used by Qualcomm’s partners to develop their headsets, but now, the company has opened it up to all creators. Qualcomm is fully involved in scaling the ecosystem, providing developers with software support, updates, new features, and everything else required to build XR apps.
If you are a developer interested in making XR apps for a variety of headsets that use Snapdragon Spaces, are powered by Qualcomm’s processors, or use Android, the Snapdragon Spaces SDK is precisely what you need.
Snapdragon Spaces makes creating XR apps easy
Starting to work in a technology space still in its nascent stages can be daunting. As a developer interested in XR, you might be worried that you have to start at the very beginning with low-level libraries and build things from scratch.
But Snapdragon Spaces builds on Unreal Engine and Unity, tools that hundreds of thousands of developers worldwide already use to create 3D games. And since you can use either to create XR apps, you don’t need to relearn your tools or start from scratch.
Built into the Snapdragon Spaces SDK for Unreal Engine or Unity are a variety of technologies that you can use — like anchor points, hand tracking, object tracking, plane detection, and more. Anchor points let you track a particular point in space, so you know exactly where it is in a virtual, augmented, or mixed 3D environment. Hand tracking, an essential aspect of any XR world, lets you track what your hands, or controllers, are doing and interact inside the space.
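To make the anchor idea concrete, here is a minimal, self-contained Python sketch of the concept — not the actual Snapdragon Spaces API, where anchors are full poses managed by the SDK inside Unity or Unreal. The point it illustrates: content is placed *relative* to an anchor, so when tracking refines the anchor’s position, everything attached to it moves with it.

```python
from dataclasses import dataclass

@dataclass
class Anchor:
    """A tracked point in world space (position only, for simplicity)."""
    x: float
    y: float
    z: float

def world_position(anchor: Anchor, offset: tuple) -> tuple:
    """Resolve an object placed relative to an anchor into world coordinates."""
    return (anchor.x + offset[0], anchor.y + offset[1], anchor.z + offset[2])

# Place a virtual object 0.5 m above the anchor.
anchor = Anchor(1.0, 0.0, 2.0)
obj = world_position(anchor, (0.0, 0.5, 0.0))   # (1.0, 0.5, 2.0)

# When tracking refines the anchor's pose, attached content follows automatically.
anchor = Anchor(1.1, 0.0, 2.05)
obj = world_position(anchor, (0.0, 0.5, 0.0))   # (1.1, 0.5, 2.05)
```

In the real SDK an anchor carries orientation as well as position, and the engine updates attached scene objects for you; the relative-placement principle is the same.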
There’s also image or object tracking and plane detection, which is particularly useful for Augmented Reality and Mixed Reality setups. Virtual content can be projected onto flat surfaces, whether horizontal or vertical, taking the physical 3D environment into account, and there’s so much you can do around that, from object modeling to games.
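The geometry behind placing content on a detected plane can be sketched in a few lines. This is an illustrative Python snippet, not SDK code: once plane detection gives you a surface (a point on the plane plus a unit normal), snapping a 3D point onto that surface is a standard projection.

```python
def project_onto_plane(point, plane_point, normal):
    """Project a 3D point onto a plane defined by a point on it and a unit normal."""
    # Signed distance from the point to the plane along the normal.
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, normal))
    # Move the point back along the normal by that distance.
    return tuple(p - d * n for p, n in zip(point, normal))

# A detected horizontal plane (a tabletop) at height y = 0.8, normal pointing up.
table_spot = project_onto_plane((0.3, 1.5, -0.2), (0.0, 0.8, 0.0), (0.0, 1.0, 0.0))
# table_spot ≈ (0.3, 0.8, -0.2): the content lands on the detected surface.
```

This is exactly the kind of supporting math the SDK handles for you, which is why plane detection arrives as a ready-to-use feature rather than something you implement yourself.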
The important thing is that all of this is built into the SDK and available to developers within their tool of choice. That means that you can spend your time being creative when making your XR apps instead of worrying about building the supporting software from the ground up at a lower level. You can trust the SDK to handle all that and focus on your inspiration to make something great.
A fantastic new feature: Dual Render Fusion
As mentioned, Qualcomm is very involved in supporting and scaling the ecosystem, which means new features! An exciting recent addition to the SDK is called Dual Render Fusion.
Many current-generation headworn devices utilize a combination of the headset and an Android smartphone, connected via a wire or wirelessly. With this setup, the secondary device, a smartphone, can do the bulk of the heavy lifting in terms of processing power to allow for lighter, slimmer, and easier-to-wear headsets.
With Dual Render Fusion, the smartphone is no longer used for its processing alone: it also becomes a second display. As a developer, you now have two screens and can show content on both the Android smartphone and the headset.
What is really interesting is that this also serves as a stepping stone from a regular 3D smartphone app to an XR app that can be viewed on a headset. Instead of rebuilding an XR app from the beginning, you can add the second screen and move your smartphone app, or individual features of it, into an XR environment.
For example, if you are playing a game on your smartphone and are wearing an AR headset, you can get extra information from the game, like your inventory or game stats, on your headset while continuing to play the game on your phone. The opposite is also true, where your headset is the primary screen, with extra details and information appearing on your smartphone. All this can be done without needing to re-develop the game specifically for AR or MR.
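The scenario above boils down to routing each piece of an app’s UI to one of two surfaces. Here is a hypothetical Python sketch of that idea — the element names and the `split_by_surface` helper are invented for illustration; the real SDK exposes this through its Unity and Unreal integrations, not through a call like this.

```python
# Hypothetical UI elements for the game example: the main view stays on the
# phone, while inventory and stats are pushed to the headset.
hud_elements = [
    {"name": "game_view", "surface": "phone"},
    {"name": "inventory", "surface": "headset"},
    {"name": "stats",     "surface": "headset"},
]

def split_by_surface(elements):
    """Group UI elements by the display they should be rendered on."""
    routed = {"phone": [], "headset": []}
    for element in elements:
        routed[element["surface"]].append(element["name"])
    return routed

routed = split_by_surface(hud_elements)
# routed["phone"] == ["game_view"]
# routed["headset"] == ["inventory", "stats"]
```

Flipping the roles, with the headset as the primary screen and the phone showing extra details, is just a matter of changing each element’s target surface, which is what makes this an incremental path from a phone app to an XR app.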
Dual Render Fusion gives developers an easy way to jump from smartphone apps to XR apps without starting again from scratch.
Make the jump to XR with Snapdragon Spaces
Often, with any new technology, there is a gap between hardware and software that is difficult to bridge because of a lack of support. How do you create an app for hardware that isn’t readily available? And even when the hardware is accessible, developers may not have a software platform to build apps for it.
Qualcomm solves this problem by tackling the issue on both sides. Snapdragon Spaces gives you a standard way to develop apps for XR, building on existing 3D tools like Unreal Engine or Unity and including necessary features like plane detection, object tracking, hand tracking, and more, to help avoid a steep learning curve. There are also fantastic new features being added all the time, like Dual Render Fusion, which lets you jump from smartphone apps to XR apps without needing to re-create anything from scratch.
And there is a range of devices you can test your apps on, from hardware reference devices available from Qualcomm to Snapdragon Spaces Ready Android smartphones like the OnePlus 11. This narrows the gap between software and hardware and enables developers to tap into the full potential of XR apps.
Sponsored by Qualcomm Technologies, Inc. Snapdragon and Qualcomm branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.