If you want to understand where the future of VR and AR lies, start by looking at the chips. Qualcomm's chipsets have driven most VR and AR devices for years, often signaling what headsets will adopt before they adopt it. Qualcomm's new Snapdragon XR2 Gen 2 and AR1 Gen 1 chipsets, which debut in Meta's new Quest 3 and its updated Ray-Ban glasses, are worth paying attention to because they aren't exclusive to Meta's products. They're going to appear in lots of other places too.
As Apple readies its Vision Pro hardware for next year, the rest of the VR/AR landscape (or XR, spatial computing, whatever else you choose to call it) is moving alongside it. Samsung, Google and Qualcomm are preparing some sort of mixed reality hardware platform. Qualcomm is also working with game developer Niantic on AR glasses and is still partnered with Microsoft on future mixed reality software and hardware beyond HoloLens. Qualcomm has a partnership with Meta under which its AR/VR chips appear in Meta hardware first, but the chips won't stop there: think HTC, Pico and others.
We spoke with Qualcomm’s XR head, Hugo Swart, to get perspective on what the changes mean for products to come.
VR’s getting a mixed reality and AI boost
The new Snapdragon XR2 Gen 2 chipset's greatest strength, besides boosted graphics and processing power, is its improved passthrough mixed reality. It's a trend among headsets right now: instead of trying to make augmented reality (virtual objects and audio that seem to appear in the real world) work by floating images over transparent displays, the current wave of VR headsets displays video of the real world captured with their cameras and layers graphics onto it, creating an AR-like effect without see-through lenses.
Some existing headsets already do this, including the Varjo XR-3 and Aero and the HTC Vive XR Elite. Apple's Vision Pro does it too, and so does the Meta Quest 3, Meta's next-gen VR headset coming in October.
But the XR2 Gen 2 looks like it can do more than what the Quest 3 uses it for. The Quest 3 doesn't have eye or face tracking, but the XR2 Gen 2 can track up to 10 cameras or sensors at once: full-body trackers, face and eye tracking, external cameras and more could easily be added to future headsets. It looks like a chip that's far more ready to have multiple sensors running simultaneously.
Fitness possibilities with accessories
Some of that extra bandwidth could go toward working better with accessories, particularly for fitness. Swart sees the advances in AI processing on the XR2 Gen 2 chip helping enable smarter virtual training. "You can get much better knowledge of the movements that the user is performing," he said. "And then have guidance, if you have a virtual trainer, that not only is telling you what to do, but can also monitor what you're doing. And then make recommendations."
Qualcomm is already exploring dovetailing VR, AR and phones. The relationship could expand to other accessories too, and the XR2 Gen 2 chip might be better equipped to handle those multiple relationships. “It can be a dedicated accessory. Or it could have a smartwatch that could have a tie in to the experience. Or sensors,” Swart said. “I strongly believe in the potential of these things coming together.”
Qualcomm is already prototyping VR fitness that uses a phone camera for added tracking, Swart said. "We're working on a project that's fitness based, where you can use the phone as a camera," he said. "You put the back of the phone facing you while you do a fitness exercise session in VR, capturing the movements that the user is doing. We probably will start seeing bigger traditional fitness and well-being brands adding XR functionalities to their services."
Hand-tracking improvements for mixed reality
While Meta hasn't yet demonstrated big changes to hand tracking on the Quest 3, they're likely to come. "We have eight times the AI capacity [on this chipset]. And I think hand tracking is one example that really benefits from AI," Swart said, but he pointed to the simultaneous mix of head and hand tracking plus environmental awareness as the more challenging combination. "When you talk about the mapping environments, mapping objects, identifying objects in their environment, things moving in front of you doing an experience, it's a lot of things that individually, maybe I could do on an older platform, but to have it all running concurrently, that's the magic."
And this year's hottest tech, AI, is an interesting wildcard for both AR and VR. Swart sees AI playing a major role in how more spontaneous mixed reality experiences will work. Metaverse-focused companies like Roblox are already leaning on it.
“The holy grail would be, I get into a room and immediately, with gen AI, it creates a new world for me in the room,” Swart said. “But I think simple things, like, I’m engaging with a digital avatar NPC and have a conversation with that NPC, that’s the kind of technology that is here today.”
Qualcomm’s AR1 Gen 1 chip: a middle path for smart glasses
Qualcomm announced another chipset, also debuting in a new Meta product: the Snapdragon AR1 Gen 1. Meta's newest Ray-Ban camera- and audio-equipped smart glasses take advantage of some, but not all, of the chipset's capabilities. Despite the "AR" name, this chipset isn't focused on 3D visuals at all. Instead, it's aimed at better camera, audio and AI performance in smaller smart glasses that now look just about normal. Meta's new Ray-Bans are a prime example: they have better microphones and better cameras but no displays. The AR1 Gen 1 can support displays, though, at up to 1,280×1,280-pixel resolution per eye.
The new glasses chipset also supports up to eight microphones (Meta used five), Wi-Fi 7 and a number of on-glasses AI improvements: text-based visual search similar to Google Lens, and AI-assisted noise canceling, which could lead to audio tricks like the adaptive audio Apple's AirPods Pro 2 can do.
Qualcomm is pitching its new AI improvements for voice-based assistants, navigation, object recognition and other ideas that sound like what Google Glass dreamed of years ago. Fitness sensors are another possibility, more theoretical for now but enabled by the chipset's extra processing bandwidth, Swart said, which makes me wonder when sensor-equipped fitness glasses will start emerging. It makes a lot of sense.
What about AR glasses?
One thing this new glasses chipset won't do is full-movement 6DoF (six degrees of freedom) 3D AR. Qualcomm revealed another, higher-powered AR glasses chip last year, the AR2 Gen 1, which is capable of those things. And yet, so far, no glasses have emerged using it.
“It is taking a bit longer than expected to get those [AR glasses] products out,” Swart said. “The silicon is not the only blocker for that type of glasses…I think we have to look into display and optics, and that’s one of the areas that is impeding devices to come faster to market.”
Swart sees higher-end AR glasses development happening “in parallel” with more streamlined non-AR glasses, ones that will get people comfortable with the idea ahead of more advanced hardware down the road.
It also looks like VR headsets are, for now, becoming the way most people will first try AR. The new Quest 3 leans on mixed reality as its one truly big new feature, and Apple's Vision Pro is doing the same.
That could mean a few years in which VR headsets are where AR ideas get explored, before AR glasses eventually take over and do it all in smaller packages. Of course, that all depends on how the hardware develops. For now, it suggests that 2023 and beyond will still be about VR headsets and smart glasses, possibly still separate and still with future dreams of merging.
Is one of these chips what Samsung, Google and Qualcomm’s mystery product runs on?
The next wave of VR and AR devices is already taking shape: besides Apple's Vision Pro, Samsung, Google and Qualcomm have a mystery device platform that's expected to be unveiled later this year or next. According to Samsung, it's likely to be something that connects to Android phones.
It seems likely that that product would run on Qualcomm's newest XR2 Gen 2 chipset, but of course, Swart isn't commenting either way. "I'm not going to be able to say much on that here," he said. But when I asked about the future of phones interacting with upcoming headsets, there was one key feature Swart did comment on.
“This kind of concept of bringing 2D content from the phone, the PC and just organizing it in your field of view, I think is very interesting,” he said. “My 16-year-old is a heavy gamer, and he has three monitors, plus his phone, and they’re all looking through different apps. There’s a game here, there’s Discord, there’s Snap. You could easily put that in a visor. I don’t need to have this huge setup in his room. I could just put on a visor and have all these 2D screens organized in that field of view.”
Is this a hint at where Google fits into Samsung's product? Apple's Vision Pro has emphasized running thousands of iOS apps on a virtual display; Samsung, Google and Qualcomm could be demonstrating something similar for Android and Google Play. It's the missing piece of the VR/AR ecosystem, and maybe the next phase isn't about rethinking what VR or AR is: it's about putting it all together with everything else.