Snap, the social media company turned AR glasses-maker, has officially announced Snap OS 2.0, its next generation of smart glasses software.
And after testing it out for myself at Snap’s London offices last week, I have high hopes for the release of its first consumer-ready smart glasses in eight years, which are set to arrive in 2026.
2016’s original Snap Spectacles marked a notable shift for what had until then been a software-first company: moving the camera’s perspective from your hands to your face, and letting you capture the world around you, augment it on your phone, and then share it.
After a couple more iterations on this same idea, Snap changed tack and reimagined Spectacles for developers first. The form factor switched from a camera on your face to a self-contained computer on your face, using cameras to sense the environment around you and track 3D objects within that space.
Last year’s revised Spectacles added more compute and heralded the arrival of Snap OS, and the company has spent the intervening months fleshing out some of the platform’s fundamental experiences with the help of its developer partners.
My time with Snap OS 2.0 was the company’s chance to showcase what the collective effect of those months of refinement looks and feels like in use, as well as offering the best look yet at what 2026’s Spectacles will serve up.
Here are three things I really enjoyed using, plus one area that Snap still needs to address before the final release.
Connected Sessions are amazing
Snap has spent the past year being vocal about the most significant upgrades between Snap OS 1.0 and 2.0, but one of the most natural and impressive experiences I trialled was Connected Sessions.
Working with Snap’s Augmented Reality Engineer, Andreas Müller, I was able to join a Connected Session he was hosting from his Spectacles, based solely on proximity. Once connected, we were both able to move around the same space and paint 3D shapes in the air together, with impressively low hand-tracking and cross-device latency.
While the premise of the demo was simple enough, and I’ve had similar multiplayer VR experiences over an internet connection, having two AR users in the same physical space, with two pairs of Spectacles tracking their collective interactions locally and in real time, was decidedly more impressive.
The real win, however, is the potential this feature adds to Snap’s AR offerings. Once its army of 400,000 developers builds Connected Sessions support into their experiences, it could change local collaboration (and, by extension, multiplayer gaming) in ways we’ve so far only seen in tech demos.
Powerful Spotlight, Browser and Gallery features
Three of the main additions in Snap OS 2.0 are Spotlight, Browser and Gallery. The first lets you enjoy vertical content and interact with it, just as you would in the Snapchat smartphone app. Despite interaction moving from your phone screen to a virtual window floating in 3D space, I was impressed with how easily swipes and taps translated to gestures and pinches, while playback was smooth and audio stayed in sync.
Snap announced the inclusion of WebXR support at the tail end of 2024, meaning developers should have an easier time building AR experiences that work within the new Browser app. As you might expect, you can type with a floating AR keyboard or use speech-to-text to search, while navigation is as intuitive and gesture-based as it is in Spotlight.
The current Spectacles’ resolution and brightness also ensured that reading text on web pages was perfectly manageable in the bright room the demo took place in, and video playback on the likes of YouTube worked well too, provided you’re comfortable with blacks being displayed as transparent.
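For developers wondering what that WebXR support means in practice, here’s a minimal sketch of how a web page typically requests an immersive AR session. This is the standard WebXR Device API flow, not Snap-specific code, and whether Spectacles’ Browser exposes optional features like hit testing is an assumption on my part.

```typescript
// Minimal WebXR sketch (standard web API, not Snap-specific).
// Assumes WebXR type definitions (e.g. @types/webxr) are available.
async function startImmersiveAR(): Promise<void> {
  // Feature-detect the WebXR Device API first.
  if (!navigator.xr) {
    console.log("WebXR is not available in this browser.");
    return;
  }

  // Check whether the browser supports immersive AR sessions at all.
  const supported = await navigator.xr.isSessionSupported("immersive-ar");
  if (!supported) {
    console.log("Immersive AR sessions are not supported here.");
    return;
  }

  // Request the session; 'hit-test' (surface detection) is optional,
  // and it's an assumption that any given headset browser offers it.
  const session = await navigator.xr.requestSession("immersive-ar", {
    optionalFeatures: ["hit-test"],
  });

  session.addEventListener("end", () => {
    console.log("AR session ended.");
  });

  // A real experience would now set up a WebGL context, create an
  // XRWebGLLayer, and drive rendering via session.requestAnimationFrame.
}
```

In principle, anything built against this API in a regular WebXR-capable browser should carry over to Browser on Spectacles with little extra work, which is presumably the point of Snap adopting the standard.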
There’s also finally a way to view content captured in Snapchat or on the Spectacles themselves: Gallery presents a chronological overview that I could swipe through naturally. Support for stereoscopically captured content meant I could enjoy 3D media, too.
Impressive AI sound and vision
As well as placing AR experiences into the world around you, Spectacles’ latest AI integrations let you ask about elements within that world.
I trialled Snap OS’ Spatial Tips feature by looking at a skateboard on a shelf in front of me and asking how to do an ollie, at which point instructions were overlaid on the board itself, updating relative to the next step in pulling off the trick.
Super Travel, meanwhile, let me hold up a Chinese restaurant menu and drag a bounding box around it with my hand, at which point the text within the box was translated into English and overlaid next to the physical menu, so I could compare like for like and check whether I was about to order chow mein or silkworms.
Lower-profile Spectacles are needed
One of the most notable aspects in the race for face-worn AR supremacy has to be form factor. Snap’s decision to place practically all the processing responsibility on the dual Snapdragon chips inside its latest Spectacles has rendered them wearable, but not exactly ergonomic.

I found them fairly comfortable, but proportionally and aesthetically awkward. They sit too far off the face, especially where the hinges meet the arms, and each arm ends in a bulbous stem (which I assume contains a battery).
That’s not to detract from how impressive it is that Snap has managed to fit so much hardware inside these things. But if the company wants to translate the experience I was given into eyewear that people will feel comfortable wearing out in public, it’s going to need to make them more compact.
However, as far as the tech within goes, there’s plenty to be excited about.