
Here’s what I made of Snap’s new augmented reality Spectacles [MIT Tech Review]


Before I get to Snap’s new Spectacles, a confession: I have a long history of putting goofy new things on my face and liking it. Back in 2011, I tried on Sony’s head-mounted 3D glasses and, apparently, enjoyed them. Sort of. At the beginning of 2013, I was enamored with a Kickstarter project I saw at CES called Oculus Rift. I then spent the better part of the year with Google’s ridiculous Glass on my face and thought it was the future. Microsoft HoloLens? Loved it. Google Cardboard? Totally normal. Apple Vision Pro? A breakthrough, baby.

Anyway. Snap announced a new version of its Spectacles today. These are AR glasses that could finally deliver on the promises that devices like Magic Leap, HoloLens, and even Google Glass made many years ago. I got to try them out a couple of weeks ago. They are pretty great! (But also: see above.)

The new fifth-generation Spectacles can display visual information and applications directly on their see-through lenses, making objects appear as if they are in the real world, with an interface powered by the company’s new operating system, Snap OS. Unlike typical VR headsets or spatial-computing devices, these augmented-reality (AR) lenses don’t block out your vision and re-create it with cameras. There is no screen covering your field of view. Instead, images appear to float and exist in three dimensions in the world around you, hovering in the air or resting on tables and floors.

Snap CTO Bobby Murphy described the company’s intention to MIT Technology Review as “computing overlaid on the world that enhances our experience of the people in the places that are around us, rather than isolating us or taking us out of that experience.”

In my demo, I was able to stack LEGO bricks on a table, smack an AR golf ball into a hole across the room (at least a triple bogey), paint flowers and vines across the ceilings and walls using my hands, and ask questions about the objects I was looking at and receive answers from Snap’s virtual AI chatbot. There was even a little virtual purple dog-like creature from Niantic, a Peridot, that followed me around the room and out onto a balcony.

But look up from the LEGO and you see a normal room. The golf ball is on the floor, not a virtual golf course. The Peridot perches on a real balcony railing. Crucially, this means you can maintain contact – including eye contact – with the people in the room around you.

To accomplish all this, Snap packed a lot of tech into the frames. There are two processors embedded inside, so all the compute happens in the glasses themselves. Cooling chambers in the sides did an effective job of dissipating heat in my demo. Four cameras capture the world around you, as well as the movement of your hands for gesture tracking. The images are displayed via micro-projectors, similar to those found in pico projectors, that do a nice job of presenting three-dimensional images right in front of your eyes without requiring a lot of initial setup. The result is a tall, deep field of view – Snap says it’s comparable to a 100-inch display viewed at 10 feet – in a relatively small, lightweight device (226 grams). What’s more, the lenses automatically darken when you step outside, so the glasses work well not just in your home but out in the world.

You control all this via a combination of voice and hand gestures, most of which came pretty naturally to me. You can pinch to select objects and drag them around, for example. The AI chatbot could respond to questions posed in natural language (“What’s that ship I see in the distance?”). Some of the interactions require a phone, but for the most part the Spectacles are a standalone device.

They don’t come cheap, either. Snap isn’t selling the glasses directly to consumers; instead, you have to commit to at least one year of a $99-per-month Spectacles Developer Program account, which gives you access to a pair. I was assured that Snap has a very open definition of who can develop for the platform. Snap also announced a new partnership with OpenAI that takes advantage of OpenAI’s multimodal capabilities, which Snap says will help developers create experiences with real-world context about the things people see or hear (or say).

[Photo: the author standing outside wearing the oversize Snap Spectacles, looking a bit goofy.]
It me.

Having said that, it all worked together impressively well. The three-dimensional objects maintained a sense of permanence in the spaces where you placed them, meaning you can move around and they stay put. The AI assistant correctly identified everything I asked it about. There were some glitches here and there (LEGO bricks collapsing into each other, for example), but for the most part this was a solid little device.

It is not, however, a low-profile one. No one will mistake these for a normal pair of glasses or sunglasses. A colleague described them as beefed-up 3D glasses, which seems about right. They are not the silliest computer I have put on my face, but they didn’t exactly make me feel like a cool guy either. Here’s a photo of me trying them out, above. Draw your own conclusions.


