Cameras were staring at me all over Meta’s Menlo Park campus. I’m not talking about security cameras or my fellow reporters’ DSLRs. I’m not even talking about smartphones. I mean the Ray-Ban Meta smart glasses, which Meta hopes we will all wear – one day, in some form or another.
I visited Meta this year for the Connect conference, where almost every hardware product involved cameras. They’re on the Ray-Ban Meta smart glasses, which just received a software update; on the new Quest 3S virtual reality headset; and on Meta’s prototype Orion AR glasses. Orion is what Meta calls a “time machine”: a functioning example of what full-fledged AR could look like, years before it’s ready for consumers.
But at least on Meta’s campus, Ray-Bans were already everywhere. It was a different kind of time machine: a glimpse into CEO Mark Zuckerberg’s future world, where glasses are the new phones.
I’m conflicted about it.
Meta really wants to put cameras on your face. The glasses, which follow 2021’s Ray-Ban Stories, appear to be succeeding on that front: Zuckerberg told The Verge that sales are going “very well.” They’re not full-fledged AR glasses, since they have no screen to display information, though they’re becoming more capable with AI features. But they’re perfect for what the entire Meta empire was built on: encouraging people to share their lives online.
The glasses come in several classic Ray-Ban styles, but up close it’s obvious that wearers aren’t sporting ordinary glasses. As I wandered around campus, I noticed the telltale signs on person after person: two prominent circular cutouts at the edges of the frames, one for a 12MP ultra-wide camera and the other for an indicator light.
This light flashes when a user is taking photos or videos and is generally visible even in sunlight. In theory, that should have reassured me: if the lights weren’t on, I could trust that no one was capturing footage of me eating lunch before my meetings.
But as I talked to people on campus, I was always a little tense. I found myself acutely aware of those circles, checking to see if anyone was filming me while I wasn’t paying attention. Just the potential of a recording would distract me from conversations, with a low hum of background anxiety.
When I put on a pair for myself, the situation changed
Then, when I put on a pair myself, the calculus flipped. As a potential subject, I had dreaded being photographed or filmed as a byproduct of making polite eye contact. With the glasses on my own face, however, I felt the pull to record more. There’s something very compelling about an eye-level camera. With the press of a button on the glasses, I could take a photo or video of anything I saw, from exactly the angle I saw it. No awkward fumbling for my phone while hoping the moment lasted. There may be no better way to share my reality with other people.
Meta’s smart glasses have been around for a few years now, and I’m hardly the first person – or even the first person at The Verge – to be impressed by them. But this was the first time I saw these glasses not as an early-adopter gadget, but as a ubiquitous product like a phone or smartwatch. I got a sense of how this seamless recording would work at scale, and the prospect is both exciting and terrifying.
The camera phone was a revolution in itself, and we’re still grappling with its social effects. Almost anyone can now document police brutality or capture a fleeting funny moment, as well as take creepshots and post them online or (to be clear: a much lesser offense) annoy people at concerts. What will happen when even the minimal friction of pulling out a phone disappears, and billions of people can instantly take a photo of everything they see?
Personally, I can see how incredibly useful this would be for taking candid photos of my new baby, who is already starting to recognize when a phone is pointed at her. But it’s not hard to imagine far more malicious uses. Sure, you might argue we’ve all gotten used to everyone pointing phone cameras at everything, but I’m not sure that’s a good thing; I don’t like the possibility of ending up in someone’s TikTok just because I left the house. (The rise of advanced facial recognition makes the risks even greater.) With ubiquitous camera glasses, I feel my face is even more likely to appear somewhere on the internet without my consent.
There are also clear risks in building cameras into what is, for many people, a necessary vision aid. If you already wear glasses and switch to prescription smart glasses, you’ll either have to carry a low-tech backup pair or accept that your camera comes along to some potentially very awkward places, like a public restroom. The current Ray-Ban Meta glasses are mostly sunglasses, so they’re probably not most people’s primary pair. But you can get them with clear or transition lenses, and I’d bet Meta would like to market them more as everyday specs.
Of course, there’s no guarantee that most people will buy them. The Ray-Ban Meta glasses are pretty good gadgets now, but I was on Meta’s campus, meeting Meta employees, checking out Meta hardware at a Meta event. Unsurprisingly, Meta’s latest hardware was commonplace there, and that doesn’t necessarily tell us much about what people outside that world want.
Camera glasses have been just over the horizon for years. Remember how magical I said it was to photograph exactly what’s right before your eyes? My former colleague Sean O’Kane described almost the exact same experience with Snap Spectacles back in 2016.
But Meta is the first company to credibly push for mainstream adoption. The glasses are a lot of fun – and that scares me a little.