How Meta’s New Face Camera Heralds a New Age of Surveillance
For the past two weeks, I’ve been using a new camera to secretly snap photos and record videos of strangers in parks, on trains, inside stores and at restaurants. (I promise it was all in the name of journalism.) I wasn’t hiding the camera, but I was wearing it, and no one noticed.
I was testing the recently released $300 Ray-Ban Meta glasses that Mark Zuckerberg’s social networking empire made in collaboration with the iconic eyewear maker. The high-tech glasses include a camera for shooting photos and videos, and an array of speakers and microphones for listening to music and talking on the phone.
The glasses, Meta says, can help you “live in the moment” while sharing what you see with the world. You can livestream a concert on Instagram while watching the performance, for instance, as opposed to holding up a phone. That’s a humble goal, but it is part of a broader ambition in Silicon Valley to shift computing away from smartphone and computer screens and toward our faces.
Meta, Apple and Magic Leap have all been hyping mixed-reality headsets that use cameras to allow their software to interact with objects in the real world. On Tuesday, Mr. Zuckerberg posted a video on Instagram demonstrating how the smart glasses could use A.I. to scan a shirt and help him pick out a pair of matching pants. Wearable face computers, the companies say, could eventually change the way we live and work. For Apple, which is preparing to release its first high-tech goggles, the $3,500 Vision Pro headset, next year, a pair of smart glasses that look nice and accomplish interesting tasks is the end goal.
For the past seven years, headsets have remained unpopular, largely because they are bulky and aesthetically off-putting. The minimalist design of the Ray-Ban Meta glasses represents how smart glasses might look one day if they succeed (though past lightweight wearables, such as the Google Glass from a decade ago and the Spectacles sunglasses released by Snap in 2016, were flops). Sleek, lightweight and satisfyingly hip, the Meta glasses blend effortlessly into the quotidian. No one — not even my editor, who was aware I was writing this column — could tell them apart from ordinary glasses, and everyone was blissfully unaware of being photographed.
After wearing the Ray-Ban Meta glasses practically nonstop this month, I was relieved to remove them. While I was impressed with the comfortable, stylish design of the glasses, I felt bothered by the implications for our privacy. I’m also concerned about how smart glasses may broadly affect our ability to focus. Even when I wasn’t using any of the features, I felt distracted while wearing them. But the main problem is that the glasses don’t do much we can’t already do with phones.
Meta said in a statement that privacy was top of mind when designing the glasses. “We know if we’re going to normalize smart glasses in everyday life, privacy has to come first and be integrated into everything we do,” the company said.
I wore the glasses and took hundreds of photos and videos while doing all sorts of activities in my daily life — working, cooking, hiking, rock climbing, driving a car and riding a scooter — to assess how smart glasses might affect us going forward. Here’s how that went.
My first test with the glasses was to wear them at my bouldering gym, recording how I maneuvered through routes in real time and sharing the videos with my climbing pals.
I was surprised to find that my climbing, overall, was worse than normal. When recording a climbing attempt, I fumbled with my footwork and fell. This was disappointing because I had successfully climbed the same route before. Perhaps the pressure to record and broadcast a smooth climb made me do worse. After removing the glasses, I completed the route.
This feeling of distraction persisted in other parts of my daily life. I had trouble concentrating while driving a car or riding a scooter. Not only was I constantly bracing myself for opportunities to shoot video, but other cars' headlights also produced a harsh, blue strobe effect through the eyeglass lenses. Meta's safety manual for the Ray-Bans advises people to stay focused while driving, but it doesn't mention the glare from headlights.
While doing work on a computer, the glasses felt unnecessary because there was rarely anything worth photographing at my desk, but a part of my mind constantly felt preoccupied by the possibility.
Ben Long, a photography teacher in San Francisco, said he was skeptical about the premise of the Meta glasses helping people remain present.
“If you’ve got the camera with you, you’re immediately not in the moment,” he said. “Now you’re wondering, Is this something I can present and record?”
Privacy Eroded
To inform people that they are being photographed, the Ray-Ban Meta glasses include a tiny LED light embedded in the right frame that indicates when the device is recording. The light flashes momentarily when a photo is snapped and stays illuminated while a video is being recorded.
As I shot 200 photos and videos with the glasses in public, including on BART trains, on hiking trails and in parks, no one looked at the LED light or confronted me about it. And why would they? It would be rude to comment on a stranger’s glasses, let alone stare at them.
The issue of widespread surveillance isn’t particularly new. The ubiquity of smartphones, doorbell cameras and dashcams makes it likely that you are being recorded anywhere you go. But Chris Gilliard, an independent privacy scholar who has studied the effects of surveillance technologies, said that cameras hidden inside smart glasses would most likely enable bad actors — like the people shooting sneaky photos of others at the gym — to do more harm.
“What these things do is they don’t make possible something that was impossible,” he said. “They make easy something that was less easy.”
Albert Aydin, a Meta spokesman, said the company took privacy seriously and designed safety measures, including a tamper-detection technology, to prevent users from covering up the LED light with tape.
In other mundane situations, the Ray-Ban Meta glasses affected me in strange ways. While I was about to cross a driveway in my neighborhood, I saw a car begin to reverse into it. My immediate reaction was to press the record button in case I needed to capture the driver acting irresponsibly. But he yielded appropriately and I crossed, feeling sheepish.
Slice of Life Moments
Although the Ray-Ban Meta glasses didn’t make me feel more present or safer, they were good at capturing a particular type of photo — the slice-of-life moments I wouldn’t normally record because my hands would be occupied.
With the glasses, I shot video of my corgi, Max, barking mightily to go out for a walk as I tied my shoes — a side of him that his Instagram followers don’t normally see. I recorded video of my dogs and wife as we hiked a trail, which would normally be difficult to do with a smartphone while keeping my hands steady. While slicing some leftover meat to make lunch, I recorded my Labrador, Mochi, watching me with hungry eyes.
The footage had a dreamy quality — the camera looked as if it were floating as I moved around. My wife and I agreed that we would look back at the videos of our dogs fondly. But while these types of moments are truly precious, that benefit probably won’t be enough to convince a vast majority of consumers to buy smart glasses and wear them regularly, given the potential costs of lost privacy and distraction.
It’s easy to imagine, however, some apps that could eventually push smart glasses into the mainstream. A holographic teleprompter that shows talking points in the corner of your eye while you give a presentation, for example, would be killer. Whether that product eventually comes from Meta or even from Apple, which hopes to make smart glasses after its Vision Pro headset, that future doesn’t feel too far away.