Meta’s new AI glasses review: Stellar sound, sharp recording, subject none the wiser
The wearable tech represents power to the user – and apprehension to everyone else
[SINGAPORE] Meta’s new artificial intelligence glasses have arrived. They look cool. They play music, tell you today’s headlines and help you reply to WhatsApp messages on the go. They can also take high-quality pictures and videos – discreetly. And that is probably their most concerning feature.
But, more on that later.
Two models, which differ in shape and size, are launching on May 6 in partnership with eyewear giant and Ray-Ban owner, EssilorLuxottica. What sets the Ray-Ban Meta Blayzer Optics (Gen 2) and Ray-Ban Meta Scriber Optics (Gen 2) apart is that they’re made for prescription wearers – which, as a bespectacled colleague says, “changes everything”.
To make them more suitable for all-day wear, the glasses weigh around 50 g, and come with interchangeable nose pads, overextension hinges and adjustable temple tips for a better fit.
But why pay upwards of S$699 for a pair?
What nice glasses you have
For Meta, wearables are how you deliver “personal superintelligence” to everyone. For the end-user, it means hands-free convenience.
Hooked up to your smartphone via Bluetooth, the glasses let you use voice commands to access Meta’s multimodal generative AI as well as other apps on your phone. To be clear, these glasses do not come with a visual display.
The set-up is easy and intuitive. Then the fun starts – beginning with the AI assistant’s language options (Mandarin is not yet available) and accents. There are a few celebrity AI voices, including English actress Judi Dench’s, which sounds posh but also rather grim. Her American counterpart, Awkwafina, on the other hand, is surprisingly suitable – lively, cheerful and delivered at just the right clip.
Six microphones are built into the glasses, while audio comes out of open-ear speakers under the spectacle arms – a plus, as they do not reduce environmental awareness the way earbuds do. The audio quality is superb – immersive and crystal clear, whether you’re listening to music or taking a call. It’s also non-intrusive. Others can only hear it if they’re next to you in a quiet room, and even then, all they pick up is a very faint sound.
As with the built-in, voice-activated AI assistants on other devices, you can ask the glasses for the weather forecast, today’s Singapore dollar exchange rate against the greenback and even who’s winning the Iran War.
“It is a complex issue and it’s difficult to say who is winning,” answers Meta AI aka Awkwafina. “Both sides claim victory but experts describe it as a strategic stalemate.”
When requested, she reads out the day’s major headlines, including China’s blocking of Meta’s acquisition of AI startup, Manus. Oops.
However, the glasses connect to only some apps – Spotify, plus those owned by the American tech giant: WhatsApp, Facebook, Instagram and Messenger. Not Telegram, nor your Gmail account.
I like that the eyewear can alert me to incoming WhatsApp messages, read them and help send my replies via verbal commands – though I felt a little self-conscious muttering to myself while out and about.
Sending messages to someone else from my contact list, however, can be a challenge – it keeps getting the recipient’s name wrong.
There are also patchy results from some other functions. The glasses could identify a bowl of noodles with dumplings and some vegetables (“A cosy meal!” Awkwafina added chirpily). But they couldn’t identify a bottle of cookies nor, stranger still, geolocate correctly.
With location services and camera turned on, you’d think the glasses could immediately place me in the vicinity of Toa Payoh, as I stood in front of a block of flats. But no, it insisted I was in Bishan one day and Shenton Way the next.
Some functions still lean on the phone – ask for directions to the nearest MRT station, and the glasses will send them to your handset. They also couldn’t play from specific playlists stored in my Spotify library, so out came the phone again.
While they’re supposed to deliver eight to 10 hours of runtime, the battery was depleted in half that time after I took 14 videos of no longer than three minutes each (maximum recording time is five minutes) and asked Meta AI a bunch of questions.
The better to see you with, my dear
One of the glasses’ more impressive features is its 12-megapixel camera that can take 3K photos and videos. It’s mounted on the left corner of the frame, with an LED on the right. When you snap a picture or record a video – using voice command or the capture button on the right arm of the spectacles – the light blinks. Brightly.
And just like that, good-quality, point-of-view pictures and videos (complete with sharp audio) are saved and sent to my phone. To prevent surreptitious filming, the glasses won’t let you use the camera if the LED is obscured. But once recording has started, I found it can continue even with the light covered.
When asked, the glasses also did a good job summarising an e-mail I was looking at. There was an audible click – a picture was taken and sent to Meta’s cloud for analysis. I made a mental note not to look at my bank account.
So, what’s the difference between using my phone and my glasses to ask AI for assistance or take pictures and videos? For starters, it’s easy to give the eyewear more multimodal data than you might when using your phone. And the fact that phones are obtrusive while the glasses aren’t makes all the difference.
Over three days, I tested them out at lunch with a group of friends, at dinner in a restaurant, in cafes and coffee shops and on an MRT train. Granted, not everyone made eye contact, but others spoke directly with me. Yet, not a single person questioned my blinking glasses.
Surprisingly, nobody even noticed them. And if some of the strangers did, perhaps they thought the light was something innocuous, like a Bluetooth indicator. In any event, there are stickers out there you can buy to slide over the LED.
Eventually, I found someone who admitted he noticed the light while we were talking. He wondered if it was a camera, but wasn’t bothered enough to ask. We also agreed that the non-confrontational nature of many Singaporeans means users can take pictures and film mostly unhindered.
After confessing that I’d filmed our lunchtime conversation, the first question someone asked was: “Did I say anything that could be used against me?”
Another joked that one could pretend to drop the glasses on the escalator to shoot upskirt videos. That opened a floodgate of silly, voyeuristic ideas. We laughed. But in the hands of bad actors, these things – and worse – could happen.
Despite privacy concerns and the wearables’ imperfections, smart glasses are quickly gaining traction. Last year, Meta sold over seven million pairs, more than triple the combined sales of 2023 and 2024. Indeed, they could benefit those with impaired eyesight – and content creators.
Nevertheless, the nagging issue of data privacy needs to be properly addressed. It doesn’t help, as a recent New York Times Magazine article points out, that we don’t know exactly how AI works. With today’s AI models containing an estimated trillions of mathematical functions in their neural networks, they can baffle even their own creators, it says.
And now, we have the possibility of putting all that on our noses.
Available in Ray-Ban stores, EssilorLuxottica retail locations including Spectacle Hut, and optical retail partner stores across Singapore.
Copyright SPH Media. All rights reserved.