Here's what it's like to use Meta's new Ray-Ban Display glasses

By Clare Duffy, CNN
Menlo Park, Calif. (CNN) — Have you ever wished you could quickly and quietly respond to a text in a movie theater without pulling out your phone? Or that you could see directions to the closest coffee shop without having to look down at your maps app? A new pair of Meta smart glasses with a tiny display inside the lens makes those things possible.
Meta (META) unveiled the Meta Ray-Ban Display glasses Wednesday at its annual Connect event at its headquarters in Menlo Park, California, along with several updated versions of its more basic, audio-only Ray-Ban and Oakley smart frames.
It’s all part of Meta’s push to develop devices for the artificial intelligence era. In demos of the new glasses at the event, I saw firsthand how they could make it possible to spend less time looking down at your phone — thanks to a tiny screen an inch from my eyeball.
“Glasses are the only form factor where you can let AI see what you see, hear what you hear, talk to you throughout the day… so it is no surprise that AI glasses are taking off,” Meta CEO Mark Zuckerberg said at the Connect event. He added that the company is seeing consumers adopt smart glasses at a rate similar to that of “some of the most popular consumer electronics of all time.”
But are consumers ready to shell out $799 for Meta’s smart glasses that represent an early step toward augmented reality? Here are my takeaways after trying them out.
What are the Meta Ray-Ban Display glasses?
Unlike previous versions of Meta’s smart glasses, which users could interact with only by talking to them and getting audio responses, the Displays provide visual feedback via a small display in the inside right-hand corner of the right lens.
When you’re wearing the frames, the display looks like it’s projected several feet in front of you. You have to focus on it to really see what’s there — otherwise, it kind of floats in your peripheral vision when you’re looking at things in the real world. (You can also turn the display off, if you want to do focused work or have a conversation without the distraction of the screen.)
Only the wearer can see the display. If you were talking to someone wearing the Ray-Ban Displays, you’d have no indication that they had messages popping up or Instagram Reels scrolling on the side, except that they might break direct eye contact with you to look at the display.
That would probably bother me if I encountered Displays in the real world. Who wants to have a heartfelt conversation with a friend and not know whether they’re watching texts pop up at the same time? But then again, for many, technology has long been part of interpersonal interactions.
One plus of the Displays compared with Meta’s audio-only Ray-Bans: They come with a “neural” wristband, which lets you navigate the display with subtle hand gestures, so you don’t have to touch the frames or say “Hey, Meta!” out loud to get the device to do something.
And while they’re slightly heavier and thicker than regular glasses — or earlier Meta Ray-Ban models — it’s easy enough to forget you’re wearing them when you’re not engaging with the tech.
What can you do with them?
The display lets you do a whole lot of things you might usually pick up your phone for, including taking and viewing photos and videos, reading and responding to messages, scrolling Instagram Reels and even taking video calls.
The display is small, so it’s probably not the first place you’ll turn to scroll through your vacation photos. But if you wanted to take a photo, do a quick quality check and post it right away, you could knock all of that out on the Displays.
Perhaps the most useful feature is one that can show you where you are on a map in real time. It’s a step up from earlier versions of the Meta smart glasses, which I’ve previously asked for directions to the closest grocery store: while they’ll give you the street address, they can’t tell you how to get there.
There are also live captioning and translation features that let you read the words your conversation partner is saying on the screen in real time. And when you ask Meta AI a question, it will respond with written information cards on the display, in addition to an audio answer. As a plant nerd, I find this tool very useful for identifying foliage and then asking follow-up questions, like whether the plant would survive in a pot in a New York City apartment.
Still, it’s not yet a perfect system. The glasses misspelled a word when I was trying to reply to a text by dictating my response and, without a keypad, I basically just had to start over to fix it.
Similarly, in a demo of the Meta Ray-Ban Displays, Zuckerberg struggled to answer a video call from Chief Technology Officer Andrew Bosworth because the button to accept the call didn’t show up on the display. “We’ll debug that later,” Bosworth said.
Do people really want this?
The Meta Ray-Ban Display glasses are undoubtedly impressive. It’s a technology that Google first envisioned with Google Glass more than a decade ago, but now it’s more functional, fashionable and accessible via intuitive hand gestures.
But I still have some big questions about whether this will really catch on. Namely: at a time when so many people I know are looking to spend as much time as possible away from screens, will consumers really want a screen hovering constantly just inches from their retinas?
But ask Meta executives and they say that concern was actually a motivating factor behind the development of the company’s newest glasses.
“We built this product to help protect presence,” Ankit Brahmbhatt, Meta’s director of AI glasses, told CNN. “At first, that might sound counterintuitive, but we designed this to be a glanceable display, so that it’s there for you when you need it, you get in for seconds at a time, you get the information, then it’s kind of out of your way.”
He added: “This idea of being more heads up and not having our heads buried in our phones, I think, is really a big part of the kind of experience that we’re trying to unlock here.”
As with earlier generations of the Meta Ray-Bans, I also wonder how comfortable a broad swath of users will be with wearing glasses that can see and hear their surroundings while in their home or workplace or while they are spending time with kids — especially given Meta’s tainted history as a steward of our personal information.
Likewise, I feel unsure whether those who don’t wear smart glasses would like being around someone wearing a device with a camera and microphones. In 2013, some Google Glass wearers gained the nickname “glasshole” because other people took issue with the prospect of being filmed without their consent.
Like previous versions, the Meta Ray-Ban Display glasses have an LED light indicating that the device is recording.
And Brahmbhatt said “building responsibly” is a focus for the company, adding that “there’s just education that’s needed when you have a new category” for the public to understand the built-in safety features like the “recording” light.
The first proof of just how much convincing Meta will have to do will come on September 30, when the glasses go on sale in a limited set of retail stores in the United States, before they roll out globally next year.