If you see someone nearby wearing thick Ray-Ban glasses, maybe staring off into space a bit and making small gestures with their fingers, you could be witness to the next big piece of wearable tech. Gesture-enabled smart glasses are here in the form of Meta Ray-Ban Displays, and I’ve been wearing them for about two weeks now, off and on. Yes, I’m one of those people.
Will you eventually be one of those people, too? Well, start by asking yourself whether you even want a display hovering around near your eyes, able to be called into existence with a double tap of a middle finger and thumb. Do I? Yes and no.
Meta’s latest $800 glasses feel to me like a transformational everyday gadget. At their best, they reveal magical glimpses of a subtle interface, another layer of information on the world, with a display that conjures itself on demand. At their worst, they highlight the numerous missing pieces still needed to make smart glasses truly essential. Including, by the way, prescription support for my eyes: right now, I’m testing them while wearing contact lenses.
Also, I have fundamental concerns about distraction and safety while wearing them.
Meta Ray-Ban Display: 7.0

Like:
Nearly invisible in-lens heads-up display
Impressive gesture controls via included wristband
Assistive captioning and maps apps are truly useful
Viewfinder and zoom functions for taking photos and videos
Don’t like:
Shorter battery life than standard, display-free Meta Ray-Bans
Neural band can feel annoying to wear
Few apps and phone-connected functions
Can’t mirror phone on the display, just certain apps
As they currently exist, the Meta display glasses I review here are not as useful as a smartwatch or as good a value as display-free smart glasses, such as the standard Meta Ray-Bans. Even if you want to be one of those people, you should probably wait until this impressive technology matures a bit.
But the technologies inside these glasses — near-invisible display tech provided by reflective waveguides, and wild neural band technology driven by electromyography — are going to show up in more glasses and wearables eventually. It’s early days for significant advancements in tech for our wrists and faces, and while these glasses are technical achievements, they currently feel like a beta test of things to come.
The tiny screen embedded in Meta’s Ray-Ban Display glasses is only visible to the wearer. It’s controlled by gestures that are sensed by the included neural wristband.
Now I feel like an everyday cyborg
On the surface, the new display glasses look a lot like Meta’s existing audio and camera-enabled Ray-Ban and Oakley smart glasses. The difference is that they have a heads-up display in one eye to show apps and information, along with a gesture-enabled wristband you wear to control the display.
The display in Meta’s Ray-Ban Display glasses isn’t as capable as a VR headset or Tony Stark specs, however. You won’t see 3D objects in these glasses. All they do is project a single flat, 2D display into a single eye. Essentially, they’re an evolved Google Glass (circa 2013) for the modern age.
The really advanced ideas live in the included Neural Band, which can register subtle hand gestures. Little taps and swipes let me control what’s on the screen, like a clickable mouse made of my fingers. Subtle vibrations give me feedback as I tap.
The wrist and display upgrades on these glasses make the whole experience feel equal parts futuristic and odd. I’m sort of an everyday cyborg who can summon screen readouts into my vision. Sometimes it feels like my life has become a first-person video game. Other times it feels like I’ve glued a smartwatch to my face or given my eye Apple CarPlay.
Meta Ray-Ban Display glasses on my face, transition lenses activating. Do you notice I’m recording?
Looks: Subtle, yet not that subtle
Compared with some other augmented reality specs and smart glasses I’ve tried, Meta’s look surprisingly stylish for big, chunky glasses. The frames come in black or a semi-transparent sand color (a translucent brown) and make the standard Ray-Bans look slim and low-key by comparison. I love the way these glasses look on my face, but for the record, my family doesn’t.
They’re heavier than non-display Ray-Bans, but not by much (69 grams, compared to 49 grams). They still feel premium, solid and comfortable to wear. The thick arms have hinge springs that bend back to reduce tension on the sides of my head, and they don’t exert pressure on my temples.
It’s pretty amazing how the display inside the right lens isn’t visible to people looking at me, even when it’s on.
When it’s off, you have to look closely to make out the reflective waveguide tech that creates the image. It looks like a series of small lines on the side of the lens. There’s also a narrow vertical strip down the side of the lens that’s visible at certain angles. The waveguide tech here is a lot better than anything else I’ve seen, and a sign of how invisible in-glass displays could be.
That tech has a major downside, however: Meta can only make these Ray-Bans with prescriptions ranging from minus 4 to plus 4. My eyes are over minus 8, so I’ve had to wear contact lenses to test them for this review, defeating the whole idea of these being my everyday glasses. I hope Meta can figure out how to support more prescriptions, not just for me but for anyone else hoping to buy them. Signs are strong that this will happen for this type of lens tech, but exactly when remains a mystery.
My two hands now: watch on left, gesture-controlling neural band on right. (Shot on Meta Ray-Ban Display glasses and cropped.)
Neural band: Amazing and also awkward
To control these glasses, Meta invented a whole new wearable that looks like a screenless fitness tracker. Called the Meta Neural Band, this fabric-covered device has an array of flat electrodes on the inside that push gently against my wrist, measuring electrical impulses via EMG (electromyography) technology.
The band interprets these signals as one-handed gestures that control the glasses’ screen. You’re supposed to wear it tightly on your dominant wrist, above the wristbone, higher up than a normal smartwatch. The magnetic clasp tightens easily, and the whole thing feels like a fitness band.
The band has no function other than controlling the glasses. It charges with its own magnetic pin cable, has water resistance for splashes but not for swimming, and lasts about a day on a charge.
I hoped it would feel like I had magical powers to control what I saw. In practice, the magic works in bits and pieces. The wristband only recognizes a narrow set of gestures, which control all the navigation on the display as I move between windows and apps.
Learning the gestures takes effort. Double-tapping my middle finger and thumb summons the display, and other gestures go back or confirm a selection. To choose an app, I have to swipe my thumb and then tap my forefinger and thumb. After a while, all that movement can sometimes make my hand cramp.
Small actions can be subtle and fascinating. Double-tapping my thumb on my closed fist activates Meta AI voice prompts, letting me quietly ask about things, take a photo, play music or read a message. For music, a quick tap controls playback, and a pinch-and-twist of my fingers adjusts the volume. I can do all this even while walking with my hands at my sides.
Sometimes the gestures don’t register: when I’m holding a grocery bag with that hand, gripping a steering wheel or reaching into my pocket. Other times I trigger them accidentally, like I did during a ZDNet podcast when, gesturing with my hands, I kept activating Meta AI.
You could use the right arm of the glasses instead of the band. It has a trackpad that scrolls in multiple directions and has single- and dual-finger touch gestures. It’s awkward, however. The band, which comes with the glasses, still feels essential if you’re really going to try living with them.
We got our camera behind the glasses to show what live captioning really looks like. It’s pretty wild.
AI powers: I can caption real life
Meta AI is designed to answer questions, open apps, send messages or even use the cameras to analyze something in front of me and attempt to translate or describe it to me. These AI functions are pretty much exactly the same as what’s on Meta’s other Ray-Ban and Oakley glasses, with the same hit-and-miss accuracy. But now, I can also see responses on-screen in text and sometimes graphic form.
One feature unique to these glasses is assistive captioning, and it might be the most magical feature of all. It uses the microphones to focus only on the speaker in front of me and transcribes what they say into text that appears on the display moments later. I can see people wanting these glasses for the captioning alone.
But for continuous AI analysis of the world using the cameras, you have to activate Live AI mode, which drains the battery very quickly. Expect an hour or less of use that way, versus up to 6 hours otherwise in more casual modes.
Maps can pop up in-display and show navigation in some cities, too. Truly useful, although potentially distracting.
Display apps: Few and far between, but signs of magic
For all of Meta’s promises of a transformative future of world-aware AI glasses, there’s not a ton of hyper-intelligent new stuff going on when I’m wearing them. Mostly they bring up a dashboard of go-to apps on demand, overlaid for me to scan quickly. Unlike the contextual AI Meta promises, which would truly know what you need at any moment, a lot of my use is more deliberate, like a smartwatch.
The color display in these glasses is high-res, crisp and detailed. It’s also ghostly looking, both because it’s semi-transparent and because it’s only in one eye. Reading it with one eye was fine, but it made me wish for a wider field of view.
Playing a little Starship Troopers, as I do on deadline.
It’s also visible even in bright daylight thanks to transition lenses in the glasses. The sunglass mode activates quickly, and I’ve been able to see messages even in the brightest head-on sun.
I mainly used the display for quick readouts, thumbnails of photos or a map to glance at. It’s not a full dashboard for my phone, and I can’t use it to play back videos. When I listen to a Jets game via Bluetooth audio from the NFL app on my phone, I can’t see the game itself. In that sense, these aren’t display glasses like Xreal or Viture glasses, which actually mirror your whole phone via USB-C.
A little idea of what the heads-up camera viewfinder feels like when wearing the glasses. You can zoom in with your fingers.
The 10 apps Meta has on these glasses are all Meta-made, and it shows. Facebook Messenger, Instagram and WhatsApp are the primary ways to chat or have live video calls, where someone could also see your camera feed. There’s also a basic music player that works with Apple Music, Amazon or Spotify.
You can look through photos and videos taken on the glasses, or use the camera app to get a live viewfinder, and even zoom in on your shot digitally by pinch-and-twisting your fingers. It’s a wild idea, but the digital zoom can feel buggy and a bit hard to control.
An onboard maps app is fascinating and can bring turn-by-turn directions to my eyes as I walk, or even while driving. The pop-up turn indicators seem useful as I walk through my town, but not necessarily more useful than directions from earbuds or my smartwatch.
Overall, the collection of apps is no substitute for my phone, and Meta hasn’t even built deeper hooks into its own apps like Facebook or Messenger. Google’s expected wave of AI-enabled glasses coming next year could be better at accessing Android phones, at least. Meta needs to figure out how to extend its glasses’ feature set and apps while navigating Apple’s and Google’s walled gardens and app stores.
The glasses and the neural band both need charging. The band lasts a whole day, but the glasses only last several hours.
Battery life: Now there are two more things to charge
If you’re like me, you already have a lot of things to charge every day: a phone, a smartwatch, maybe a pair of earbuds. Meta adding two more, the glasses and the neural band, feels like a lot.
To charge, you snap the glasses into a collapsible carrying case with a battery pack and USB-C port. The glasses charge quickly, but battery life has lasted only 2 to 6 hours in my everyday use so far. That’s less than Meta’s screen-free second-gen Ray-Bans. Sooner or later, I need to recharge during the day, which means carrying a second pair of glasses while these sit in the case.
Luckily, since I’m wearing contacts to test these, that’s no big deal. But if these were my everyday glasses, it wouldn’t be great.
The neural band, meanwhile, has its own special charging cable and lasts up to a full day on a charge. While that eases my charging stress, I still need to manage the glasses. I find myself checking the battery status of both the band and the glasses throughout the day now, just like I do with my phone.
There should be an easier way to charge these glasses on the fly, via swappable batteries or a tethered cable. Until they can achieve a full day of battery life, the short runtime will hold them back as a true life assistant.
Privacy is a total unknown, safety is a concern
Meta’s one of the worst of the big tech companies when it comes to handling data and privacy. Meta tends to suck up data for unclear purposes or for serving ads (which don’t appear on these glasses at all, yet).
Many people I know are hesitant to use Meta glasses at all for these reasons, and I get it. I also don’t know how Meta will handle the evolution of more advanced world-aware AI on these glasses down the road.
Could I sneak a photo of you using these glasses? Yes, and more easily than before, since now I can trigger the camera subtly with my fingers at my side. There’s still an LED that lights up when the camera’s in use, but it’s easy to miss, especially in bright daylight.
I’m also concerned about safety. Having a display on my face while walking, or especially while driving, is a potentially serious distraction. There is a driving-awareness mode and an audio-only mode for the glasses, but neither is activated by default. The glasses never automatically suggested deactivating the display while I was driving, something I think Meta should add immediately.
How will Meta make these glasses work better with our phones, and our lives?
The future is more AI companies aiming for your face
The template for what Meta is showing off for these glasses isn’t some out-on-a-limb concept. Google, Amazon and Apple are all expected to have glasses of their own in the next couple of years, mixing in heads-up displays and more AI-assisted features, possibly adding wrist-based controls or hand gestures, too.
Meta has plans to turn these into fully augmented reality devices capable of layering 3D into the world, like the prototype Orion glasses I tried last year. Ray-Ban Displays aren’t like that yet, but they’re also the first of their kind.
I think of my time with the Ray-Ban Displays like the early days of smartwatches, when they felt nearly ready to be on our wrists all the time. Nearly, but not quite. These display glasses feel like prototypes, but the landscape is changing fast, and Meta will need to push the next generation further. That could happen as soon as next year, and by then, many other pairs of glasses will be ready for your eyes, too. While these are the most advanced smart glasses out there right now, they’re not the most practical ones to wear.
In the meantime, I recommend the display-free Ray-Bans for most people. If you’re ready to be an early adopter of neural wrist tech for $800, dive right in. Personally, I think my eyes need a bit of a break.