After using Meta's Ray-Ban glasses, I'm willing to bet AI is the key to their future

Meta Ray-Ban smart glasses (Image credit: Future)

There's a new generation of smart glasses upon us and Meta is at the center.

On top of Orion, Meta's prototype for the world's first pair of full-featured holographic glasses, the company also announced a major update to its Meta Ray-Ban glasses last week during Connect.

Instead of telling you what's new, I'll give you one guess what the biggest addition to the Meta Ray-Ban smart glasses was... 

If your answer is AI, congratulations, you won tech's easiest guessing game. 

And sure, it's no surprise that Meta, like pretty much every tech company, is introducing AI into its glasses, but just because the move is expected doesn't mean it's not significant.

AI may not be appropriate for every product, but it could be an absolute game-changer for smart glasses.

Meta AI but make it fashion

Meta Ray-Ban smart glasses (Image credit: Future)

In case you haven't noticed, AI is showing up everywhere: in Google search, in Photoshop, and in your email inbox.

Naturally, some of those applications aren't going to pan out — right now, there's a "throw AI at the wall and see if it sticks" mentality.

But some of those AI uses will stick, and after briefly using Meta's Ray-Ban glasses, I'm willing to bet smart glasses are one of those arenas.

There are many hurdles to making smart glasses — miniaturization is one that comes to mind — but devising a new input method will be among the top priorities. If smart glasses are to be the "next thing" after smartphones, they'll need to be fully featured but also as intuitive to use as our beloved glass slabs.

And they'll need a UI that people find genuinely easy to use. The problem? There's no touchscreen on a pair of smart glasses. In the future, smart glasses could adopt hand and eye tracking input similar to the Apple Vision Pro's, but right now, that tech is nowhere near viable in such a small form factor.

For now, we have one thing: voice assistants. Meta is already tapping that potential; its newest Ray-Ban update comes pre-loaded with Meta AI. I got a brief chance to use Meta AI at a recent event, and though it's still very much a work in progress, I can see the appeal.

Meta Orion smart glasses (Image credit: Meta)

The infusion of large language models (LLMs) promises to make voice assistants far savvier. By nature, LLMs are better at understanding natural-language prompts, and they can execute more complex, multi-step commands.

That means, for example, you can put on your Ray-Ban smart glasses, look at a recipe written in French, and say, "Hey Meta, translate this recipe for me." I actually did this in Meta's demo space.

The results were maybe not perfect — I had to really focus my vision on the card, and the AI didn't quite hear my prompt in a loud room — but with some tinkering, we got there.
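To picture what an LLM buys you in that demo, here's a minimal sketch of the idea in Python. Everything in it is hypothetical; Meta hasn't published how its assistant works. The prompt, the Step structure, and the call_llm stub (which stands in for a real model call so the example runs on its own) are all made up for illustration. The point is how one spoken request becomes an ordered list of device actions.

import json
from dataclasses import dataclass

# Hypothetical system prompt; a real assistant's would be far more involved.
SYSTEM_PROMPT = (
    "You are a voice assistant on a pair of smart glasses. "
    "Turn the user's request into an ordered JSON list of steps, "
    'each shaped like {"action": ..., "args": {...}}.'
)

@dataclass
class Step:
    action: str  # e.g., "capture_photo"
    args: dict   # action-specific parameters

def call_llm(system: str, user: str) -> str:
    # Stand-in for any chat-style LLM call; returns a canned plan
    # so this sketch runs offline.
    return json.dumps([
        {"action": "capture_photo", "args": {}},
        {"action": "translate_text", "args": {"target_lang": "en"}},
        {"action": "read_aloud", "args": {}},
    ])

def plan(utterance: str) -> list[Step]:
    # One natural-language request in, a multi-step plan out.
    raw = call_llm(SYSTEM_PROMPT, utterance)
    return [Step(**step) for step in json.loads(raw)]

for step in plan("Hey Meta, translate this recipe for me"):
    print(step)

That single utterance fans out into three device actions, which is exactly the kind of multi-step behavior older keyword-matching assistants struggled with.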

With a more capable, AI-supercharged voice assistant, smart glasses might not immediately need a more fine-tuned, complex UI. Want to launch music? Take a picture? Set a reminder or an alarm? Just shout. Meta AI is already promising to go beyond the banal alarm-setting and app-launching of our current crop of voice assistants.

For instance, Meta AI can identify objects in your field of view and give you information about them. That means you could have your "what are those?!" moment with a pair of shoes or maybe look at a piece of art and have your glasses give you more background on who made it.

It's still early days, but this kind of connection between computer vision and the web's never-ending stream of information feels genuinely novel but also, with refinement, actually practical.
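The object-identification side can be sketched the same way. Again, every name below is invented for illustration; Meta hasn't detailed its pipeline. The idea is simply a camera frame plus a question going into a multimodal model, with an answer coming back that a real system could enrich with a web lookup.

from dataclasses import dataclass

@dataclass
class Frame:
    pixels: bytes  # a raw camera capture from the glasses

def vision_language_query(frame: Frame, question: str) -> str:
    # Stand-in for a multimodal (image + text) model call. A real
    # system could follow up with a web search for fresh details:
    # artist bios, where to buy those shoes, and so on.
    return "That looks like 'The Starry Night' by Vincent van Gogh (1889)."

frame = Frame(pixels=b"...")  # whatever the wearer is looking at
print(vision_language_query(frame, "Who made this painting?"))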

The future starts now

There's a lot to do between now and our Orion future, but Meta AI will clearly be a stepping stone.

In the same way that Meta AI promises to simplify UI on headsets like the Quest, it also stands to pave the way for early-stage smart glasses like the Meta Ray-Bans.

I'm not sold on the usefulness of LLMs across the board, but if there's one thing they're exceptional at, it's understanding us. It's taken a long time for voice assistants to mature, but with LLMs, we could finally get there.

Both Amazon and Apple have promised far more functional versions of their respective Alexa and Siri voice assistants, and Meta AI is already wielding some of that LLM firepower.

AI might not open the floodgates for smart glasses, but it might just convince us that they're legitimately worth our investment.

James Pero
Senior News Editor

James is Senior News Editor for Laptop Mag. He previously covered technology at Inverse and Input. He's written about everything from AI to phones to electric mobility, and he likes to make unlistenable rock music with GarageBand in his downtime. Outside of work, you can find him roving New York City on a never-ending quest to find the cheapest dive bar.