Meta's 2026 smart glasses will reportedly boost Live AI and identify faces

A Meta AI logo on a laptop with a pair of Ray-Ban Meta Smart Glasses in front
(Image credit: Nicholas Sutrich / Android Central)

What you need to know

  • A report claims Meta is pushing ahead with its planned 2026 smart glasses, which could upgrade Live AI to run for much longer sessions.
  • The company is also reportedly exploring "super sensing," a suite of AI features that could include facial recognition software.
  • A previous report detailed Meta's plans for its Hypernova glasses, which are reportedly set to debut in 2025 with a single display in the right lens.

New reports claim Meta's next planned pair of smart glasses could take its AI, including facial recognition, to new heights.

Word of what Meta's next smart glasses could provide comes from The Information (via UploadVR), which states there's a wealth of AI behind the lenses. According to the report, Meta is pursuing an upgraded version of its "Live AI" for its 2026 smart glasses, one designed to "see what its user does and responds in real-time, for hours."

The idea is to offer quick, hands-free help without the user needing to summon the AI with a voice prompt first. However, reports add that Meta is also exploring facial recognition software for its next glasses.

That feature reportedly falls under Meta's internal "super sensing" codename. The report cites sources close to the matter who claim the company wants super sensing to boost the "popularity" of its products. The catch is that super sensing would keep the glasses' cameras and sensors active at all times.

What's more, sources allege the AI would "remember" everything it saw through the wearer's perspective.

The AI could even recognize certain important items like, say, your car or home keys. If that sounds familiar, it's because Google demoed a similar feature for Android XR called "memory." The folks at UploadVR speculate that the facial recognition could let the AI surface the name of someone you're speaking to, likely drawing on information the glasses have already stored from the user.

Hypernova & Beyond

Special edition transparent Ray-Ban Meta Smart Glasses with their black leather charging case

(Image credit: Nicholas Sutrich / Android Central)

There are, of course, privacy concerns with an AI that's always active, watching what you do in your daily life. The report suggests super sensing will be opt-in, something users can switch on and off freely. Additionally, these features are reportedly geared toward Meta's 2026 smart glasses, codenamed Aperol and Bellini.

Meta already debuted Live AI in April. The feature is on its way to its current Ray-Ban smart glasses, bringing a more conversational experience. In short, Live AI lets users ask about anything or hold "in-depth conversations" with Meta AI without the vocal prompt. The publication's report notes that the feature only lasts for 30 minutes at a time on the current crop of glasses.

As previously stated, Meta reportedly wants to extend those sessions to a few "hours."

After this reported glimpse at 2026, it's worth circling back to 2025, as another report may have revealed Meta's plans for its Hypernova glasses. Hypernova is said to deliver a single display in the bottom-right corner of the right lens. The glasses are reportedly set to feature a full suite of Android-powered computing hardware alongside a gesture-control wristband.

The band is said to make it easier to manage what's shown on that single display.

Nickolas Diaz
News Writer

Nickolas is always excited about tech and getting his hands on it. Writing for him can vary from delivering the latest tech story to scribbling in his journal. When Nickolas isn't hitting a story, he's often grinding away at a game or chilling with a book in his hand.
