Meta’s ‘Hypernova’ AI smart specs will be an ‘experimental’ device

    By Trevor Mogg
Published August 26, 2025

Meta is highly likely to unveil its advanced “Hypernova” smart glasses at its annual Connect event, which kicks off on September 17.

But despite the hype surrounding the rumored product, the often-reliable analyst Ming-Chi Kuo suggested on Tuesday that the incoming smart specs will be an “experimental” device for Meta, with shipments of just 150,000 to 200,000 units expected in the two years following their launch, which should take place before the end of this year.

In a post on X, Kuo said that global shipments of smart glasses are forecast to hit around 14 million units next year, indicating that Hypernova’s market share will be negligible, thereby “positioning it more as an experimental product for Meta.”

The analyst said artificial intelligence (AI) will be Hypernova’s “most important selling point,” but added that the integration of AI with augmented reality (AR) applications “is still in its nascent stages” and therefore could hold the device back.

Kuo attributed Meta’s conservative shipment forecast for Hypernova mainly to the device’s limited AI capabilities, combined with its estimated price of around $800.

Citing another hurdle, the analyst said that the decision to use Liquid Crystal on Silicon (LCoS) microdisplay technology to achieve scalability for mass production “presents hardware design challenges related to form factor, brightness, response time, and battery life.”

Kuo concluded that Meta’s strategic goals for releasing Hypernova this year are: “1) To launch ahead of Apple to build its brand image, 2) To gain early experience in developing the ecosystem, and 3) To understand user behavior.”

Meta has already launched smart glasses in the form of the Ray-Ban Meta series. The Ray-Bans are primarily designed for capturing photos and videos, and also offer open-ear audio for listening to music and taking calls, but they lack a heads-up AR display.

The Hypernova glasses, on the other hand, are likely to feature a small monocular heads-up AR display for showing notifications and navigation, and interacting with Meta AI. They could also include a neural wristband to interpret subtle finger and wrist gestures for control.

From what we know, Meta’s new smart glasses look like an early but significant step toward fully immersive, AR-enabled, and AI-powered smart glasses.
