AI smartglasses shipments reached 8.7 million units in 2025, a 322% year-over-year increase, according to market research firm Omdia. The device leading that surge isn't from Meta or Snap. It's the Even Realities G2, a $600 Tencent-backed pair of smartglasses that ships without a camera.
That’s not a missing feature. It’s the product.
The Numbers Behind the 322% Surge
Omdia’s 2025 wearables report places AI smartglasses as the fastest-growing segment in consumer electronics by growth rate, outpacing both smartwatches and wireless earbuds. The 8.7 million unit figure represents a category that barely registered at scale in 2024, when total shipments sat near 2 million units.
The catalyst isn't hardware novelty; it's the maturation of on-device AI, specifically voice processing and real-time inference running on sub-5-watt chipsets. Qualcomm's Snapdragon AR1 Gen 1 platform, deployed across multiple devices in the category, handles wake-word detection and local inference with latency below 300 milliseconds for most queries.
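To make that latency budget concrete, here is a minimal, purely illustrative sketch of a wake-word-gated query loop. The function names (`detect_wake_word`, `run_local_inference`) are stand-ins, not any vendor's SDK; the point is the shape of the flow: stay in low-power listening until the wake word fires, then run local inference against a roughly 300 ms ceiling.

```python
# Illustrative sketch only: a wake-word-gated query loop with a latency budget.
# detect_wake_word and run_local_inference are hypothetical placeholders, not a real SDK.
import time

LATENCY_BUDGET_MS = 300  # target ceiling for most on-device queries


def detect_wake_word(audio_frame: bytes) -> bool:
    """Placeholder for an always-on, low-power wake-word detector."""
    return audio_frame.startswith(b"WAKE")


def run_local_inference(audio_frame: bytes) -> str:
    """Placeholder for on-device speech/intent inference."""
    return "set a timer for ten minutes"


def handle_frame(audio_frame: bytes) -> str | None:
    if not detect_wake_word(audio_frame):
        return None  # remain in low-power listening mode
    start = time.monotonic()
    result = run_local_inference(audio_frame)
    elapsed_ms = (time.monotonic() - start) * 1000
    if elapsed_ms > LATENCY_BUDGET_MS:
        # A real device might fall back to a smaller model or defer to the phone here.
        print(f"warning: query took {elapsed_ms:.0f} ms, over budget")
    return result


if __name__ == "__main__":
    print(handle_frame(b"WAKE set a timer"))
```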
Applying the Even Realities line's $600 average selling price to all 8.7 million units implies more than $5 billion in category revenue (8.7 million × $600 ≈ $5.2 billion), though actual selling prices vary across devices and Omdia hasn't released per-SKU revenue breakdowns.
Even Realities G2: The $600 Device Built Around What It Lacks
Even Realities, headquartered in Shenzhen with Tencent among its backers, launched the G2 in late 2024 and scaled through 2025 on a deliberately minimal spec sheet. The device includes:
- Dual directional microphones with active noise cancellation
- A monochrome micro-LED display projected into the right lens at approximately 640×200 pixels
- Bluetooth 5.3 and a companion iOS/Android app
- No camera, no lidar, no depth sensors
The display renders text overlays — translated speech, meeting transcriptions, navigation prompts, AI assistant responses — directly in the wearer’s field of view. This isn’t immersive augmented reality. It’s a heads-up display for ambient intelligence, and the distinction matters for battery and weight.
The G2 weighs 38 grams — lighter than most prescription eyewear — and delivers 8 hours of active-use battery life. The AI backend processes real-time translation across 40 languages and live transcription through a cloud connection, with offline wake-word detection running locally.
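The local/cloud split described above can be sketched in a few lines. This is a minimal illustration under assumptions, not Even Realities' actual software: wake-word detection is gated on-device, the translation call is stubbed out to stand in for the cloud backend reached through the paired phone, and the result is pushed to the heads-up display as a plain text overlay.

```python
# A minimal sketch, assuming a local wake-word gate plus a cloud translation backend.
# HudOverlay, cloud_translate, and render_to_hud are illustrative stand-ins only.
from dataclasses import dataclass


@dataclass
class HudOverlay:
    text: str
    duration_s: float = 5.0


def wake_word_detected(audio: bytes) -> bool:
    """On-device, offline gate; placeholder logic."""
    return len(audio) > 0


def cloud_translate(audio: bytes, target_lang: str) -> str:
    """Stand-in for the cloud translation service reached via the companion app."""
    return f"[{target_lang}] translated text"


def render_to_hud(overlay: HudOverlay) -> None:
    """Stand-in for drawing a text overlay on the monochrome micro-LED display."""
    print(f"HUD ({overlay.duration_s:.0f}s): {overlay.text}")


def translate_utterance(audio: bytes, target_lang: str = "en") -> None:
    if not wake_word_detected(audio):
        return
    render_to_hud(HudOverlay(text=cloud_translate(audio, target_lang)))


if __name__ == "__main__":
    translate_utterance(b"\x01\x02", target_lang="en")
```

The design choice the sketch highlights is that only the cheap, always-on gate runs on the glasses; everything heavier rides on the phone's connectivity, which is what keeps the frame at 38 grams and 8 hours of battery.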
What AI Smartglasses Actually Do Without a Camera
Camera-free AI smartglasses operate on an audio-first paradigm: translation, transcription, voice queries, calendar management, and ambient sound classification. These aren't degraded versions of camera-enabled features; they solve a different set of daily friction points.
Live translation across 40 languages with sub-500ms latency is more immediately useful in a business meeting than a camera feed. Real-time transcription overlaid in your field of view during a lecture outperforms photographing a whiteboard. The G2's design posits that most users' daily AI needs are conversational, not visual.
Even Realities’ internal data, shared with Omdia as part of the market report, shows 73% of active G2 users cite translation as their primary use case — above assistant queries (18%) and navigation prompts (9%). That distribution validates the no-camera thesis on its own terms.
Meta Ray-Bans vs. Snap Spectacles vs. Even Realities G2
| Device | Price | Camera | Display | Primary AI Feature | Battery |
|---|---|---|---|---|---|
| Even Realities G2 | $600 | None | Micro-LED HUD | Translation, transcription | 8 hours |
| Meta Ray-Ban Smart Glasses (Gen 2) | $299–$379 | 12MP | None | Visual AI, Meta AI assistant | 4 hours |
| Snap Spectacles (5th Gen) | $99/month | Dual camera | Full AR waveguide | AR overlays, Lens Studio | 45 minutes |
Meta’s Ray-Ban Smart Glasses have moved over 1 million units since their 2023 launch, but face a persistent ceiling: the 12-megapixel camera draws power aggressively, and Meta AI’s visual query pipeline requires cloud round-trips that add meaningful latency. The Ray-Bans are cheaper and carry a recognizable brand — but they are not driving the 322% growth story.
Snap Spectacles remain a developer-tier product. At $99 per month, they target enterprise AR and creator tools rather than mainstream daily carry. A 45-minute active battery life disqualifies the device from most real-world workflows before the conversation about features even begins.
Why the Privacy-First Approach Is Winning
The privacy argument has graduated from a marketing talking point to a measurable purchase driver. A 2025 Pew Research Center survey found 61% of U.S. adults said they would be more likely to adopt a wearable AI device with no camera. That figure climbs to 74% among adults aged 35–54 — the demographic with the spending power to buy a $600 peripheral.
Enterprise adoption compounds this effect. Introducing Meta Ray-Bans into a law firm, hospital, or financial institution requires IT policy exceptions, legal review, and often explicit employee consent protocols for recording. The Even Realities G2 requires none of that. At organizational scale, friction differentials compound into purchasing decisions.
The broader consumer AI shift toward ambient, low-friction intelligence — visible across product categories from AI-native weather applications to autonomous discovery agents — points in the same direction: intelligence that operates quietly in the background without demanding attention or raising surveillance concerns.
The growing consumer traction of the Humans First movement reflects real, documented anxiety about AI-embedded cameras in everyday objects. Even Realities is — perhaps unintentionally — selling directly into that sentiment with a $600 device that asks nothing of its environment.
What the 322% Figure Doesn’t Capture
Omdia’s shipment data counts units sold into retail channels, not active daily-use devices. Return rates for first-generation AI wearables have historically been high — Snap’s original 2016 Spectacles saw approximately 50% return rates in the months following launch. Even Realities has not published equivalent return data for the G2.
The category also remains structurally dependent on smartphone tethering. The G2 requires a live Bluetooth connection to process most AI queries through its cloud backend. Fully standalone AI glasses — running on-device large language models without cloud dependency — don’t yet exist at consumer price points. Qualcomm, MediaTek, and Apple’s silicon division are all working toward that threshold, but no credible launch timeline has been announced below $1,000.
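A hypothetical sketch of that tethering dependency: the glasses can still gate on a local wake word offline, but any AI query has to be relayed over Bluetooth to the phone's cloud connection, and everything degrades when that link drops. The class and method names (`PhoneLink`, `send_via_phone`) are illustrative only.

```python
# Hypothetical sketch of smartphone tethering: cloud AI is only reachable
# through the paired phone, so a lost Bluetooth link means graceful degradation.
class PhoneLink:
    """Stand-in for the Bluetooth LE link to the companion app."""

    def __init__(self, connected: bool):
        self.connected = connected

    def send_via_phone(self, query: str) -> str:
        if not self.connected:
            raise ConnectionError("no tethered phone: cloud AI unavailable")
        # A real product would round-trip through the companion app's backend here.
        return f"cloud answer to: {query!r}"


def answer_query(query: str, link: PhoneLink) -> str:
    try:
        return link.send_via_phone(query)
    except ConnectionError:
        # Without a standalone on-device LLM, the glasses can only fall back to local features.
        return "offline: only wake-word detection and local prompts available"


if __name__ == "__main__":
    print(answer_query("translate this sentence", PhoneLink(connected=True)))
    print(answer_query("translate this sentence", PhoneLink(connected=False)))
```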
MegaOne AI tracks 139+ AI tools across 17 categories, and wearable AI is currently the fastest-moving segment by new product launches per quarter. The G2 won’t hold its position unchallenged. Meta’s next-generation Ray-Bans are widely expected to incorporate a display module, and the rapidly advancing audio AI stack continues lowering the cost of the voice-interface infrastructure that makes camera-free glasses viable in the first place.
The Market Signal That Actually Matters
The G2’s success reframes the product category. AI smartglasses grew 322% in 2025 not because consumers suddenly wanted more sensors on their faces — but because even minimal AI utility, delivered hands-free with all-day battery life, is genuinely useful. Even Realities demonstrated that removing the camera isn’t a compromise position. For the majority of users, it’s the reason to buy.
The next inflection point in this market won’t be defined by who adds the most sensors. It will be defined by who makes the experience least intrusive — and that race has already started with a device that ships with deliberate, profitable restraint.