Reports indicate that Google is accelerating development of an all-new AI glasses initiative and has already assembled a complete supply chain: hardware manufacturing will reportedly be handled by Foxconn, the reference design draws on Samsung's work, and the core chipset is built on a Qualcomm architecture.
However, this project is not directly related to the “Project Aura” smart glasses that Google showcased with Xreal at Google I/O 2025. The two efforts are advancing in parallel as independent initiatives. This new program was established late last year and has now entered the proof-of-concept and small-batch production testing stages. If progress continues smoothly, the earliest launch window is projected for the fourth quarter of 2026.
From a design standpoint, Google’s next-generation AI glasses are expected to adopt a waveguide optical system and include an integrated camera to support advanced visual-AI capabilities.
One of the project leads is rumored to be Michael Klug — formerly a core member of AR unicorn Magic Leap and now Platform Engineering Lead at Google Labs — a detail that fuels speculation about the optical sophistication of Google’s upcoming device.
Foxconn, meanwhile, announced late last year a partnership with UK-based semiconductor innovator Porotech to advance Micro LED technology for next-generation augmented-reality applications. That partnership has fueled expectations that Google's AI glasses will incorporate Porotech's display technology.
Google’s exploration of smart eyewear dates back to the debut of Google Glass in 2012. Though privacy concerns forced a transition from the consumer market to enterprise and industrial uses, the company never paused its core research. Today, Google is collaborating with Samsung and Qualcomm on the Android XR operating system, tightly integrating it with the Google Play Store to ensure a robust content ecosystem.
Its software advantage comes from the Gemini family of models. Last year's demonstration of the "Project Astra" AI agent, capable of visual reasoning, memory, and dialogue, showcased these abilities running on a smart-glasses form factor, highlighting the long-term potential of merging eyewear with ambient AI.
Though Google trails Meta’s Ray-Ban line in the consumer arena, the company’s renewed push is backed by a mature Android XR ecosystem, powerful Gemini AI capabilities, and years of hardware experience — making its return to AI eyewear look decidedly strategic.
Google is far from alone. Meta continues to strengthen its own smart-glasses lineup, while Alibaba and Xiaomi have recently introduced AI-enhanced eyewear of their own. Samsung, Apple, and others are also rumored to be preparing smart-glasses launches for 2026. Together, these efforts are poised to draw more brands into the space and accelerate the rise of “everyday ambient AI” through wearable devices.