More than ten years have passed since the original Google Glass, and Google is now officially giving smart glasses another try. The announcement was made during The Android Show: XR Edition, signaling Google’s renewed push into AI wearables, where it will compete with the likes of Apple and Meta.

Google has officially revealed Project Aura, a set of wired XR glasses being developed with smart-glasses startup XREAL and slated to launch in 2026. Alongside it, two new models of Google AI glasses are in the pipeline for 2026.
Development of Two Models
Google is taking a two-part approach to ensure there is a model for everyone. It has officially confirmed that it is working on two different types of its own smart glasses, with the XREAL-built Project Aura rounding out the lineup:
- Screen-Free Assistance Model: This version is designed to look and feel like a standard pair of glasses. It has no display, but it is packed with microphones, speakers, and cameras. It relies entirely on voice interaction and is built for users who want Gemini AI assistance and awareness of their surroundings without a screen getting in the way.
- Monocular XR Glasses: These are more advanced, featuring an in-lens display capable of showing visual overlays such as private turn-by-turn navigation or real-time translation captions directly in your line of sight.
- Project Aura: Finally, Google is working on “Project Aura,” which aims to offer hands-free interaction without the bulky XR headsets we are used to seeing. The initiative marks Google’s entry into the smart glasses market alongside XREAL, the China-based smart-glasses startup. These glasses are set to merge XR capabilities into a smart-glasses form factor for everyday use.
Powered by Gemini AI and Android XR
Both of these models run on Android XR, Google’s new operating system designed specifically for next-generation extended reality (XR) devices like headsets and smart glasses, and they use the same Qualcomm Snapdragon XR2 Plus Gen 2 chipset.
The platform ensures seamless cross-device support and app integration with the broader Android ecosystem.
The core experience is Gemini AI. Unlike previous voice assistants, Gemini is designed to provide multimodal help, meaning the glasses can see and hear the world much as we do. This allows them to interpret your surroundings and offer contextual help, like identifying a landmark you are looking at or summarizing a document sitting on your desk.
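Google has not published an SDK for the glasses themselves, but the kind of multimodal query described above can already be sketched against the public Gemini API. Below is a minimal illustration, assuming the google-genai Python SDK, a placeholder API key, and a saved camera frame; it is not the glasses’ actual on-device pipeline:

```python
# Minimal sketch of a multimodal "what am I looking at?" query using the
# public google-genai Python SDK. The model name, API key, and image file
# are placeholders; the glasses' real pipeline has not been published.
from google import genai
from google.genai import types

client = genai.Client(api_key="YOUR_API_KEY")

# Pretend this JPEG is a frame captured by the glasses' camera.
with open("camera_frame.jpg", "rb") as f:
    frame = f.read()

response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents=[
        types.Part.from_bytes(data=frame, mime_type="image/jpeg"),
        "What landmark am I looking at? Answer in one sentence.",
    ],
)
print(response.text)
```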
Partnership with Samsung and Fashion Brands
To avoid the techy look of the original Google Glass, Google is focusing heavily on design and fashion.
Google is utilizing its long-standing partnership with Samsung to support the hardware and the XR ecosystem.
These glasses are being co-developed with fashion eyewear brands Gentle Monster and Warby Parker. Google has a clear strategy to make smart glasses that people actually want to wear as everyday accessories, not just as gadgets.
Expected Features and Early Capabilities
With a 70-degree-plus field of view, these tethered glasses are designed for “practical everyday uses” like “following a floating recipe video while you cook or seeing step-by-step visual guides anchored to an appliance you are fixing”.
This high-end device looks like it might be the closest Android-based equivalent to an Apple Vision Pro. It can display or speak translations of conversations happening in front of you. It can use visuals or audio to guide you to your destination. You can even ask questions about what you are looking at (e.g., “What kind of flower is this?”).
While the features of the other two variants have not been disclosed, Project Aura gives us a good look at what to expect from Google’s smart glasses lineup in 2026.
Use Cases
Google is aiming for practical utility over flashy features. Here are some of the ways I feel we could use these Google glasses in our day-to-day lives:
- They can help you navigate with in-view AR directions. Whether you need guidance through an airport or object identification while sightseeing, the glasses keep your hands free.
- They can be used for work too, as the glasses aim to be a tool for getting things done. Floating notes can help technicians, creators, or even chefs keep to-do lists, step-by-step repair guides, or installation instructions in view without breaking their workflow.
- They can support education and learning: just ask “What am I looking at?” and the glasses give you a live explanation of tools, plants, or even landmarks, effectively creating an AI-powered learning experience.
- The Screen-Free Assistance Model could serve people who are blind or have low vision, relying on audio cues and a camera paired with Gemini AI to function as a descriptive companion (a rough sketch of this idea follows the list).
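Here is a minimal sketch of that screen-free assistance idea, assuming a companion app built on the public google-genai SDK and the pyttsx3 text-to-speech library; the actual on-device behavior of the glasses has not been disclosed:

```python
# Minimal sketch: describe a camera frame with Gemini, then speak the result.
# Assumes the public google-genai SDK and pyttsx3; file names, model name,
# and API key are placeholders, not the glasses' real software stack.
from google import genai
from google.genai import types
import pyttsx3

client = genai.Client(api_key="YOUR_API_KEY")


def describe_scene(jpeg_path: str) -> str:
    """Ask Gemini for a short description of an image."""
    with open(jpeg_path, "rb") as f:
        frame = f.read()
    response = client.models.generate_content(
        model="gemini-2.0-flash",
        contents=[
            types.Part.from_bytes(data=frame, mime_type="image/jpeg"),
            "Briefly describe this scene for someone who cannot see it.",
        ],
    )
    return response.text


def speak(text: str) -> None:
    """Read the description aloud through the system text-to-speech voice."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


speak(describe_scene("camera_frame.jpg"))
```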
Launch Timeline
Google has officially confirmed that Project Aura will be released in 2026 but has not disclosed an exact date.
With CES 2026 being held in January in Las Vegas, Nevada, there is a good chance Google could unveil its glasses there. Samsung, a key partner in the XR ecosystem, has used this stage for major announcements in the past.
Final Verdict
With two distinct models, an expanding XR ecosystem, and deep AI integration, Google’s 2026 smart glasses could mark the company’s most serious attempt yet to make wearable AI mainstream.
By partnering with established eyewear brands, Google is finally addressing the biggest hurdle of all: making smart glasses look good.