Inside Google’s Project Aura: Specs, Use Cases & How It Works with Android XR

Google recently announced its long-awaited return to consumer AR. Set to launch in 2026, Project Aura is the next-generation XR glasses developed by Google in partnership with XREAL.


Google’s Android XR and Gemini AI, combined with XREAL’s optical expertise and X1S chip, make Project Aura the most ambitious attempt yet to shrink headset-class XR immersion into a standard glasses form factor.

It marks a major shift in how Google approaches spatial computing, mobility and hands-free AI. Here’s everything we know about Aura’s specifications, who it might be for and how it will work with Android XR.

What Is Project Aura?

Unlike the Galaxy XR headset, Project Aura is Google’s first flagship pair of Android XR glasses, created with XREAL. They are designed to deliver full spatial apps, room-scale tracking, hand-tracking input, PC casting and on-device Gemini AI.

All of this comes in a smart-glasses form factor tethered to a small compute puck, portable and easy to wear for hours on end. They are the first smart glasses to run full Android XR apps, bringing spatial computing into an everyday form.

Specifications: What We Know So Far

| Specification | Project Aura |
| --- | --- |
| Display | Micro-OLED, high brightness, high resolution |
| Field of View | 70+ degree optical FOV |
| Compute | Snapdragon XR2 Gen 2 Plus (in puck) + XREAL X1S (in glasses) |
| Tracking | 3-camera array: room tracking + hand tracking |
| Input | Hand gestures, touchpad puck, voice (Gemini) |
| Connectivity | USB-C tether + wireless PC connect |
| Platform | Android XR (full app support) |
| AI | Gemini integration, AR Circle to Search, object recognition |
| Compatibility | Works as display glasses with phones, tablets and PCs |

Peak Innovation: XREAL Hardware + Google Intelligence

Image Courtesy: XREAL

Project Aura combines two pillars: XREAL hardware and Google intelligence. The 70-degree-plus optical field of view is the largest XREAL has ever shipped. A custom XREAL X1S spatial computing chip handles camera fusion, hand tracking and low-latency rendering.

A three-camera array gives the glasses spatial understanding and enables gesture tracking. Aura also has a micro-OLED display with higher brightness and resolution than any previous XREAL model, and it can display three windows (possibly more) for multitasking.

Google’s Gemini adds contextual commands and visual understanding. A hand-pinching gesture triggers Circle to Search in AR, and Android XR enables native spatial apps, not just widgets or overlays.

Together, these allow Aura to blend your physical surroundings with floating digital windows. Envision recipes in your kitchen, emails over your desk, or a movie screen above your bed.

How Project Aura Works

Aura uses a split design with two parts: the glasses and the Puck.

The glasses contain sensors, a display and the X1S chip. They are also ultra-lightweight for comfort and support prescription inserts, just like other XREAL glasses. They also provide spatial mapping, hand tracking and AR overlays.

The Puck houses the main Snapdragon XR2 Gen 2 Plus processor, the same chip used in the Galaxy XR. It also contains the battery and network connectivity, tethers to the glasses via a thin cable for low latency and doubles as a touchpad or pointing device.

This design keeps the glasses light while giving them desktop-class power. However, squeezing a headset-class XR experience into a smart-glasses form factor powered by a small puck raises questions about battery life. Let’s hope that isn’t an issue.

Real-World Use Cases: Where Aura Shines

  • Spatial productivity

Aura casts your Windows PC screen wirelessly (or wired) and arranges multiple floating displays. You can use hand gestures as a pointer to open apps, resize windows, or even type. This should not only boost productivity but also make for a great experience.

  • Hands-free AI assistant

With the Aura, you can circle a real-world object with your fingers to trigger a visual search using Google’s Gemini. You can also ask Gemini to summarise documents, pull up maps, or guide you through tasks. Aura can also be used alongside your phone or a watch for seamless continuity.

  • Gaming and immersive entertainment

Imagine watching movies on a virtual theatre-sized display or playing room-scale AR games like Demeo. This will be possible with the Aura glasses, alongside using hand tracking for interaction without a controller.

  • On-the-go device

While Aura isn’t meant to be all-day replacement eyewear yet, it is designed to fold away like regular glasses. You can pop them out for a quick work session on flights, in cafes or in hotel rooms. Best of all, the device replaces a bulky VR headset with something portable and usable for everyday users.

How Project Aura Integrates with Android XR

Aura can run Android XR’s native spatial apps such as Google Maps, Google Meet and YouTube. It can also run Gemini apps and 3D spatial productivity apps.

Aura also has cross-device compatibility because it integrates deeply with Android phones, Wear OS watches (for gesture control, notifications and camera previews) and with PC via wireless casting.

This project also aims to put AI everywhere. Google’s Gemini helps with voice commands, visual recognition, AR search, contextual overlays, and multimodal reasoning.

Transparency, Optics and Gesture Control

Google claims “optical transparency”, but Aura is not fully transparent like normal sunglasses; it uses top-down projection via waveguides, similar to the XREAL One Pro. We can therefore expect better clarity and brightness than previous XREAL models, but not the see-through look of optical-only lenses like the Meta Orion prototype.

Project Aura also carries XREAL’s three-camera array, which enables full hand tracking, room mapping, the Circle to Search gesture, virtual button presses and manual interaction. While Google has not yet indicated eye-tracking support for the glasses, with a project this big, it is not far-fetched.

Compatibility: Can the Aura Puck Work with XREAL One / One Pro?

One of the biggest questions I have for Project Aura is whether the Aura Puck will work with the XREAL One or One Pro for those of us who already own those glasses.

Google and XREAL have not confirmed cross-generation compatibility, but technically the puck could provide compute, battery, touchpad input and Android OS support. The One and One Pro already work as displays when paired with a PC, though they lack the additional cameras needed for full XR tracking. The puck could therefore power them as a monitor: essentially a mini Android console.

However, these models cannot perform fully like Aura because they lack the camera sensors. While the XREAL Air 2 Ultra has two cameras, it still falls short of Aura’s three-camera tracking setup. So, will the Puck be sold separately and be compatible with other XREAL devices? We don’t know yet, but here’s hoping there’s a way around it.

With the XREAL One Pro glasses being a fairly recent launch, a trade-in program would help accelerate sales and adoption for Project Aura.

The Launch in 2026

Project Aura is set to launch in 2026, but an exact date hasn’t been revealed yet.

For developers, tools are available now and development kits will ship next year. Aura is positioned to become the reference device for spatial apps as Android XR heads toward mainstream use.

Wrap Up

If Google succeeds, Aura may just be able to bridge the gap between bulky headsets and all-day smart glasses. 

Aura is the first real attempt at AR that is both functional and portable enough for daily use. With a large 70-degree-plus field of view, Gemini integration, hand tracking, Android XR and PC casting, it is shaping up to be one of the most sought-after wearables of 2026.
