
Pushing edge AI, Google partners with XREAL to launch Project Aura smart glasses next year

Project Aura deeply integrates Gemini AI, supporting real-time translation, visual search, and other functions. It runs on Qualcomm's Snapdragon XR2+ Gen 2 chip and carries three cameras and a Micro OLED display. Analysts believe the combination of edge AI and XR devices could push the industry into a period of rapid, concentrated growth.
Google is mounting a new push into smart glasses through the deep integration of the Android XR platform and Gemini AI.
On December 9th, Beijing time, Google officially released the Android XR platform, marking a new phase of "AI + spatial computing" in the AR industry.
At the launch event, Google showcased Project Aura, developed in collaboration with Chinese company XREAL. It is the world's first consumer-grade AR glasses product built on the Android XR platform and is planned for release in 2026, at a price expected to undercut both Apple's Vision Pro and Samsung's Galaxy XR headset.
Project Aura marks an important move for Google in edge AI. With Gemini AI integrated into the Android XR platform, users get context-aware conversations, device control, real-time translation, and visual search.
Analysts believe the combination of edge AI and XR devices could push the industry into a period of rapid, concentrated growth.
Project Aura, a core product of the edge strategy
Project Aura is one of the strategic core products of Google's Android XR platform, with hardware development handled by XREAL.
The device uses XREAL's prism-lens optics and looks like a pair of thick sunglasses. It can operate without a smartphone but requires an external battery pack.

The device is powered by Qualcomm's Snapdragon XR2+ Gen 2 chipset, the same processor used in the Samsung Galaxy XR headset. This means Project Aura can run almost all of the Android XR applications that Galaxy XR supports.
During the demonstration, testers successfully ran the VR game Demeo and controlled it through gesture tracking, with game cards even "growing" from their hands.
Project Aura is equipped with three cameras, supporting full-room tracking and gesture recognition, and can capture photos and videos for Gemini AI recognition.
The display system uses Micro OLED technology, with resolution and size exceeding XREAL's existing product line. Its 70-degree field of view, though narrower than that of traditional VR headsets, is sufficient for an immersive experience.
XREAL CEO Chi Xu called Project Aura a "stepping stone" toward fully wireless glasses. He said:
We are not trying to solve the problem of all-day wear, but you will find it completely usable for several hours.
He revealed that Google is developing standalone all-day glasses, but Project Aura must first refine the experience of working in tandem with a smartphone.
Gemini AI builds the intelligent core at the edge
Google's differentiation advantage lies in the deep integration of Gemini AI into the Android XR platform.
In the Project Aura demonstration, users could invoke Google's Circle to Search by drawing a circle in the air with a finger around a lamp in the room, instantly receiving purchase links and search results. This feature was previously available on the Galaxy XR headset and has now been extended to the glasses form factor.
Project Aura also supports new features for wireless connection to PCs.
Through the PC Connect application, users can project their Windows screen in front of them and click using gesture tracking, something Apple's Vision Pro still cannot do when connected to a Mac. Users can also open multiple Android XR application windows simultaneously, such as YouTube videos.
This edge AI capability is key to Google's bet.
According to Juston Payne, Google's Director of XR Product Management, Gemini on Android phones can access the full operating system, enabling deeper integration than on the iPhone, much as Pixel earbuds and Android watches do.
Industry analysts believe that the launch of Google's Android XR platform is expected to drive the AR industry into a phase of "ecological standardization" development. By attracting hardware partners and developers through an open platform, Google aims to replicate Android's successful path in the smartphone market within the smart glasses sector.
Fashionable Smart Glasses: Targeting Meta Ray-Ban
In addition to Project Aura, Google also showcased fashionable smart glasses developed in collaboration with Warby Parker and Gentle Monster, set to be launched in phases starting in 2026. These glasses are positioned for everyday wear, directly competing with the products resulting from Meta's collaboration with Ray-Ban.
These glasses are available in two versions: with and without a display.
The display version uses Micro LED color display technology manufactured by Raxium, capable of showing rich notifications and application information pushed from Android phones.
In the demonstration, testers received navigation guidance through Google Maps, seeing turn prompts when looking up and a map that rotates with their head when looking down. The Uber demonstration showed that users could see ride information in the glasses and look down to check the walking route to the pickup point.
The glasses also support YouTube video playback and Google Meet video calls. Although the display window is small and semi-transparent, it is sufficient for quickly viewing social videos or receiving prompts when looking up.
Google promises that these glasses will support a wide range of prescription strengths, exceeding the limited prescription support of the Meta Ray-Ban Display glasses launched this fall. Juston Payne emphasized:
To put it simply, we take very seriously the fact that these are, first and foremost, glasses.
For the version without a display, users can view photos on their wrist after taking them. When conversing with Gemini, visual responses can also be seen on the watch screen.
