Teased at Google I/O, Project Aura is a collaboration between Xreal and Google. It’s the second Android XR device (the first being Samsung’s Galaxy XR headset) and is expected to launch in 2026. But the more time I spent with it, the more I found that the term “smart glasses” doesn’t quite fit.
Is it a headset? Smart glasses? Both? Those were the questions I asked when I got my hands on Project Aura at a recent demo. It looks like a pair of chunky sunglasses, except a cord on the left side leads down to a battery pack that doubles as a trackpad. When I asked, Google reps told me they consider it a headset masquerading as glasses. There’s a term for that, too: wired XR glasses.
I connect wirelessly to a laptop and create a large virtual desktop in my space, with a 70-degree field of view. My first task is to launch Lightroom on the virtual desktop while opening YouTube in another window. I play a 3D tabletop game where I can pinch and drag the board to zoom in and out. I spot a painting on the wall and use Circle to Search on it; Gemini tells me the name of the artwork and the artist.
I’ve done all of this before on the Vision Pro and the Galaxy XR. This time, though, my head isn’t stuffed inside a huge headset. If I wore Project Aura in public, most people wouldn’t notice. But this isn’t augmented reality, which overlays digital information on the real world. It’s closer to using the Galaxy XR, where apps float in front of you and around you.
A Google representative told me that everything I tried was originally developed for the Galaxy XR. No apps, features, or experiences had to be redesigned for Project Aura’s form factor. That’s huge.
XR has a major app problem. Take the Meta Ray-Ban Display and the Vision Pro: both launched with few third-party apps, giving users little reason to wear them. Developers, meanwhile, have to choose which of these gadgets to invest in building apps for. That leaves little room for smaller companies to compete or experiment with big ideas.
That’s what makes Android XR interesting. Smaller players such as Xreal get access to apps developed for Samsung’s headset. Android apps will also work on the AI glasses launching next year from Warby Parker and Gentle Monster.
“I think it’s probably the best thing for all developers. You don’t see silos anymore. And I’m sure there will be more and more devices coming. That’s the whole point of Android XR,” says Xreal CEO Chi Xu.
Slipping on Google’s latest prototype AI glasses, I’m treated to an Uber demo in which a fictional version of me takes a ride from JFK Airport. A representative hails an Uber on a phone, and an Uber widget pops up on the glasses’ display showing the estimated time, my driver’s license plate, and the car model. If I look down, a map of the airport appears with real-time directions to the pickup zone.
All of this is powered by Uber’s existing Android app, meaning Uber didn’t have to code an Android XR app from scratch. In theory, users could just connect the glasses and start using the apps they already have.
When I ask Gemini to play some music, the YouTube Music widget pops up, displaying the title of a funky jazz mix and media controls. This, too, is just the YouTube Music app that already runs on Android phones.
Next, I’m prompted to ask Gemini to take a picture with the glasses. A preview appears on the display and on a paired Pixel Watch. The idea is that smartwatch integration gives users more options: say someone wants audio-only glasses with a camera. They can still take a photo and see how it looks on their wrist. This will work on any compatible Wear OS watch.

I also try live translation, where the glasses detect the language the speaker is using. I make a Google Meet video call. I have Nano Banana Pro add pop elements to another photo I took. I try a second prototype with a display in both lenses, which enables a larger field of view. (Those aren’t the ones coming out next year.) I watch a 3D YouTube video.
It’s all impressive. I’ve heard the spiel that Gemini is the real killer app. But my jaw truly drops when I’m told that next year’s Android XR glasses will support iOS.
“The goal is to give as many people as possible the multimodal Gemini in their glasses. If you’re an iPhone user and you have the Gemini app on your phone, you’ll get the full Gemini experience right there,” says Justin Payne, Google’s director of product management for XR.
Payne notes that this will be broadly true of Google’s iOS apps, such as Google Maps and YouTube Music. On iOS, the limitations will mostly involve third-party apps. Even so, Payne says the Android XR team is working to close that gap. At a time when wearable ecosystem lock-in is at an all-time high, that’s a breath of fresh air.
Google’s use of its existing Android ecosystem is a surprising move that could give Android XR an edge over Meta, which locks down its hardware and has only just opened its APIs to developers. It also puts pressure on Apple, which has lagged behind on both the AI and glasses fronts. Making things interoperable across device form factors? Frankly, it’s the only way a first-wave device like Project Aura has a shot.
“I know we can make these glasses smaller and smaller in the future, but we don’t have that ecosystem,” says Xu, Xreal’s CEO. “There are only two companies in the world right now that can really have an ecosystem: Apple and Google. Apple, they won’t work with others. Google is the only option for us.”
Google is also trying to avoid past mistakes. It’s deliberately partnering with other companies on hardware, a clean break from the original Google Glass playbook. It has apps lined up pre-launch. And its prototypes explore multiple form factors: audio-only, and displays in one or both lenses.
Payne doesn’t dodge when I ask the big cultural question: how do you discourage Glassholes?
“If anything is being recorded, there’s a very bright, pulsing light. So if the sensor is on to save anything, it’ll tell everyone around,” Payne says. That includes queries to Gemini and any feature that uses the camera. The on/off switch will have clear red and green markings, so users can prove they aren’t lying when they say the glasses aren’t recording. Android and Gemini’s existing permissions frameworks, privacy policies, encryption, data retention, and security guarantees will also apply, Payne says.
“There will be a whole process to prevent certain things, so we can avoid some of what might happen if someone decides to misuse the camera,” Payne says.
On paper, Google is making smart moves that address many of the challenges inherent in this space. It all sounds good, but it’s easier said than done, and these glasses haven’t launched yet. A lot can change between now and then.


