The biggest revelation of Google I/O was that the company is officially back in the mixed reality game with its prototype XR smart glasses. It's been years since we've seen much from the search giant on the AR/VR/XR front, but with the help of hardware partners, that's finally changing.
After the keynote, Google gave a very short demo of the prototype device we saw onstage. I only got a few minutes with it, so my impressions are unfortunately limited, but I was immediately struck by how the glasses compared with Meta's Orion prototype and Snap's augmented reality Spectacles. While both of those are fairly bulky, Google's prototype was lightweight and felt much like a normal pair of glasses. The frames were a bit thicker than what I typically wear, but not by much.
Karissa Bell for Engadget
At the same time, there are some notable differences between Google's XR glasses and what we've seen from Meta and Snap. Google's device has a display on one side only – the right lens, which you can see in the image at the top of this article – so the visuals are more "glanceable" than fully immersive. I noted during Google's onstage demo at I/O that the field of view looked narrow, and I can confirm it felt far more limited than Snap's 46-degree field of view (Google declined to share details on exactly how wide its prototype's field of view is).
Instead, the display felt a bit like how you might use the front display of a foldable phone. You can take a quick glance at small bits of information like the time and your notifications, as well as snippets from your apps, such as what music you're listening to.
Obviously, Gemini is meant to play a major role in the Android XR ecosystem, and Google walked me through a few demos of the assistant working on the smart glasses. I could look at books on a wall, or at a piece of art, and ask Gemini what I was seeing. It felt very much like the multimodal capabilities we've seen demonstrated elsewhere.
There were some bugs, though, even in the carefully orchestrated demo. Gemini started telling me about what I was looking at before I had finished my question, which led to an awkward moment where we both paused and talked over each other.
One of the more interesting use cases Google showed me was Google Maps in the glasses. You can get a heads-up view of your next turn, much like Google's walking directions, and glance down to see a small section of the map on the floor. However, when I asked Gemini how long it would take to walk from my location to San Francisco, it wasn't able to provide an answer. (It said something like "tool output," and my demo ended very abruptly after that.)
Like many of the other mixed reality demos I've seen, it's clearly still very early days. Google was careful to emphasize that this is prototype hardware meant to show off what Android XR is capable of, not a device it plans to sell anytime soon. So any eventual smart glasses from Google or the company's hardware partners could look very different. What Android XR was able to show me in a few minutes, though, was how Google is thinking about bringing AI and mixed reality together. That's not so different from Meta, which also sees smart glasses as key to long-term adoption of its AI assistant. But now that Gemini is coming to just about every Google product that exists, the company has a pretty solid foundation for pulling that off.
Developing...