The race to put smart glasses on your face is heating up. Snap Spectacles are becoming "Specs," lighter and more powerful AR glasses set to arrive in 2026.
CEO Evan Spiegel surprise-announced the new Specs on stage at the XR event, promising smart glasses that are smaller, considerably lighter, and packing "a ton more capability."
The company didn't spell out a specific timeframe or price, but the 2026 launch window puts Meta on notice; Meta is busy building its exciting Orion AR glasses for 2027. As it stands, Specs will also face glasses built on Samsung and Google's Android XR platform, which are likewise expected in 2026.
As for what users can expect from Specs, Snap is building them on the same Snap OS used in its fifth-generation Spectacles (and they may still use a pair of Qualcomm Snapdragon XR chips). That means all the existing interface and interaction metaphors should carry over. But there's also a significant number of new features and integrations, including AI, that will start appearing this year, long before Specs arrive.
Upgrading the platform
(Image credit: Lance Ulanoff / Future)
Ahead of the latest announcements, Spiegel revealed that Snap began working on glasses before Snapchat, and that the company's core purpose is "making computers more human." "With advances in AI, computers are thinking and acting more like humans than ever before," he added.
With these updates, Snap OS is set to bring AI platforms into the real world. Snap is bringing Gemini and OpenAI models to Snap OS, which means some multimodal AI capabilities will soon be part of the fifth-generation Spectacles and, eventually, Specs. These tools could be used for on-the-fly text translation and currency conversion.
The updated platform also includes tools for Snap Lens builders that connect with the waveguide-based display capabilities of Spectacles and Specs.
For example, a new Snap3D API will let developers use generative AI to create 3D objects in Lenses.
Updates will also include a Depth Module AI, which can read 2D information to build a 3D map that helps anchor virtual objects in the 3D world.
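Snap hasn't published details of how its depth module works, but the underlying idea of turning a 2D depth reading into a 3D anchor point is classic pinhole-camera back-projection. This is a hedged, generic sketch of that math, not Snap's actual API; the `backProject` function and `intrinsics` object are illustrative names.

```javascript
// Generic pinhole-camera back-projection (illustrative, not Snap's API).
// Given a pixel (u, v), its measured depth d, and camera intrinsics
// (focal lengths fx, fy and principal point cx, cy), recover the
// corresponding 3D point in camera space.
function backProject(u, v, depth, intrinsics) {
  const { fx, fy, cx, cy } = intrinsics;
  return {
    x: ((u - cx) / fx) * depth, // horizontal offset, scaled by depth
    y: ((v - cy) / fy) * depth, // vertical offset, scaled by depth
    z: depth,                   // distance along the camera's view axis
  };
}

// Example: a pixel at the principal point projects straight ahead.
const intrinsics = { fx: 600, fy: 600, cx: 320, cy: 240 };
const p = backProject(320, 240, 2.0, intrinsics);
// p is { x: 0, y: 0, z: 2 }
```

Running this over every pixel of a depth image yields the kind of 3D point map a system can use to pin virtual objects to real surfaces.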
Businesses that use Spectacles (and eventually Specs) may appreciate the new fleet management app, which Snap says will let developers manage, monitor, and deploy multiple pairs of glasses at once, for example to power guided navigation through a museum.
Later, Snap OS will add WebXR support, bringing AR and VR experiences to web browsers.
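WebXR itself is an open W3C standard, so browser-based experiences follow a well-known pattern: check whether the browser exposes `navigator.xr`, then ask it for an immersive session. A minimal sketch of that feature check, with the `nav` parameter standing in for the browser's `navigator` object so it can run anywhere:

```javascript
// Detect whether a navigator-like object exposes the standard WebXR entry
// point. `nav` stands in for the browser's `navigator` global.
function canRequestImmersiveAr(nav) {
  return Boolean(nav && nav.xr && typeof nav.xr.isSessionSupported === 'function');
}

// In a real browser, the follow-up uses the standard WebXR calls:
//   const ok = await navigator.xr.isSessionSupported('immersive-ar');
//   const session = ok && await navigator.xr.requestSession('immersive-ar');

// Outside a browser, or on one without WebXR, the check simply fails:
// canRequestImmersiveAr({}) === false
```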
Let’s make it interesting
Spiegel claims that, through Lenses in Snapchat, Snap has the world's largest AR platform. "People use our AR Lenses 8 billion times a day in our camera."
That's a lot, but it's practically all through smartphones. Currently, only developers are using the bulkier Spectacles and their Lens-running capabilities.
The release of Specs could change that. When I tried Spectacles last year, I was impressed by the experience and, while they weren't as good as Meta's Orion glasses (the lack of eye tracking stood out to me), I found them full of capability.
A lighter form factor, matching what I found with Orion or even approaching some of the Android XR glasses Samsung has shown, could put Specs at the leading edge of AR glasses. That is, as long as they don't cost $2,000.


