Google is expanding AI Mode to India, marking the first international rollout of the feature. The Mountain View-based tech giant first unveiled the artificial intelligence (AI) search experience in March to select users in the United States, and extended it to all US-based users at Google I/O. This version of AI Mode lacks some features, such as the two-way, real-time voice conversation experience. However, Indian users will be able to use the feature's integration with Google Lens to ask image-based questions.
India Becomes the First International Expansion of AI Mode in Search
In a blog post, the tech giant announced the rollout of the AI Mode feature. It is currently available as part of Search Labs, which hosts experimental features for Google Search. Once users enrol, they can turn on AI Mode from the website. After activation, users will find a new AI Mode tab below the search box on the results page, placed on the left alongside the other tabs such as All and News.
Google highlights that AI Mode is powered by a custom version of the Gemini 2.5 large language model (LLM), and that it comes with reasoning capability. This means the feature will automatically use more compute to find answers to complex questions. Notably, the feature uses the company's query fan-out technique, which lets AI Mode break a question into subtopics and run a search for each of them simultaneously.
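The query fan-out idea described above can be sketched roughly as follows. This is a minimal illustration only, not Google's implementation: the `decompose` and `search` functions here are hypothetical stand-ins (a real system would use an LLM to split the question and a web search backend to answer each subtopic).

```python
from concurrent.futures import ThreadPoolExecutor

def decompose(query):
    # Hypothetical stand-in: a real system would use an LLM to break
    # the question into focused subtopics.
    return [f"{query} overview", f"{query} comparisons", f"{query} reviews"]

def search(sub_query):
    # Hypothetical stand-in for issuing one web search.
    return f"results for '{sub_query}'"

def fan_out(query):
    # Break the query into subtopics, run all sub-searches
    # concurrently, and collect the results for answer synthesis.
    sub_queries = decompose(query)
    with ThreadPoolExecutor() as pool:
        return list(pool.map(search, sub_queries))

print(fan_out("best mirrorless camera"))
```

The key design point is that the sub-searches are independent, so they can run in parallel rather than sequentially, keeping latency close to that of a single search.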
AI Mode is currently available on desktop as well as in the Google app. Additionally, users can click a picture of an item via Google Lens and then route the image through AI Mode to get a more comprehensive answer. The search tool also supports voice input, so users can speak their questions out loud rather than typing them.
Separately, a 9to5Google report states that the US version of the feature is now being upgraded. Google is reportedly suggesting prompts to AI Mode users based on their search history. This means users will now see prompt suggestions based on topics they have recently searched for, making the suggestions more relevant. The suggestions refresh every time the user opens the AI Mode page via the Google app, the Search widget, or a Pixel Launcher shortcut.