Google’s Search Live feature, which lets you talk to Search by voice, is now available in the United States for iOS and Android in the Google app. To access it, you’ll have to opt in to the AI Mode experiment in Labs, but once you do, you’ll be able to have a back-and-forth conversation with Search. To use the feature, open the Google app and tap the new Live icon to ask a question aloud. Google will respond with an AI-generated audio answer, and you can keep asking follow-up questions naturally. Under the hood, the feature is powered by a custom version of Gemini with advanced voice capabilities.
The company says the feature is useful when you’re on the go or multitasking, such as packing bags. In Google’s example, the user asks how to keep linen clothes from wrinkling in a suitcase, and Gemini responds aloud. The user then asks Search follow-up questions directly, without leaving the search screen or tapping the Live icon again. You can continue the conversation even if you open another app, and you can also view a transcript of Google’s responses and type further questions if you’d rather shift to a written conversation.
Although the feature has the makings of a handy tool, the sources Google draws its information from will probably not see much traffic from these conversations. Search does display links from across the web on the screen, but they appear in small cards and can be ignored entirely if the user is simply talking to Gemini.
In the coming months, the company plans to expand Search Live with camera support, letting you show Search what you’re looking at in real time and ask questions about it. Google announced this capability at I/O this year. For example, Google said you could show Search a tough math problem and ask it to help solve it, or to explain a concept you’re struggling to grasp.