TL;DR
- Google Search’s AI Overview previously gave a confident but wrong answer to the simple query “Is it 2025?” It has now been fixed, and the AI Overview responds correctly.
- More importantly, Google no longer hides the disclaimer about potential mistakes in AI Overview responses, which reminds users to double-check the results.
- Alternatively, you can append “-ai” to your search to turn off AI Overview results for a cleaner, more traditional Google Search experience.
Update, May 30, 2025 (11:49 AM ET): A Google spokesperson shared the following statement with us:
Like all Search features, we rigorously improve our systems and use examples like this to inform updates. The vast majority of AI Overviews provide helpful, factual information, and we are actively working on an update to address this type of issue.
As mentioned in our article below, Google Search now correctly identifies the current year, even in AI Overview results.
Original article, May 30, 2025 (05:23 AM ET): Two days ago, we at Android Authority were the first to report on an embarrassing AI Overviews gaffe in which Google Search would confidently give the wrong answer to the simple question “Is it 2025?” At the time of reporting, we tried many times to get the right answer, but Google Search kept failing, each time in a different way. Thankfully, it seems Google has now fixed the answer, as the AI Overview correctly states that it is indeed 2025.
Aamir Siddiqui / Android Authority
We reached out to Google, but the company has not yet responded with a statement or comment. However, Google Search finally returns the correct answer, though the source cited for this answer keeps changing.
Further, the disclaimer text “AI responses may include mistakes” is now visible right in the answer box, where it was previously hidden behind the “Show more” tag. This is important because it highlights that users should not fully rely on AI-generated responses and should ideally double-check the AI Overview’s answer.
Most of us obviously know the answer to such an easy search question, but this is a perfect example of why we should not trust AI Overviews blindly, especially for more complicated questions where we cannot easily tell right from wrong. AI Overviews have previously been seen giving people confident but wrong answers, so this distrust is warranted.
If you are frustrated by the lack of reliable information in such search results, you may want to turn off AI Overviews for a cleaner, more traditional Google Search experience. If you only want to do it for a single query, you can append “-ai” to your search terms to skip the AI Overview answer for that question.
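As a minimal sketch of the trick above, the snippet below builds a Google Search URL with “-ai” appended to the query. The function name and structure are illustrative, not part of any official API; only the “-ai” suffix comes from the article.

```python
from urllib.parse import urlencode

def search_url(query: str, skip_ai_overview: bool = True) -> str:
    """Build a Google Search URL, optionally appending "-ai"
    to the query to suppress the AI Overview (the trick
    described above). The helper itself is hypothetical."""
    if skip_ai_overview:
        query = f"{query} -ai"
    return "https://www.google.com/search?" + urlencode({"q": query})

print(search_url("is it 2025?"))
# → https://www.google.com/search?q=is+it+2025%3F+-ai
```

The same effect is achieved by simply typing the “-ai” suffix into the search box by hand; the code just shows where the suffix lands in the final URL.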
Have a tip? Talk to us! Email our staff at News@Androidauthority.com. You can stay anonymous or get credit for the info, it’s your choice.


