The mere mention of Elon Musk’s name is enough to provoke strong reactions in many people, and now there is another reason for the outrage.
A recent report revealed that Elon Musk’s Department of Government Efficiency (DOGE) team is expanding the use of his AI chatbot, Grok, across US federal agencies. Critics are raising concerns over privacy, ethics, and potential legal violations.
According to several sources familiar with the matter who spoke to Reuters, Grok – a generative AI tool developed by Musk’s company xAI – is being used to analyze official data, including sensitive information. The development has drawn criticism from experts who warn that such use could violate conflict-of-interest laws and endanger the security and privacy of millions of Americans.
DOGE hasn’t pushed any employees to use a particular tool or product. DOGE is here to find and fight waste, fraud and abuse.
– Homeland Security spokesperson, denying to Reuters that DHS staff had been pressured to use Grok, May 2025.
Sources suggest that DOGE has used a customized version of Grok to improve data processing and reporting, and has encouraged some agencies, such as the Department of Homeland Security (DHS), to adopt the tool even though it has not been formally approved. Although DHS has denied that DOGE staff pressured anyone to use Grok, the spread of this technology through federal systems signals a significant shift in how the government handles sensitive information.
At this point, I personally question whether there is any data – and I mean any – that is not already in the hands of big government agencies and individuals. Of course, that is not an argument for giving up on confidentiality; I am just being cynical.
Compared to established AI companies like OpenAI, Musk’s xAI is still relatively new, and it has openly stated that Grok users can be monitored for “specific business purposes.” That stance, combined with DOGE’s extensive access to secure official databases, has alarmed privacy advocates. Normally, federal data sharing is subject to strict oversight and regulatory controls to protect personal information and national security. Using AI tools such as Grok to analyze such data could undermine those safeguards, critics say, especially if the data is used to train AI models or is accessed by private companies.
This is not the first time someone has been accused of being too close to the government – and it certainly won’t be the last, not in the near future. Former Google CEO Eric Schmidt’s ties to President Obama were also widely criticized a few years ago. Schmidt served on Obama’s Council of Advisors on Science and Technology. Although no legal wrongdoing was ever proven, some watchdogs and journalists questioned whether Schmidt’s official advisory role could create conflicts between public policy and Google’s corporate interests.