Texas Attorney General Ken Paxton has announced plans to investigate both Meta AI Studio and Character.AI for offering AI chatbots that can claim to be health tools, and for potentially misusing data collected from underage users.
Paxton says that AI chatbots from either platform can "present themselves as professional therapeutic tools," to the point of lying about their qualifications. That behavior risks misleading young users and spreading misinformation. And because these AI platforms often rely on user prompts as another source of training data, both companies may also be violating young users' privacy and misusing their data. That is of particular interest in Texas, where the SCOPE Act places specific limits on what companies can do with data collected from minors, and requires platforms to offer tools so parents can manage the privacy settings of their children's accounts.
For now, the Attorney General has issued civil investigative demands (CIDs) to both Meta and Character.AI to determine whether either company is violating Texas consumer protection laws. As TechCrunch notes, neither Meta nor Character.AI claims its AI chatbot platform should be used as a mental health tool. That hasn't stopped Character.AI from hosting numerous "Therapist" and "Psychologist" chatbots, nor prevented either company's chatbots from claiming to be licensed professionals, as 404 Media reported in April.
Commenting on the Texas investigation, Character.AI said that "the user-created characters on our site are fictional, they are intended for entertainment, and we have taken robust steps to make that clear. For example, we have prominent disclaimers in every chat to remind users that a character is not a real person and that everything a character says should be treated as fiction."
Meta shared a similar sentiment in its comment. The company said, "We clearly label AIs, and to help people better understand their limitations, we include a disclaimer that responses are generated by AI, not people." Meta's AIs are also supposed to "direct users to seek qualified medical or safety professionals when appropriate." Pointing people toward real resources is good, but those disclaimers are easy to ignore and don't act as much of a safeguard.
As for the privacy and data-use angle, both Meta's privacy policy and Character.AI's privacy policy acknowledge that data is collected from users' conversations with AI. Meta collects things like prompts and feedback to improve AI performance. Character.AI logs identifiers and demographic information, and says that data can be used for advertising, among other applications. Whether either policy should apply to children, and how it squares with the Texas SCOPE Act, seems likely to depend on how easy it is for a minor to create an account.