Some Android users are starting to see blurred images on their devices in Google Messages. It's part of a sensitive content warning system that blurs images suspected of containing nudity. The feature was announced last year and is now rolling out on Android devices.
According to a Google Help Center post, when the feature is turned on, the phone can detect nude images and blur them. It can also generate a warning when such an image is received, sent or forwarded.
The company said in its post that "all of the detection and blurring of nude images happens on the device. This feature doesn't send detected nude images to Google." The warnings also offer resources for dealing with nude images.
According to Google, it's possible that images may be accidentally flagged even if they don't contain nudity.
The feature is off by default for adults. For teens aged 13 to 17, it's on by default but can be disabled in Google Account settings. For supervised accounts, it can't be turned off, but parents can adjust the settings in the Google Family Link app.
How to enable or disable the feature
For adults who want to blur nude images, or to disable the feature, the toggle is in Google Messages under Settings > Protection & Safety > Manage sensitive content warnings.
The nudity content feature is part of SafetyCore on devices running Android 9 and later. SafetyCore also includes features Google is working on to protect against scams and dangerous links in texts, and to verify contacts.
Measuring the effectiveness of the feature
Screening for objectionable images has become more sophisticated as AI has gotten better at understanding context.
"Compared to older systems, today's filters are far better at catching explicit or unwanted content, with fewer mistakes," says Patrick Moynihan, co-founder and president of Tracer Labs. "But they're not foolproof. Edge cases, such as artistic nudity, culturally contextual images or even memes, can trip them up."
Moynihan says his company flags content without compromising privacy by pairing its AI systems with trusted ID tools.
"AI is essential for minimizing blind spots and keeping users safe," he says, "when it's combined with human oversight and continuous feedback."
Android can offer more flexibility than Apple's iOS operating system. However, the openness and customization people prize in Android also mean third-party app stores create more potential entry points for the very content it's trying to protect people from.
"Android's setup can make it harder to enforce consistently, especially for younger users who can stumble onto unintended content through the cracks," Moynihan says.
'Kids can bypass it instantly'
Although Apple offers Communication Safety features that parents can turn on, the ability to enable third-party monitoring tools "enhances that kind of protection at scale and in a more family-friendly way," says Titania Jordan, author and chief parenting officer at Bark Technologies, which creates digital tools to protect children.
Jordan says mobile operating systems haven't made it easy for parents to proactively protect their kids from content such as nude images.
"Parents shouldn't need to dig through system settings to protect their children," she says, pointing out that Google's new feature only temporarily blurs images.
"Kids can bypass it instantly," she says, "which is why it needs to be paired with ongoing conversations about pressure, consent and permanence, as well as monitoring tools that work across more than just one app or operating system."
According to Moynihan, automatically opting adults out and minors in is a practical way to offer some baseline protection. But he says, "The trick is keeping things transparent. Minors and their guardians need clear explanations of what's being filtered, how it works and how their data is handled."


