Google is rolling out a new Sensitive Content Warning system that has started appearing on Android phones. Some users have noticed that Google Messages is blurring images suspected of containing nudity. The feature, announced last year, is intended to protect users from unwanted nude images.

According to Google’s Help Center post, when the feature is turned on, the phone can detect and blur images with nudity. It can also generate a warning when one is being received, sent or forwarded. 

“All detection and blurring of nude images happens on the device. This feature doesn’t send detected nude images to Google,” the company says in its post. These warnings also offer resources on how to deal with nude images.

It’s possible that images not containing nudity may be accidentally flagged, according to Google.


The nude content feature is part of SafetyCore on devices running Android 9 and later. SafetyCore also includes features Google has been developing to protect against scams and dangerous links in text messages, and to verify contacts.

Compared to Apple’s iOS operating system, Android can offer more flexibility. However, its openness to third-party app stores, sideloading and customization creates more potential entry points for the kind of content Google is trying to protect people against.

‘Kids can unblur it instantly’

While Apple does offer Communication Safety features that parents can turn on, Android’s ability to enable third-party monitoring tools “makes this kind of protection easier to roll out at scale and more family-friendly,” says Titania Jordan, an author and chief parenting officer at Bark Technologies, which makes digital tools to protect children.

“Parents shouldn’t have to dig through system settings to protect their kids,” she says. She points out that Google’s new feature only blurs images temporarily.

“Kids can unblur it instantly,” she says. “That’s why this needs to be paired with ongoing conversations about pressure, consent and permanence, plus monitoring tools that work beyond just one app or operating system.”

According to Moynihan, making the system automatically opt-out for adults and opt-in for minors is a practical way to offer some initial protection. But he says, “The trick is keeping things transparent. Minors and their guardians need clear, jargon-free info about what’s being filtered, how it works, and how their data is protected.”


Source: CNET.
