Apple Rolling Out Its Feature That Blurs Out Nudity in Messages

  • Olivia McCrorie

How do you protect kids from receiving nude pictures from strangers, or from sending such pictures themselves when asked? Apple has announced its answer: an AI-assisted protection system that blurs an image if it detects explicit nudity in it. The feature has been available in the US since December; now the Cupertino company is rolling it out worldwide.

How does this system function? On the recipient's device, a local AI model analyzes every picture a kid receives in Messages and applies a blur filter if it decides the image qualifies as explicit. The young recipient then sees the picture only in its blurred state. But the AI cannot be fully trusted, so human intervention is sometimes required.
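For developers curious what such an on-device check might look like in code, here is a minimal Swift sketch. It uses Apple's SensitiveContentAnalysis framework, a public API Apple later shipped (iOS 17+) so third-party apps can run a similar local check; the built-in Messages feature does not expose its own model, and the blur-and-reveal view below is our own illustration of the described flow, not Apple's implementation.

```swift
import SwiftUI
import SensitiveContentAnalysis // public on-device analyzer, iOS 17+

/// Asks the on-device analyzer whether an image file looks explicit.
/// Returns false when analysis is unavailable or turned off by the user.
func imageLooksSensitive(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()
    // The analyzer only runs when the user has enabled Sensitive Content
    // Warnings or Communication Safety in Settings.
    guard analyzer.analysisPolicy != .disabled else { return false }
    let analysis = try? await analyzer.analyzeImage(at: url)
    return analysis?.isSensitive ?? false
}

/// Illustrative view: shows a received image blurred until someone
/// approves it. The tap gesture stands in for the parent-approval step
/// that, in the real feature, happens inside Messages.
struct GuardedImage: View {
    let image: UIImage
    let flagged: Bool            // result of imageLooksSensitive(at:)
    @State private var approved = false

    var body: some View {
        Image(uiImage: image)
            .resizable()
            .scaledToFit()
            // Heavy blur while the image is flagged and not yet approved.
            .blur(radius: (flagged && !approved) ? 40 : 0)
            .onTapGesture { approved = true }
    }
}
```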

So, this mechanism operates under the family's control: a parent or another adult family member activates it for a kid in the Family Sharing settings. If a flagged image reaches the kid's phone and the kid chooses to let a parent know, the adult in charge gets notified and can view the image. In case of misidentification, the adult can remove the blur and allow the image to be viewed. Previously, parental notification worked automatically, but Apple opted for manual controls in order to prevent forced outings of LGBTQ+ children and to give all kids a degree of privacy even within the family.

Last but not least: if the kid receives a potentially risky image, the system warns them that they don't have to reply or continue the conversation at all. It also reminds the young recipient that they can block the sender and/or ask adults or professionals for help. This reminder pops up every time a potentially explicit, and therefore blurred, image is received.

Do you think it will work? Will those sending these photos find a way around it? And will the cure be effective, or will there be unwanted side effects? Let's hope it's for the better. And if you have something to add, you're welcome in our comments section!