Apple is set to roll out a new child safety feature in the United Kingdom that scans iPhone messages for nudity. According to Apple's announcement, the feature, called "communication safety in Messages", uses AI to scan images in messages sent to and from children's iPhones.
The feature essentially allows parents to turn on warnings on their children's iPhones. When it is enabled, all images sent or received in the Messages app are scanned for nudity.
What happens when the feature is turned on? Nude images received or sent by children through the Messages app will be blurred. In addition, the app will display a warning telling children that the image contains sensitive content and point them to resources from child safety groups.
Similar mechanisms kick in when a child attempts to send nude photos. The feature discourages the child from sending the image and displays the option to "Message a Grown-Up."
Apple claims that all analysis of images takes place on the device itself. What does this mean? In practice, it means Apple never sees either the images or the results of the analysis. "Messages analyses image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages," Apple said in a statement.
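For developers curious what on-device image analysis of this kind looks like, the sketch below uses SensitiveContentAnalysis, a framework Apple later shipped publicly in iOS 17. It is an illustration only, not the code Messages actually runs; the function name shouldBlurAttachment and the attachmentURL parameter are hypothetical.

```swift
import Foundation
import SensitiveContentAnalysis // Apple's on-device sensitivity framework (iOS 17+)

// Hypothetical helper: decide whether a Messages attachment should be blurred.
// The image is analysed entirely on the device; only a verdict comes back.
func shouldBlurAttachment(at attachmentURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis runs only if a sensitivity policy is enabled on this device,
    // e.g. by a parent through Screen Time settings.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(at: attachmentURL)
        return analysis.isSensitive
    } catch {
        // If analysis fails, fail open and leave the image untouched.
        return false
    }
}
```

Note that nothing in this sketch sends the image anywhere: the analyzer runs locally, which is consistent with Apple's claim that end-to-end encryption of the messages is preserved.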
An earlier pitch of the feature included a controversial element: an alert system that would automatically notify parents if children under 13 sent or received nude images. Apple ultimately decided to drop this plan.
What do you think about this measure? Do you think it's helpful in protecting kids? Let us know in the comments. For more in the world of technology and science, keep reading Indiatimes.com.