Tuesday 14 December 2021

Latest iOS Update Employs Nudity-Detecting Algorithms in Messages for Children

(Image: Apple)
Apple’s latest safety initiative incorporates nudity detection into messages sent to and by children. The feature rolled out to iPhones and iPads with iOS 15.2, which became available Monday.

Though there isn’t much information on how the technology works behind the scenes, journalist Mark Gurman’s Power On newsletter offers a bit of insight into how it manifests on a user’s device. The feature isn’t enabled by default, but once a parent turns it on for a child account in a Family Sharing group, it scans for nudity within images sent and received by the Messages app. If nudity is detected, Messages blurs the image and displays an on-screen warning, which explains the dangers of sharing explicit photos and asks whether the viewer would like to proceed.

Despite some confusion over parental involvement in the feature, guardians are not automatically notified when a nude photo is sent to or from a child’s device. (Notification is rumored to have been part of Apple’s original plan.) Instead, the child has the option to “message a grown-up” and alert a parent themselves.
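Apple hasn’t published how this pipeline is implemented, but the behavior Gurman describes reduces to a short decision sequence. The Swift sketch below illustrates that flow under stated assumptions: NudityClassifier, CommunicationSafety, and every other name here is a hypothetical stand-in, since Apple’s on-device API is not public.

```swift
import Foundation

// Hypothetical sketch of the Communication Safety flow described above.
// Every type and function name here is invented for illustration;
// Apple has not published its actual on-device implementation.

enum ImagePresentation {
    case showNormally
    case blurWithWarning(message: String)
}

struct NudityClassifier {
    /// Stand-in for Apple's on-device machine-learning check.
    /// Returns true if the image likely contains nudity.
    func likelyContainsNudity(_ imageData: Data) -> Bool {
        // ... on-device model inference would happen here ...
        return false
    }
}

struct CommunicationSafety {
    let classifier = NudityClassifier()
    /// The feature is opt-in: a parent must enable it for a child
    /// account in a Family Sharing group.
    var isEnabledForChildAccount: Bool

    func prepareIncomingImage(_ imageData: Data) -> ImagePresentation {
        guard isEnabledForChildAccount,
              classifier.likelyContainsNudity(imageData) else {
            return .showNormally
        }
        // Blur the photo and explain the risks. The child can still
        // choose to view it, or "message a grown-up" instead; parents
        // are NOT notified automatically.
        return .blurWithWarning(
            message: "This photo may be sensitive. You can choose not to view it, or message a grown-up you trust."
        )
    }
}
```

The notable design choice the sketch preserves is that the check stays local to the device and nothing is reported upward: the only outcomes are showing the image, blurring it, or letting the child reach out to an adult.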

It’s important to note that the new child safety feature isn’t the same as Apple’s planned ability to scan users’ iCloud photos for child pornography, a separate safety measure that was also announced this year. The latter works by matching photos uploaded from Apple devices against a database of known child sexual abuse imagery, then sending suspect matches to human moderators. If the moderators confirm the presence of abusive material, they disable the offending user’s account and forward a report to the National Center for Missing and Exploited Children, which works with law enforcement. In response to widespread concern over user privacy, Apple has paused its plan to enable this feature, which was originally slated to arrive by the end of 2021.
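That process, automated flagging followed by human review and reporting, can be summarized in a short sketch. This is an illustration of the workflow described above, not Apple’s implementation: the announced system actually relied on perceptual hashes of known images and a match threshold, details this sketch deliberately simplifies, and all names below are hypothetical.

```swift
import Foundation

// Simplified sketch of the iCloud scanning pipeline described above.
// All types and names are invented; Apple's announced design used
// perceptual hashing and a match threshold that this omits.

struct KnownMaterialDatabase {
    let knownHashes: Set<String>
    func matches(_ photoHash: String) -> Bool {
        knownHashes.contains(photoHash)
    }
}

enum ReviewOutcome {
    case cleared
    case confirmed // moderators confirmed abusive material
}

func processUpload(photoHash: String,
                   accountID: String,
                   database: KnownMaterialDatabase,
                   humanReview: (String) -> ReviewOutcome) {
    // 1. Automated check against hashes of known abusive images.
    guard database.matches(photoHash) else { return }

    // 2. Suspect matches go to human moderators for confirmation.
    switch humanReview(photoHash) {
    case .cleared:
        break
    case .confirmed:
        // 3. Confirmed accounts are disabled and reported to the
        //    National Center for Missing and Exploited Children.
        disableAccount(accountID)
        reportToNCMEC(accountID: accountID, evidence: photoHash)
    }
}

func disableAccount(_ accountID: String) { /* placeholder */ }
func reportToNCMEC(accountID: String, evidence: String) { /* placeholder */ }
```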

Another new but lesser-known iOS 15.2 feature intervenes when users search for exploitative content. (Image: Apple)

Both features are part of an effort from Apple to quell child exploitation on its devices. The company introduced one more initiative to the same end with iOS 15.2, which attempts to redirect users who make queries related to child pornography. Following a problematic search, Siri prompts the user either to report instances of child exploitation or to reach out to an anonymous helpline for people struggling with at-risk thoughts. Apple has reportedly partnered with outside organizations to help prevent would-be offenders from causing harm.
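Functionally, the intervention is a redirect that replaces search results with support options. A minimal sketch of that branching logic might look like the following; the classification step and the linked resources are assumed placeholders, not necessarily the ones Apple uses.

```swift
import Foundation

// Rough sketch of the Siri/search intervention described above.
// How Apple classifies queries is not public, so `isFlagged` is a
// stand-in, and the URLs are example resources chosen for illustration.

enum SearchResponse {
    case normalResults
    case intervention(reportURL: URL, helplineURL: URL)
}

func handleQuery(_ query: String,
                 isFlagged: (String) -> Bool) -> SearchResponse {
    guard isFlagged(query) else { return .normalResults }
    // Instead of results, the user is prompted to report child
    // exploitation or to contact an anonymous helpline.
    return .intervention(
        reportURL: URL(string: "https://report.cybertip.org")!,  // example resource
        helplineURL: URL(string: "https://www.stopitnow.org")!   // example resource
    )
}
```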
