Latest iOS beta introduces new feature to protect children from 'sexually explicit' content

By ANI | Published: November 12, 2021 03:21 PM

The latest iOS 15.2 beta adds a new feature that blurs nude images for children using the Messages app, in order to protect them from 'sexually explicit' material.

According to The Verge, the feature will scan incoming and outgoing pictures in order to shield children from 'sexually explicit' material.

Images that fall under this category will be blurred, and the child will be warned about their contents. The feature, which ties into Apple's existing 'Family Sharing' system, is also designed to offer affected children resources for getting help.

The feature has one crucial difference from what Apple originally announced in August: it will not send notifications to parents if a child decides to view a sexually explicit image.

As per The Verge, critics such as Harvard Cyberlaw Clinic instructor Kendra Albert objected to that element in particular because it could out queer or transgender children to their parents. Reports also noted that, in its original form, the feature could have created safety issues when a parent is violent or abusive.

The Verge notes that children will also be able to alert someone they trust about a flagged photo and that this choice is separate from the choice of whether to unblur and view the image. The checks will be carried out on-device, and will not impact end-to-end encryption.

This Communication Safety feature was originally announced in August as part of a trio of features designed to protect children from sexual abuse. The company said it was delaying the introduction of the features the following month in response to objections raised by privacy advocates.

As per The Verge, the Communication Safety feature is distinct from the CSAM-detection (child sexual abuse imagery detection) feature.

CSAM detection scans a user's iCloud Photos and reports offending content to Apple moderators; it was this feature that generated the bulk of the outcry from privacy advocates.

There is also an update coming to Siri search that is designed to offer resources if a user searches for topics relating to child sexual abuse, as per The Verge. However, it is still unclear when these two features are planned for release, and there have been no reports of them appearing in Apple's public beta software either.

The Verge also noted that the features added to the latest iOS 15.2 beta could still change dramatically before official release, or be removed from the update entirely.

Other new features that have arrived in the latest beta include a manual AirTag scanning feature, as well as the option to pass on your iCloud data to a loved one in the event of your death.

( With inputs from ANI )

Disclaimer: This post has been auto-published from an agency feed without any modifications to the text and has not been reviewed by an editor
