Apple introduces new child safety features

Apple has officially announced that new features to safeguard children from online risks will arrive in software updates for its platforms later this year. The additions include new safety features in Messages, detection of known Child Sexual Abuse Material (CSAM) in iCloud Photos, and expanded guidance in Siri and Search.

Messages

New Communication Safety features for iPhone, iPad, and Mac will warn children and their parents when sexually explicit photos are received or sent. When a child receives an explicit image, Apple says the photo will be blurred and a warning will appear in the Messages app noting the image “may be sensitive.”

If the child views the image anyway, their iCloud Family parent will receive a notification “to make sure you’re OK,” according to the pop-up, which also includes a link to more information. A similar warning appears when a child is about to send a sexually explicit photo; if the child chooses to send it, the parent will be notified.

Apple says the photos are evaluated using on-device machine learning to determine whether they are sexually explicit. Because iMessage is end-to-end encrypted, Apple states that it never sees or has access to any of the messages. The feature will be opt-in.
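
Apple hasn’t published details of this classifier, but conceptually the on-device check resembles running a Core ML image classifier through the Vision framework. The sketch below assumes a hypothetical bundled model named “SensitiveImageClassifier” and an illustrative “explicit” label and confidence threshold; it is not Apple’s implementation.

```swift
import Foundation
import CoreML
import Vision
import CoreGraphics

// Conceptual sketch only: Apple has not published its classifier.
// "SensitiveImageClassifier.mlmodelc" is a hypothetical Core ML model
// standing in for whatever Communication Safety actually uses.
func checkForSensitiveContent(in image: CGImage,
                              completion: @escaping (Bool) -> Void) {
    guard
        let modelURL = Bundle.main.url(forResource: "SensitiveImageClassifier",
                                       withExtension: "mlmodelc"),
        let coreMLModel = try? MLModel(contentsOf: modelURL),
        let visionModel = try? VNCoreMLModel(for: coreMLModel)
    else {
        completion(false)  // bail out if the model can't be loaded
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Flag the image if the classifier's top label is "explicit" with
        // high confidence; label name and threshold are illustrative.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top.map { $0.identifier == "explicit" && $0.confidence > 0.9 } ?? false)
    }

    // All inference runs locally; the image never leaves the device.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```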

Scanning Photos for Child Sexual Abuse Material (CSAM)

Starting with iOS 15 and iPadOS 15, Apple will detect known CSAM in iCloud Photos and report matches to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in partnership with US law enforcement agencies.

Apple says its CSAM detection mechanism was designed with user privacy in mind. Before a photo is uploaded to iCloud, the device compares it against a database of hashes of known CSAM images to check for a match; all of the matching takes place on the device itself.
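
In Apple’s actual system, the device computes a perceptual hash (“NeuralHash”) so visually similar images collide, then checks it against a blinded database using a cryptographic matching protocol. Neither component is public, so the sketch below strips the idea down to local set membership, with SHA-256 as a placeholder where the perceptual hash would go (unlike NeuralHash, a cryptographic hash only matches byte-identical files).

```swift
import Foundation
import CryptoKit

// Conceptual sketch: the real pipeline uses a perceptual hash and a
// blinded, cryptographic matching protocol, neither of which is public.
// This stand-in reduces the idea to plain set membership, with SHA-256
// as a placeholder where the perceptual hash would go.
struct KnownImageMatcher {
    /// Hashes of known CSAM images; delivered to devices in blinded form
    /// in the real system, modeled here as an ordinary set.
    let knownHashes: Set<Data>

    func matches(photoData: Data) -> Bool {
        // All comparison happens on the device; the photo itself is never
        // uploaded for scanning, only checked against the hash list.
        let digest = SHA256.hash(data: photoData)
        return knownHashes.contains(Data(digest))
    }
}

// Usage: build the matcher once, then test each photo before upload.
let matcher = KnownImageMatcher(knownHashes: [])
let flagged = matcher.matches(photoData: Data())  // false with an empty database
```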

Siri and Search

Apple is also adding resources to Siri and Spotlight Search across its devices to help children and parents stay safe online and get help in unsafe situations. Users will be able to ask Siri how to report CSAM or child exploitation, and Siri will point them to resources explaining where and how to file a report. Siri and Search will also be updated to intervene when users perform CSAM-related searches.
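
Apple hasn’t said how the intervention is wired up internally; behaviorally, it amounts to routing flagged queries to help resources instead of results. A minimal sketch of that behavior, with a hypothetical term list:

```swift
import Foundation

// Conceptual sketch only: Apple hasn't described how the Siri/Search
// intervention works internally. This models the observable behavior:
// queries on flagged topics return safety resources instead of results.
enum SearchResponse {
    case results([String])
    case intervention(helpURL: URL)
}

func respond(to query: String) -> SearchResponse {
    // Hypothetical term list; the real one is Apple-internal.
    let flaggedTerms = ["csam"]
    let normalized = query.lowercased()

    if flaggedTerms.contains(where: { normalized.contains($0) }) {
        // Direct the user to NCMEC rather than running the search.
        return .intervention(helpURL: URL(string: "https://www.missingkids.org")!)
    }
    return .results([])  // normal search path, omitted here
}
```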

All of these changes will roll out later this year, with the US getting first dibs.

Source: Apple
