Following a report on work the company was doing to create a tool that scans iPhones for child abuse images, Apple has published a post that provides more details on its efforts related to child safety. With the release of iOS 15, watchOS 8 and macOS Monterey later this year, the company says it will introduce a variety of child safety features across Messages, Photos and Siri.

To start, the Messages app will include new notifications that will warn children, as well as their parents, when they either send or receive sexually explicit photos. When someone sends a child an inappropriate image, the app will blur it and display several warnings. “It’s not your fault, but sensitive photos and videos can be used to hurt you,” says one of the notifications, per a screenshot Apple shared.
As an additional precaution, the company says Messages can also notify parents if their child decides to go ahead and view a sensitive image. “Similar protections are available if a child attempts to send sexually explicit photos,” according to Apple. The company notes the feature uses on-device machine learning to determine whether a photo is explicit. Moreover, Apple does not have access to the messages themselves. This feature will be available to family iCloud accounts.

Apple will also introduce new software tools in iOS and iPadOS that will allow the company to detect when someone uploads content to iCloud that shows children involved in sexually explicit acts. The company says it will use the technology to notify the National Center for Missing and Exploited Children (NCMEC), which will in turn work with law enforcement agencies across the US. “Apple’s method of detecting known CSAM [Child Sexual Abuse Material] is designed with user privacy in mind,” the company claims.

Rather than scanning photos after they’re uploaded to the cloud, the system will use an on-device database of hashes of “known” images provided by NCMEC and other child safety organizations. The company says each hash acts as a kind of digital fingerprint for the image it was derived from, allowing matching without storing the images themselves on the device.
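The fingerprint-and-match idea can be sketched in a few lines. Note that this is only an illustration: Apple’s actual system uses a perceptual hash called NeuralHash, which tolerates minor edits such as resizing or recompression, whereas the cryptographic SHA-256 used below only matches byte-identical files. The function names and sample values are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-size hash — its 'digital fingerprint'.

    Stand-in only: SHA-256 is a cryptographic hash, so any change to the
    bytes produces a different fingerprint. Apple's NeuralHash is
    perceptual and survives minor image edits.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# On-device database of fingerprints of known images (illustrative values);
# in the real system these hashes are supplied by NCMEC and partners.
known_hashes = {fingerprint(b"known-image-1"), fingerprint(b"known-image-2")}

def matches_database(image_bytes: bytes) -> bool:
    """Compare an upload's fingerprint against the on-device database."""
    return fingerprint(image_bytes) in known_hashes

print(matches_database(b"known-image-1"))  # a database entry: True
print(matches_database(b"holiday-photo"))  # unrelated photo: False
```

In the real system the comparison itself is also blinded (see the private set intersection step below), so the device never learns which database entry, if any, it matched.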

A cryptographic technology called private set intersection allows Apple to determine if there’s a match without seeing the result of the process. In the event of a match, an iPhone or iPad will create a cryptographic safety voucher that will encrypt the upload, along with additional data about it. Another technology called threshold secret sharing makes it so that the company can’t see the contents of those vouchers unless someone passes an unspecified threshold of CSAM content. “The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account,” according to the company.
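The “threshold secret sharing” piece can be illustrated with a minimal Shamir secret sharing scheme: a decryption secret is split so that any `t` shares reconstruct it, while fewer than `t` reveal essentially nothing. Each matching upload would contribute one share, so the server can only recover the secret once enough matches accumulate. The prime, threshold, and secret below are illustrative, not Apple’s actual parameters.

```python
import random

P = 2**127 - 1   # a Mersenne prime, large enough for a demo field
THRESHOLD = 3    # illustrative: matches needed before reconstruction

def make_shares(secret: int, n: int, t: int = THRESHOLD):
    """Split `secret` into n shares; any t of them reconstruct it.

    Shamir's scheme: embed the secret as the constant term of a random
    degree-(t-1) polynomial and hand out points on that polynomial.
    """
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, n=5)
assert reconstruct(shares[:3]) == 123456789  # 3 shares: secret recovered
# With only 2 shares, interpolation yields a value unrelated to the secret.
```

The design choice this illustrates: below the threshold, the shares the server holds are statistically uninformative, which is why Apple can claim it cannot inspect any individual voucher in isolation.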

Only once that threshold is crossed will the technology Apple plans to implement allow the company to review the contents of the vouchers. At that point, the tech giant says it will manually review each report to confirm there’s a match. In cases where there is one, it will disable the individual’s iCloud account and forward a report to NCMEC. Users can appeal a suspension if they believe their account has been mistakenly flagged.

Lastly, Siri, as well as the built-in search feature found in iOS and macOS, will point users to child safety resources. For instance, you’ll be able to ask the company’s digital assistant how to report child exploitation. Apple also plans to update Siri to intervene when someone tries to conduct any CSAM-related searches. The assistant will explain “that interest in this topic is harmful and problematic,” as well as point the person to resources that offer help with the issue.

Apple’s decision to effectively work with law enforcement agencies is likely to be seen as something of an about-face for the company. In 2016, it refused to help the FBI unlock the iPhone that had belonged to the man behind the San Bernardino terror attack. Although the government eventually turned to an outside firm to access the device, Tim Cook called the episode “chilling” and warned it could create a backdoor for more government surveillance down the road.
