Apple to probe iCloud photo uploads for child abuse images

05:56 AM August 06, 2021

Apple Inc on Thursday said it would implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage services to ensure the upload does not match known images of child sexual abuse.

Once enough matching uploads are detected to guard against false positives, Apple said, the system will trigger a human review and a report of the user to law enforcement. The company said the system is designed to reduce false positives to one in one trillion.

Apple’s new system seeks to address requests from law enforcement to help stem child sexual abuse while respecting privacy and security practices that are a core tenet of the company’s brand. But some privacy advocates said the system could open the door to monitoring political speech or other content on iPhones.


Most other major technology providers – including Alphabet Inc’s Google, Facebook Inc, and Microsoft Corp – are already checking images against a database of known child sexual abuse imagery.


“With so many people using Apple products, these new safety measures have the lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material,” John Clark, chief executive of the National Center for Missing & Exploited Children, said in a statement. “The reality is that privacy and child protection can co-exist.”

The Apple Inc logo is seen at the entrance to the Apple store in Brussels, Belgium, on July 2, 2021. REUTERS/Yves Herman/File Photo

Here is how Apple’s system works. Law enforcement officials maintain a database of known child sexual abuse images and translate them into “hashes” – numerical codes that positively identify the images but cannot be used to reconstruct them.
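As a concrete illustration of that one-way property, the sketch below uses an ordinary cryptographic hash (SHA-256) as a stand-in; it is not Apple's NeuralHash, and the file name is a placeholder.

```python
import hashlib

# Illustration of the one-way property (not Apple's NeuralHash): a
# cryptographic hash condenses an image file into a short fixed-size code.
# The code identifies the exact file, but the pixels cannot be recovered
# from it. "known_image.jpg" is a placeholder file name.
with open("known_image.jpg", "rb") as f:
    fingerprint = hashlib.sha256(f.read()).hexdigest()

print(fingerprint)  # 64 hex characters; reversing them back into an image is infeasible
```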

Apple has implemented that database using a technology called “NeuralHash,” designed to catch edited images similar to the originals. That database will be stored on iPhones.
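Apple has not published NeuralHash's internals, but a classic average hash (aHash) gives a rough sense of how a perceptual fingerprint can survive light edits. Everything in this sketch, from the 8x8 grid to the file names and the 5-bit cutoff, is an illustrative assumption rather than Apple's algorithm.

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str) -> int:
    """A classic perceptual hash (aHash): shrink to 8x8 grayscale, then set
    one bit per pixel depending on whether it is brighter than the mean.
    Light edits (resizing, recompression) flip few bits, so near-duplicate
    images produce similar fingerprints."""
    pixels = list(Image.open(path).convert("L").resize((8, 8)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits  # a 64-bit fingerprint

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance suggests the same underlying image."""
    return bin(a ^ b).count("1")

# Hypothetical files: an original photo and a recompressed copy of it.
d = hamming_distance(average_hash("original.jpg"), average_hash("edited_copy.jpg"))
print("likely the same image" if d <= 5 else "different images")  # cutoff is illustrative
```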

When a user uploads an image to Apple’s iCloud storage service, the iPhone will create a hash of the image to be uploaded and compare it against the database.
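Below is a hedged sketch of what such an upload-time check could look like, reusing the helpers above; the database contents, distance cutoff, and match threshold are hypothetical, since Apple has not published the real parameters or its cryptographic matching protocol.

```python
# All names and numbers here are hypothetical stand-ins, reusing
# average_hash() and hamming_distance() from the sketch above.
known_hashes = {average_hash("database_entry.jpg")}  # on-device hash database
DISTANCE_CUTOFF = 5    # how close two fingerprints must be to count as a match
MATCH_THRESHOLD = 30   # matches required before human review; not a public figure

def matches_database(path: str) -> bool:
    """Hash the outgoing photo and test it against the local database."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= DISTANCE_CUTOFF for k in known_hashes)

def needs_human_review(upload_paths: list[str]) -> bool:
    """True once enough uploads match, mirroring the threshold Apple says
    keeps false positives to roughly one in one trillion."""
    return sum(matches_database(p) for p in upload_paths) >= MATCH_THRESHOLD
```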

Photos stored only on the phone are not checked, Apple said, and a human review before any report to law enforcement is meant to ensure matches are genuine before an account is suspended.

Apple said users who feel their account was improperly suspended could appeal to have it reinstated.


The Financial Times earlier reported some aspects of the program.

One feature that sets Apple’s system apart is that it checks photos stored on phones before they are uploaded, rather than checking the photos after they arrive on its servers.

On Twitter, some privacy and security experts expressed concerns the system could eventually be expanded to scan phones more generally for prohibited content or political speech.

Apple has “sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Matthew Green, a security researcher at Johns Hopkins University, warned.

“This will break the dam — governments will demand it from everyone.”

In a blog post, privacy researchers India McKinney and Erica Portnoy of the Electronic Frontier Foundation wrote that it may be impossible for outside researchers to verify whether Apple keeps its promises to check only a small set of on-device content.

The move is “a shocking about-face for users who have relied on the company’s leadership in privacy and security,” the pair wrote.

“At the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” McKinney and Portnoy wrote.

(Reporting by Stephen Nellis in San Francisco; Editing by Marguerita Choy and David Gregorio)
