Apple says a child abuse detection system will check photos in iCloud | Inquirer

07:44 AM August 10, 2021

Apple Inc on Monday said that iPhone users’ entire photo libraries would be checked for known child abuse images if they are stored in the online iCloud service.

The disclosure came in a series of media briefings in which Apple is seeking to dispel alarm over its announcement last week that it will scan users’ phones, tablets, and computers for millions of illegal pictures.

While Google, Microsoft, and other technology platforms check uploaded photos or emailed attachments against a database of identifiers provided by the National Center for Missing and Exploited Children and other clearinghouses, security experts faulted Apple’s plan as more invasive.

Some said they expected that governments would seek to force the iPhone maker to expand the system to peer into devices for other material.


The Apple Inc logo is seen at the entrance to the Apple store in Brussels, Belgium, on July 2, 2021. REUTERS/Yves Herman/File Photo

In a posting to its website on Sunday, Apple said it would fight any such attempts, which can occur in secret courts.

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands,” Apple wrote. “We will continue to refuse them in the future.”

In the briefing on Monday, Apple officials said the company’s system, which will roll out this fall with the release of its iOS 15 operating system, will check existing files on a user’s device if users have those photos synced to the company’s storage servers.


Julie Cordua, chief executive of Thorn, a group that has developed technology to help law enforcement officials detect sex trafficking, said about half of child sexual abuse material is formatted as video.

Apple’s system does not check videos before they are uploaded to the company’s cloud, but the company said it plans to expand its system in unspecified ways in the future.

Apple has come under international pressure for the low numbers of its reports of abuse material compared with other providers. Some European jurisdictions are debating legislation to hold platforms more accountable for the spread of such material.


Company executives argued on Monday that on-device checks preserve privacy more than running checks on Apple’s cloud storage directly. Among other things, the architecture of the new system does not tell Apple anything about a user’s content unless a threshold number of images has been surpassed, which then triggers a human review.
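The threshold idea described above can be illustrated with a toy sketch. This is not Apple's actual protocol (which uses perceptual hashing and cryptographic techniques such as private set intersection so the server learns nothing below the threshold); the hash list, threshold value, and function names here are all hypothetical, purely to show how a match count gates any review.

```python
# Toy illustration of threshold-gated matching. NOT Apple's real system:
# real deployments use perceptual hashes and cryptography, not plain strings.

# Hypothetical stand-in for a clearinghouse-provided list of known-image hashes.
KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}

# Hypothetical threshold: no review is triggered until this many matches occur.
THRESHOLD = 2

def count_matches(photo_hashes):
    """Count how many of a user's photo hashes appear in the known list."""
    return sum(1 for h in photo_hashes if h in KNOWN_HASHES)

def should_trigger_review(photo_hashes, threshold=THRESHOLD):
    """A single stray match reveals nothing; only passing the threshold
    would trigger the human-review step described in the article."""
    return count_matches(photo_hashes) >= threshold
```

In the real design the gating is cryptographic rather than a simple counter, so the provider cannot even see sub-threshold matches; the sketch only conveys the thresholding logic.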

The executives acknowledged that a user could be implicated by malicious actors who win control of a device and remotely install known child abuse material. But they said they expected any such attacks to be very rare and that in any case, a review would then look for other signs of criminal hacking.

(Reporting by Joseph Menn and Stephen Nellis in San Francisco; Editing by Dan Grebler)




