Reports today indicate Apple plans to scan photos stored on people’s iPhones and iCloud accounts for child sexual abuse imagery. The effort could aid law-enforcement investigations, but it could also open the door to controversial government access to user data.
Apple’s update to its web page “Expanded Protections for Children” — see under the “CSAM Detection” subheading — appears to make the scanning plan official. CSAM stands for “child sexual abuse material.”
The system Apple is putting in place, called neuralMatch, will “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the Financial Times reported earlier today.
Apple briefed academics in the U.S. on the plan this week and could share more “as soon as this week,” according to two security researchers cited by FT.
Suspect photos compared to database images
The newspaper added that experts “trained” neuralMatch on 200,000 images from the National Center for Missing & Exploited Children. Users’ photos will be hashed, and the hashes compared against a database of known child sexual abuse images.
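To make the reported mechanism concrete, here is a minimal sketch in Python of the hash-then-compare step. It is not Apple’s implementation: neuralMatch reportedly relies on a perceptual image hash that Apple has not published, so a plain SHA-256 stands in for it here, and the names hash_photo, is_known_match and KNOWN_HASHES are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for neuralMatch's undisclosed perceptual hash.
# A real perceptual hash tolerates resizing and re-encoding; SHA-256 does not.
def hash_photo(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Hypothetical database of hashes of known abuse imagery, which in practice
# would come from a vetted source such as NCMEC, never from user devices.
KNOWN_HASHES: set[str] = set()

def is_known_match(path: Path) -> bool:
    """Return True if the photo's hash appears in the known-hash database."""
    return hash_photo(path) in KNOWN_HASHES
```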
The system rolls out first in the U.S. and later elsewhere, FT reported.
“According to people briefed on the plans, every photo uploaded to iCloud in the U.S. will be given a ‘safety voucher,’ saying whether it is suspect or not,” FT said. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
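The “safety voucher” and threshold behavior described above can be sketched in the same hypothetical terms. The names SafetyVoucher and Account and the MATCH_THRESHOLD value are assumptions, and this toy version omits the cryptography that would keep vouchers unreadable until the threshold is crossed; it only illustrates the counting logic FT describes.

```python
from dataclasses import dataclass, field

MATCH_THRESHOLD = 10  # hypothetical value; Apple has not published the real one

@dataclass
class SafetyVoucher:
    photo_id: str
    suspect: bool  # did the photo's hash match the known-CSAM database?

@dataclass
class Account:
    vouchers: list[SafetyVoucher] = field(default_factory=list)

    def add_voucher(self, voucher: SafetyVoucher) -> None:
        self.vouchers.append(voucher)

    def flag_for_review(self) -> bool:
        # Only after enough photos are marked suspect would human review
        # (and, per FT, decryption of the suspect photos) be triggered.
        suspect_count = sum(1 for v in self.vouchers if v.suspect)
        return suspect_count >= MATCH_THRESHOLD
```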
Concerns over the system
The Verge noted that Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the system on Twitter. “This sort of tool can be a boon for finding child pornography in people’s phones,” Green said. “But imagine what it could do in the hands of an authoritarian government?”
“Even if you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” he added. “These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”
Apple and others already do something similar
Apple and other major cloud providers already check files against known images of child abuse. But the neuralMatch system goes further, allowing centralized access to local storage on users’ devices, The Verge said.
And observers have noted it would be easy to extend the system to other crimes. In a country like China, where Apple does considerable business, that kind of access, and the legal demands that could accompany it, is especially concerning.
Apple on privacy
Apple has made much of its devices’ privacy protections, most recently in a video message from CEO Tim Cook directed at the privacy-conscious European Union.
The company briefly dominated the news when it resisted the FBI’s demand that it provide access to an iPhone belonging to a shooter in a 2015 attack in San Bernardino, California.
News outlets reported today that Apple has not responded to requests for comment on its plan to scan images.