Apple announced a new feature, launching later this year, that will scan photos stored on iPhones and in iCloud for child abuse imagery.
The new tool could help law enforcement and criminal investigations crack down on abuse against minors, but it could also open the door to more legal and government demands for user data.
The system is called neuralMatch and will “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the Financial Times said.
The tool was trained using 200,000 images from the National Center for Missing & Exploited Children. The Verge reported that the U.S. would be the first country to get the feature, which works by hashing photos and comparing them against a database of known images of child sexual abuse.
“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” the Financial Times said.
“Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
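The voucher-and-threshold flow described in the reports can be sketched roughly as follows. This is a hypothetical illustration, not Apple's actual system: Apple reportedly uses a perceptual hash (NeuralMatch) and cryptographic threshold techniques, whereas this sketch substitutes a plain SHA-256 digest and a simple counter, and the names `safety_voucher` and `review_needed` are invented for the example.

```python
import hashlib

# Stand-in for the database of hashes of known abuse imagery
# (the real database is not reviewable by consumers, as Green notes).
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}
THRESHOLD = 3  # number of suspect matches before human review is triggered


def safety_voucher(photo_bytes: bytes) -> dict:
    """Attach a 'voucher' to an upload saying whether it is suspect or not."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return {"hash": digest, "suspect": digest in KNOWN_HASHES}


def review_needed(vouchers: list[dict]) -> bool:
    """Only once enough photos are marked suspect does review kick in."""
    return sum(v["suspect"] for v in vouchers) >= THRESHOLD


uploads = [b"known-image-bytes"] * 3 + [b"ordinary-photo"]
vouchers = [safety_voucher(p) for p in uploads]
print(review_needed(vouchers))  # three suspect matches meet the threshold
```

The key design point the reports emphasize is that no single match triggers anything; only an accumulation of suspect vouchers past the threshold exposes the photos for review.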
Matthew Green, a Johns Hopkins University professor and cryptographer, voiced his concerns about the new system. On Wednesday, he tweeted: “This sort of tool can be a boon for finding child pornography in people’s phones.”
“But imagine what it could do in the hands of an authoritarian government?”
“Even if you believe Apple won’t allow these tools to be misused [crossed fingers emoji], there’s still a lot to be concerned about,” he added. “These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”
Apple already has a system in place that checks iCloud files against known child abuse imagery. The new system, however, goes further by extending those checks to photos stored locally on the device.
Apple has already informed some US academics and may share more news about the feature soon.