Apple says people are confused about its two new child safety features.
Earlier this month, Apple announced it would be updating iPhones with a new feature that scans for illegal images uploaded to iCloud storage. Since then, however, the company has been called out by privacy groups who feel Apple has created a security backdoor in its software, BBC reports. Apple says its plan has been “misunderstood.”
“We wish that this had come out a little more clearly for everyone,” Apple software chief Craig Federighi said in an interview with the Wall Street Journal, adding that announcing two features at the same time was “a recipe for this kind of confusion.”
The first child protection tool Apple is implementing scans for child sexual abuse material (CSAM), illegal content involving children, by comparing images against a “fingerprint” of known illegal content. Matches are then reported to the U.S. National Center for Missing and Exploited Children (NCMEC). The second feature is a parental control that allows parents to activate protections on their children’s phones.