Confusion continues after Apple first announced its intention to scan photos uploaded to iCloud for known images of child sexual abuse material (CSAM).
Privacy advocates have strongly opposed the move, under which the scan is performed on the device itself before a photo is uploaded to iCloud. To make matters even more confusing, Apple states in its FAQ [PDF] that the feature is essentially disabled if users choose not to use iCloud. Privacy activists are concerned that authoritarian governments could pressure Apple to expand the feature in order to crack down on dissident activity.
To settle the controversy, Apple has issued several clarifications. As reported by Reuters, Apple now says it will only scan for CSAM images that have been flagged by multiple national clearinghouses, and that researchers can verify the image identifiers are universal across devices, so the list cannot be adapted to target individuals. The company says this makes it possible to confirm that such targeting cannot be done.
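Read literally, the "multiple clearinghouses" requirement amounts to a set intersection: a hash is only distributed to devices if more than one independent organisation has flagged it. The snippet below is a toy illustration under that reading, not Apple's actual pipeline; the organisation names, hash values, and the two-source minimum are invented for the example.

```python
# Illustrative only: a toy model of "flagged by multiple clearinghouses".
# The clearinghouse names and hash values are made up; Apple's real hash
# format and distribution mechanism are not reproduced here.
clearinghouse_lists = {
    "org_a": {"hash_1", "hash_2", "hash_3"},
    "org_b": {"hash_2", "hash_3", "hash_4"},
    "org_c": {"hash_3", "hash_5"},
}

# Only hashes reported by at least two independent organisations go into the
# single database shipped to every device, which is what would let outside
# researchers check that the list is identical worldwide.
MIN_SOURCES = 2
shipped_db = {
    h
    for h in set().union(*clearinghouse_lists.values())
    if sum(h in lst for lst in clearinghouse_lists.values()) >= MIN_SOURCES
}
print(sorted(shipped_db))  # ['hash_2', 'hash_3']
```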
The company also added that the system requires 30 matched CSAM images before it prompts a human review at Apple and a formal report is submitted. This threshold is part of the reason Apple believes it can promise a false-positive likelihood of less than one in a trillion per year.
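To see why a high match threshold drives the account-level false-positive rate down so sharply, one can model false matches as independent per-photo events and compute a binomial tail. The sketch below is not Apple's analysis: the annual photo count and the per-photo false-match rate are made-up illustrative values, and only the 30-match threshold comes from the article.

```python
import math

def account_flag_probability(n_photos: int, p_match: float, threshold: int) -> float:
    """Binomial tail: P(at least `threshold` false matches among `n_photos`
    independent uploads, each with per-photo false-match probability `p_match`)."""
    total = 0.0
    for k in range(threshold, n_photos + 1):
        log_term = (
            math.lgamma(n_photos + 1)
            - math.lgamma(k + 1)
            - math.lgamma(n_photos - k + 1)
            + k * math.log(p_match)
            + (n_photos - k) * math.log1p(-p_match)
        )
        total += math.exp(log_term)  # underflows harmlessly to 0.0 for large k
    return total

# Hypothetical inputs (not Apple's published figures): 10,000 photos uploaded
# in a year and a one-in-a-million per-photo false-match rate, combined with
# the 30-match threshold described in the article.
print(account_flag_probability(10_000, 1e-6, 30))  # on the order of 1e-93
```

Even with these rough, assumed numbers, requiring 30 independent false matches before any human review pushes the per-account probability far below the one-in-a-trillion figure.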
Apple declined to say whether these were adjustments made in response to criticism or details that had always been in place, but added that changes should be expected since the policy is still under development.
Nonetheless, privacy advocates believe they are making a difference. Stanford University surveillance researcher Riana Pfefferkorn tweeted, "Keep pushing."
More recently, Craig Federighi, Apple's senior vice president of software engineering, told the Wall Street Journal that Apple's new policy is "much more private than what has been done in this area."
"We think we are absolutely leading the way on privacy, but what we are doing here is the latest technological advance in privacy that will enable a more private world," he said Adding that the system was developed "in the most privacy-protective way we can imagine, in the most auditable and verifiable way possible," he said the company's solution is superior to cloud storage competitors that look at and analyze "every single photo"
Federighi claimed that critics did not fully understand Apple's implementation and believes the company is partly to blame for not explaining things clearly: by announcing CSAM scanning at the same time as protections for minors using iMessage, Apple allowed the two to be mistakenly conflated. He conceded that the two features were "inadvertently conflated."
"We feel very positive and strongly about what we are doing and wish this had come across a little more clearly to everyone," he said
The word "we" in this sentence may imply more unified support within the company than actually exists On Friday, Reuters revealed that the move was equally divisive within the company, with more than 800 messages on the plan posted on the company Slack