You wouldn’t think that efforts to thwart the spread of child sexual abuse imagery would be met with backlash, but critics are calling Apple’s recently announced plans to scan iPhone users’ photos for child sexual abuse material (CSAM) nothing more than an Apple-built surveillance tool that governments could co-opt.
Apple has responded to this criticism with a new FAQ, in which it attempts to dispel such fears.
“CSAM detection in iCloud Photos is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images,” Apple explains.
“CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.”
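In other words, the system only flags photos destined for iCloud Photos that match a database of known images. As a rough illustration of that general idea only, here is a minimal Swift sketch; the fingerprint function and database below are entirely hypothetical stand-ins, and this does not reproduce Apple’s actual NeuralHash perceptual hashing or the private set intersection protections it describes.

```swift
import Foundation
import CryptoKit

// Hypothetical fingerprint database. In Apple's description, the real database is
// built from hashes of known CSAM images supplied by child-safety organizations.
let knownFingerprints: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855" // placeholder entry
]

// Stand-in fingerprint: a SHA-256 digest of the raw image bytes. Apple's system
// uses a perceptual hash (NeuralHash), which this sketch does not attempt to model.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Only photos the user has chosen to upload to iCloud Photos would be checked,
// and a non-matching photo reveals nothing about its contents.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```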
Elsewhere, the company also assures users that “Apple never gains access to communications as a result of this feature in Messages”, and that “this feature does not share any information with Apple, NCMEC or law enforcement.”
It also explains that “this doesn’t change the privacy assurances of Messages, and Apple never gains access to communications as a result of this feature. Any user of Messages, including those with communication safety enabled, retains control over what is sent and to whom.”
To read exactly how the features will work, check out Apple’s Expanded Protections for Children Frequently Asked Questions.