Apple Has Been Scanning Your iCloud Mail For Years
Apple has confirmed it already scans iCloud Mail for child sexual abuse material (CSAM), and has been doing so since 2019.
9to5Mac’s investigation was prompted by “a rather odd statement” from Apple’s anti-fraud chief, who described Apple as “the greatest platform for distributing child porn” – a claim the company could presumably only make if it were scanning iCloud photos.
It turns out it does, and has been doing so since at least 2019. Note the following wording from the company’s old child protection policy:
“Apple is dedicated to protecting children throughout our ecosystem wherever our products are used […]
“As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.”
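For context on what “electronic signatures” means here: the standard technique is to compute a digest of each attachment and compare it against a database of digests of known illegal images, with human review of any match before action is taken. The following is a minimal, hypothetical sketch in Python using a plain SHA-256 hash; Apple has not disclosed its actual scheme, and production systems (such as Microsoft’s PhotoDNA) typically use perceptual hashes that still match after resizing or re-encoding, which an exact cryptographic hash cannot.

```python
import hashlib

def signature(attachment: bytes) -> str:
    """Compute a digest ("electronic signature") of an attachment's bytes."""
    return hashlib.sha256(attachment).hexdigest()

# Hypothetical database of signatures of known illegal images.
# Populated with a dummy entry purely so the example runs.
KNOWN_SIGNATURES = {signature(b"example-known-image-bytes")}

def is_suspected_match(attachment: bytes) -> bool:
    """Flag an attachment whose signature matches a known entry.

    Per the quoted policy, a match is only *suspected* until it is
    validated by individual human review.
    """
    return signature(attachment) in KNOWN_SIGNATURES

if __name__ == "__main__":
    print(is_suspected_match(b"example-known-image-bytes"))  # True
    print(is_suspected_match(b"unrelated-attachment"))       # False
```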
Jane Horvath, Apple’s chief privacy officer, also told an audience at a January 2020 tech conference that Apple uses screening technology to search iCloud for illegal pictures.
Apple has now confirmed to 9to5Mac that it has been scanning both outgoing and incoming iCloud Mail for CSAM attachments since 2019.
“Apple also indicated that it was doing some limited scanning of other data, but would not tell me what that was, except to suggest that it was on a tiny scale,” journalist Ben Lovejoy writes.
“It did tell me that the ‘other data’ does not include iCloud backups.”