Apple Prepares Staff For Child Porn Queries

Apple has issued a memo to its retail and digital sales staff on how to field the slew of questions expected regarding its controversial new features aimed at clamping down on child abuse images.

Last fortnight, Apple announced a feature in iMessage that scans images sent to or from children’s devices for inappropriate content, and a separate feature for iCloud that scans photo libraries for already-flagged explicit imagery.

The release of these two features on the same day has confused many, and stirred up concerns that this is a slippery slope to Apple monitoring users’ communications. Apple has confirmed it will refuse any government requests to utilise these features to spy on users.

The memo issued to staff included a full FAQ, and the following message:

“You may be getting questions about privacy from customers who saw our recent announcement about Expanded Protections for Children. There is a full FAQ here to address these questions and provide more clarity and transparency in the process.

“We’ve also added the main questions about CSAM detection below. Please take time to review the below and the full FAQ so you can help our customers.”
