
Apple Prepares Staff For Child Porn Queries

Apple has issued a memo to its retail and online sales staff on how to field the slew of questions expected about the company's controversial new features aimed at clamping down on child abuse images.

Last fortnight, Apple announced an iMessage feature that scans children's devices for inappropriate incoming or outgoing images, and a separate iCloud feature that scans photo libraries for already-flagged explicit imagery.

The release of these two features on the same day has confused many, and stirred up concerns that this is a slippery slope towards Apple monitoring users' communications. Apple has confirmed it will refuse any government requests to utilise these features to spy on users.

The memo issued to staff included a full FAQ, and the following message:

“You may be getting questions about privacy from customers who saw our recent announcement about Expanded Protections for Children. There is a full FAQ here to address these questions and provide more clarity and transparency in the process.

“We’ve also added the main questions about CSAM detection below. Please take time to review the below and the full FAQ so you can help our customers.”
