
Apple’s Attempt To Target Child Abuse Invades Privacy

In trying to detect child sexual abuse material (CSAM) on iPhones, Apple has fallen foul of a group of cybersecurity experts, who argue its scanning plans amount to mass surveillance and should be banned.

Apple had planned to introduce Client-Side Scanning (CSS) earlier this year, a move that would search individual devices for CSAM. Using technology called NeuralHash, the system would compute a perceptual hash of each image and compare it against a database of hashes of known CSAM; matching images would then be flagged and reported.
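In outline, the matching step follows the pattern in the sketch below. It is a minimal illustration under two stated assumptions: a SHA-256 digest stands in for the hash (the real NeuralHash is a perceptual hash designed to survive resizing and re-encoding, not an exact cryptographic hash), and the known-hash set is an empty placeholder for the database Apple would ship inside iOS.

```python
import hashlib
from pathlib import Path

# Placeholder for the database of hashes of known CSAM images that,
# in Apple's proposed design, would be distributed with the OS.
KNOWN_HASHES: set[bytes] = set()

def image_hash(path: Path) -> bytes:
    """Stand-in digest; Apple's NeuralHash is perceptual, not SHA-256."""
    return hashlib.sha256(path.read_bytes()).digest()

def scan_device(image_paths: list[Path]) -> list[Path]:
    """Return the on-device images whose digest matches a known hash."""
    return [p for p in image_paths if image_hash(p) in KNOWN_HASHES]
```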

Apple delayed those plans last month, explaining that feedback had led it to look for improvements.

Now a panel of experts has presented a paper on the risks of CSS, arguing the technology goes too far: “CSS neither guarantees efficacious crime prevention nor prevents surveillance.

“Indeed, the effect is the opposite. CSS by its nature creates serious security and privacy risks for all society while the assistance it can provide for law enforcement is at best problematic.”

While Apple insists CSS would be used only to flag CSAM and terrorist material, the researchers say it could still be exploited by repressive governments for other purposes.

“Instead of having targeted capabilities such as to wiretap communications with a warrant and to perform forensics on seized devices, the agencies’ direction of travel is the bulk scanning of everyone’s private data, all the time, without warrant or suspicion,” the paper warns.
