iPhone, iPad Update Detects, Reports Child Sexual Abuse Images
Apple has announced new child protection features that will automatically scan iPhone and iPad users’ photos to detect and report any child sexual abuse images stored on its cloud servers.
“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple said in a statement.
“This program is ambitious, and protecting children is an important responsibility,” it said.
“Our efforts will evolve and expand over time.”
Apple explained that only accounts with a collection of images uploaded to iCloud that matches a database of known child sexual abuse material, above a certain threshold, would be flagged.
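The threshold idea Apple describes can be sketched in a few lines. This is an illustrative simplification only: Apple's real system reportedly uses a perceptual hashing scheme (NeuralHash) and cryptographic matching, not plain set lookups, and the hash values, function names, and threshold below are all hypothetical.

```python
# Hypothetical sketch of threshold-based matching against a database
# of known image hashes. Not Apple's actual implementation.

KNOWN_CSAM_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in for the real database
MATCH_THRESHOLD = 3  # account is only flagged past a threshold of matches

def count_matches(photo_hashes):
    """Count how many of a user's photo hashes appear in the known database."""
    return sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)

def should_flag(photo_hashes, threshold=MATCH_THRESHOLD):
    """Flag the account only when matches meet or exceed the threshold."""
    return count_matches(photo_hashes) >= threshold

# A single incidental match stays below the threshold:
print(should_flag(["a1b2", "1111", "2222"]))  # False
# A collection of matching images crosses it:
print(should_flag(["a1b2", "c3d4", "e5f6"]))  # True
```

The point of the threshold is that a lone false-positive match does not trigger a report; only a collection of matching images does.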
Apple’s chief privacy officer, Erik Neuenschwander, said that “the system was designed and trained” for this purpose. Users won’t be notified if their photos are flagged, decrypted, and sent to law enforcement, but their accounts will be disabled.
This will roll out “later this year”, according to Apple.