Apple plans to scan U.S. iPhones for images of child sexual abuse


Apple has announced details of a system to find child sexual abuse material (CSAM) on customers’ devices. The tool, called “neuralMatch,” is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Apple also plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, a move that has likewise alarmed privacy advocates. The company will deploy software that analyzes images in the Messages application as part of a new system that will “warn children and their parents when receiving or sending sexually explicit photos.”

How it Works

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image with known CSAM hashes,” Apple said. The company claimed the system has an “extremely high level of accuracy and guaranteed less than a one in a trillion per year chance of incorrectly reporting a given account.” Apple says it will manually review each report to confirm there is a match. It can then take action to deactivate a user’s account and report it to law enforcement. The company says the new technology offers “significant” privacy advantages over existing techniques, as Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.
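As a rough illustration of how this kind of on-device matching could work, the sketch below hashes each image, compares it against a set of known hashes, and only escalates an account for human review once a match threshold is crossed. The hash function, the sample hash database, and the REVIEW_THRESHOLD value are illustrative assumptions for this sketch, not Apple’s actual NeuralHash design or its published parameters.

```python
import hashlib

# Placeholder standing in for one known image; in a real deployment the hash
# database would be supplied by child-safety organizations, not computed here.
SAMPLE_KNOWN_IMAGE = b"placeholder bytes standing in for a known image"
KNOWN_CSAM_HASHES = {hashlib.sha256(SAMPLE_KNOWN_IMAGE).hexdigest()}

# Illustrative threshold only; Apple has described a threshold mechanism but
# this number is an assumption for the sketch.
REVIEW_THRESHOLD = 30


def compute_hash(image_bytes: bytes) -> str:
    """Placeholder hash: a real system would use a perceptual hash that is
    robust to resizing and re-encoding, not a cryptographic digest."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_escalate_for_review(images: list[bytes]) -> bool:
    """Return True if enough images match known hashes to warrant human review."""
    matches = sum(
        1 for img in images
        if compute_hash(img) in KNOWN_CSAM_HASHES
    )
    # Only accounts crossing the threshold are surfaced for manual review;
    # individual non-matching photos are never flagged.
    return matches >= REVIEW_THRESHOLD


# Example usage: a single matching image alone does not cross the threshold.
print(should_escalate_for_review([SAMPLE_KNOWN_IMAGE]))  # False
```

The design choice reflected here is the one described above: matching happens on the device before upload, and the account is only surfaced once a collection of matches accumulates, rather than on any single photo.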
