Apple explains how iPhones will scan photos for child-sexual-abuse images

Close-up of a finger scrolling on a smartphone screen in a dark environment. (credit: Getty Images | Oscar Wong)

Following reports today that Apple will start scanning iPhones for child-abuse images, the company confirmed its plan and provided details in a news release and technical summary.

“Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind,” Apple’s announcement said. “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”

Apple provided more detail on the CSAM detection system in a technical summary and said its system uses a threshold “set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”
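The mechanism described above can be illustrated with a minimal sketch: images are hashed, hashes are compared against a known-bad set, and an account is flagged only after the number of matches crosses a threshold. This is NOT Apple's implementation — the real system uses NeuralHash perceptual hashes, a blinded hash database, private set intersection, and threshold secret sharing so the device never learns match results — and every name below is hypothetical.

```python
# Hypothetical sketch of threshold-based hash matching. Real CSAM detection
# uses perceptual hashing and cryptographic blinding; plain string hashes
# here are stand-ins for illustration only.

def count_matches(photo_hashes, known_hashes):
    """Count how many of a library's image hashes appear in the known set."""
    return sum(1 for h in photo_hashes if h in known_hashes)

def account_flagged(photo_hashes, known_hashes, threshold):
    """Flag an account only once matches reach the threshold -- the
    threshold is what Apple says keeps the chance of incorrectly
    flagging an account below one in one trillion per year."""
    return count_matches(photo_hashes, known_hashes) >= threshold

# Toy example: two matches against a threshold of three -> not flagged.
known = {"hash_a", "hash_b", "hash_c"}
library = ["hash_a", "hash_x", "hash_b", "hash_y"]
print(account_flagged(library, known, threshold=3))
```

The point of the threshold is that a single accidental hash collision is not enough to trigger review; many independent matches are required before an account is surfaced.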
