Apple Delays Plans to Scan Devices for Child Abuse Images After Privacy Backlash
Apple is temporarily hitting the pause button on its controversial plans to screen users' devices for child sexual abuse material (CSAM), after sustained blowback over concerns that the tool could be weaponized for mass surveillance and erode user privacy.
In August, Apple detailed several new features intended to help limit the spread of CSAM on its platforms, including scanning users' iCloud Photos libraries for illicit content, a Communication Safety option in the Messages app to warn children and their parents when receiving or sending sexually explicit photos, and expanded guidance in Siri and Search when users attempt to look up CSAM-related topics.
The so-called NeuralHash technology would have worked by matching hashes of photos on users' iPhones, iPads, and Macs just before they were uploaded to iCloud Photos against a database of hashes of known child sexual abuse imagery maintained by the National Center for Missing and Exploited Children (NCMEC), without Apple having to possess the images or learn their contents.
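To make the general mechanics concrete, the sketch below shows the basic shape of on-device hash matching: hash each photo locally and compare the result against a blocklist before upload. This is not Apple's NeuralHash, which is a neural-network-based perceptual hash layered with private set intersection and threshold secret sharing so that matches are only revealed to Apple past a threshold; here a toy 8x8 average hash stands in for the perceptual hash, and the KNOWN_CSAM_HASHES set and check_before_upload function are hypothetical placeholders.

```python
# Simplified illustration of client-side hash matching against a blocklist.
# NOT Apple's NeuralHash: a toy average hash stands in for the neural
# perceptual hash, and the blocklist below is a hypothetical placeholder.

from PIL import Image

HASH_SIZE = 8  # 8x8 grid -> 64-bit hash


def average_hash(path: str) -> int:
    """Downscale to grayscale 8x8 and set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


# Hypothetical blocklist of hashes of known illegal imagery; in the real
# design the client only sees a blinded/encrypted form of NCMEC's hash list.
KNOWN_CSAM_HASHES: set[int] = set()


def check_before_upload(path: str) -> bool:
    """Return True if the photo's hash matches the blocklist (i.e., would be flagged)."""
    return average_hash(path) in KNOWN_CSAM_HASHES
```

The key property the design aimed for is that the comparison happens on the device against opaque hashes, so neither the user's photos nor the reference imagery needs to be shared in the clear for a match to be detected.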
The measures aimed to strike a compromise between protecting customers' privacy and meeting growing demands from government agencies in investigations pertaining to terrorism and child pornography - and, by extension, to offer a solution to the so-called "going dark" problem of criminals taking advantage of encryption protections to cloak their contraband activities.
In an email circulated internally at Apple, child safety campaigners were found to have dismissed the complaints of privacy activists and security researchers as the "screeching voice of the minority."
Apple has since sought to assuage concerns about unintended consequences, pushing back against the possibility that the system could be repurposed to detect other kinds of photos at the request of authoritarian governments.