
Apple stalls CSAM auto-scan on devices after 'feedback' from everyone on Earth
2021-09-03 20:48

Apple on Friday said it intends to delay the introduction of its plan to commandeer customers' own devices to scan their iCloud-bound photos for illegal child exploitation imagery, a concession to the broad backlash the initiative provoked.

Last month, Apple announced its child safety initiative, which involves adding a nudity-detection algorithm to its Messages chat client to provide a way to control the sharing of explicit images, and running code on customers' iDevices to detect known child sexual abuse material (CSAM) among on-device photos destined for iCloud storage.
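
For illustration only, here is a minimal Swift sketch of the general idea behind matching photos against a list of known-image digests. The digest values, function name, and use of plain SHA-256 via CryptoKit are assumptions made to keep the example self-contained; Apple's announced design instead relies on a perceptual "NeuralHash" and a private set intersection protocol, not a straight cryptographic hash lookup.

    import CryptoKit
    import Foundation

    // Hypothetical blocklist of hex-encoded digests of known images.
    // In Apple's announced design the list holds perceptual (NeuralHash)
    // values supplied by child-safety organisations; SHA-256 is used here
    // only so the sketch runs with standard Apple frameworks.
    let knownImageDigests: Set<String> = [
        "0000000000000000000000000000000000000000000000000000000000000000" // placeholder entry
    ]

    // Returns true if the image bytes hash to an entry on the blocklist.
    // Note that an exact cryptographic hash only matches byte-identical
    // files, whereas a perceptual hash is intended to survive resizing
    // and re-encoding.
    func matchesKnownList(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
        return knownImageDigests.contains(digest)
    }

The sketch only captures the matching step; the controversy described in this article concerns where that matching runs (on the customer's own device) and who controls the list being matched against.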

As NSA whistleblower Edward Snowden put it, "Apple plans to erase the boundary dividing which devices work for you, and which devices work for them."

"With good intentions, Apple has ​​paved the road to mandated security weakness around the world, enabling and reinforcing the arguments that, should the intentions be good enough, scanning through your personal life and private communications is acceptable."

Rather than actually engaging with the security community and the public, Apple published a list of Frequently Asked Questions and responses to address the concern that censorious governments will demand access to the CSAM scanning system to look for politically objectionable images.

"Could governments force Apple to add non-CSAM images to the hash list?" the company asked in its interview of itself, and then responded, "No. Apple would refuse such demands and our system has been designed to prevent that from happening."


News URL

https://go.theregister.com/feed/www.theregister.com/2021/09/03/apple_scanning_pause/
