Apple’s scanning iCloud photos for child abuse images (Security News, January 2020)

Apple has confirmed that it's automatically scanning images backed up to iCloud to ferret out child abuse images.
Jane Horvath, Apple's senior director of global privacy, didn't elaborate on the specific technology Apple is using, but whether the company relies on its own tools or on one such as Microsoft's PhotoDNA, it is certainly not alone in using automated scanning to find illegal images.
Since 2008, the National Center for Missing & Exploited Children (NCMEC) has made available to ISPs and other service providers a list of hash values for known child sexual abuse images, enabling companies to check large volumes of files for matches without having to keep copies of the offending images themselves.
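The hash-list approach described above can be sketched in a few lines. This is an illustrative simplification, not Apple's or NCMEC's actual system: real deployments use perceptual hashes such as PhotoDNA, which survive resizing and re-encoding, whereas this sketch uses a plain SHA-256 digest, and the `KNOWN_HASHES` set here is a hypothetical stand-in for the NCMEC-provided list.

```python
import hashlib

# Hypothetical stand-in for a provider-supplied list of known hashes.
# (This entry is the SHA-256 digest of the byte string b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_hex(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

def is_known_image(data: bytes, known: set) -> bool:
    """Compare a file's digest against the known-hash set.

    Note that only the digest is compared -- the scanner never needs
    to inspect image content or store a copy of any offending image.
    """
    return sha256_hex(data) in known

print(is_known_image(b"test", KNOWN_HASHES))            # True: digest is in the list
print(is_known_image(b"vacation photo", KNOWN_HASHES))  # False: no match
```

The key property is the one the article describes: the checking party holds only opaque hash values, so matching can be done at scale without anyone retaining or viewing the images the list refers to.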
Apple's own privacy disclosures state that, as part of its commitment to child safety, the company "uses image matching technology to help find and report child exploitation."
In sum, scanning for child abuse material isn't new; all the tech giants are doing it, and Apple isn't reading actual messages or looking directly at photo content.
News URL
https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/
Related news
- US lawmakers press Trump admin to oppose UK's order for Apple iCloud backdoor
- Apple pulls iCloud end-to-end encryption feature in the UK
- Apple Drops iCloud's Advanced Data Protection in the U.K. Amid Encryption Backdoor Demands
- Rather than add a backdoor, Apple decides to kill iCloud encryption for UK peeps
- UK Demanded Apple Add a Backdoor to iCloud
- Protecting your iCloud data after Apple’s Advanced Data Protection removal in the UK