Security News

Apple Delays Plans to Scan Devices for Child Abuse Images After Privacy Backlash
2021-09-06 03:11

Apple is temporarily hitting the pause button on its controversial plans to screen users' devices for child sexual abuse material (CSAM), following sustained blowback over worries that the tool could be weaponized for mass surveillance and erode user privacy. In August, Apple detailed several new features intended to help limit the spread of CSAM on its platform: scanning users' iCloud Photos libraries for illicit content, a Communication Safety feature in the Messages app to warn children and their parents when receiving or sending sexually explicit photos, and expanded guidance in Siri and Search when users try to search for CSAM-related topics.

Apple stalls CSAM auto-scan on devices after 'feedback' from everyone on Earth
2021-09-03 20:48

Apple on Friday said it intends to delay its plan to commandeer customers' own devices to scan their iCloud-bound photos for illegal child exploitation imagery, a concession to the broad backlash the initiative provoked. Last month, Apple announced its child safety initiative, which involves adding a nudity-detection algorithm to its Messages chat client to provide a way to control the sharing of explicit images, and running code on customers' iDevices to detect known child sexual abuse material among on-device photos destined for iCloud storage.

Apple launches service program for iPhone 12 no-sound issues
2021-08-29 14:00

Apple has announced a new free-of-charge service program for iPhone 12 and iPhone 12 Pro devices experiencing sound issues caused by a receiver module component. "Apple has determined that a very small percentage of iPhone 12 and iPhone 12 Pro devices may experience sound issues due to a component that might fail on the receiver module," the company said in a new support document.

Fake Apple rep amasses 620,000+ stolen iCloud pics, vids in hunt for images of nude women to trade
2021-08-24 21:37

A California man this month admitted he stole hundreds of thousands of photos and videos from strangers' Apple iCloud accounts to find and share images of nude young women. The man, surnamed Chi and using the online name "Icloudripper4you," worked with other unidentified miscreants to obtain files from Apple customers' iCloud accounts by impersonating Apple customer support representatives in email messages.

More on Apple’s iPhone Backdoor
2021-08-20 13:54

In this post, I'll collect links on Apple's iPhone backdoor for scanning CSAM images. Apple says that hash collisions in its CSAM detection system were expected, and not a concern.

Apple's bright idea for CSAM scanning could start 'persecution on a global basis' – 90+ civil rights groups
2021-08-19 19:22

More than ninety human rights groups from around the world have signed a letter condemning Apple's plans to scan devices for child sexual abuse material - and warned Cupertino could usher in "censorship, surveillance and persecution on a global basis." The US-based Center for Democracy and Technology organised the open letter [PDF], which called on Apple to abandon its approach to mass-scanning.

Apple’s NeuralHash Algorithm Has Been Reverse-Engineered
2021-08-18 16:51

Apple's NeuralHash algorithm - the one it's using for client-side scanning on the iPhone - has been reverse-engineered. Early tests show that it can tolerate image resizing and compression, but not cropping or rotations.
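The resilience described above is characteristic of perceptual hashes in general. The toy sketch below is NOT NeuralHash (which is a neural-network-based hash and has not been published by Apple); it is a minimal "average hash" analogue, written only to illustrate why such hashes survive resizing: both the original and a resized copy reduce to the same small grid of block averages, so the derived bits match.

```python
# Toy perceptual "average hash" (aHash) - an illustrative analogue of
# perceptual hashing, not Apple's NeuralHash.

def downscale(img, size=8):
    """Shrink a 2D grayscale image (list of lists) to size x size by block averaging."""
    h, w = len(img), len(img[0])
    out = []
    for r in range(size):
        row = []
        for c in range(size):
            r0, r1 = r * h // size, (r + 1) * h // size
            c0, c1 = c * w // size, (c + 1) * w // size
            block = [img[i][j] for i in range(r0, r1) for j in range(c0, c1)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

def ahash(img):
    """64-bit hash: each bit says whether a block is brighter than the mean."""
    flat = [p for row in downscale(img) for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# A 16x16 gradient image and a 32x32 nearest-neighbour upscaled copy
# reduce to identical block averages, so their hashes are equal.
base = [[(r + c) * 8 for c in range(16)] for r in range(16)]
resized = [[base[r // 2][c // 2] for c in range(32)] for r in range(32)]
print(ahash(base) == ahash(resized))  # True: resizing preserves the hash
```

Cropping or rotating the image, by contrast, shifts content across block boundaries, which is why those transforms defeat hashes of this family.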

Apple says its CSAM scan code can be verified by researchers. Corellium starts throwing out dollar bills
2021-08-17 22:10

Last week, Apple essentially invited security researchers to probe its forthcoming technology that's supposed to help thwart the spread of known child sexual abuse material. Crucially, Apple repeatedly stated that its claims about its CSAM-scanning software are "subject to code inspection by security researchers like all other iOS device-side security claims." And its senior veep of software engineering, Craig Federighi, went on the record to say "security researchers are constantly able to introspect what's happening in Apple's [phone] software."

Apple: CSAM Image-Detection Backdoor ‘Narrow’ in Scope
2021-08-17 13:58

Privacy groups like the Electronic Frontier Foundation warned that the process of flagging CSAM images narrows the definition of end-to-end encryption to allow client-side access - which means Apple is building a backdoor into its data storage. "Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly scoped backdoor is still a backdoor," the EFF said in reaction to the Apple announcement.

Apple's iPhone computer vision has the potential to preserve privacy but also break it completely
2021-08-16 09:27

In a blog post last month, "Recognizing People in Photos Through Private On-Device Machine Learning," Apple plumped itself up and strutted its funky stuff on how good its new person-recognition process is. The CSAM scheme leans on similar on-device smarts: rack up too many matches against the known-abuse hash database - there's a threshold - and Apple's systems will let Apple staff investigate.
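The threshold gate described above can be reduced to a very small sketch. This is a loose analogue under stated assumptions (a plain set of flagged hashes and a simple counter), not Apple's actual protocol, which uses cryptographic threshold secret sharing so that sub-threshold matches reveal nothing; the names `KNOWN_BAD_HASHES`, `THRESHOLD`, and `review_needed` are all hypothetical.

```python
# Toy sketch of threshold-based flagging: an account is only surfaced for
# human review once its match count crosses a preset threshold.
# KNOWN_BAD_HASHES and THRESHOLD are made-up illustrative values.

KNOWN_BAD_HASHES = {0xDEADBEEF, 0xCAFEBABE}  # hypothetical flagged-hash database
THRESHOLD = 2                                # matches required before review

def review_needed(photo_hashes, threshold=THRESHOLD):
    """Return True once matches against the flagged set reach the threshold."""
    matches = sum(1 for h in photo_hashes if h in KNOWN_BAD_HASHES)
    return matches >= threshold

print(review_needed([0xDEADBEEF, 0x12345678]))       # False: one match, below threshold
print(review_needed([0xDEADBEEF, 0xCAFEBABE, 0x1]))  # True: threshold reached
```

The point of the threshold in Apple's design is to keep single accidental hash collisions from ever reaching a human reviewer; only accumulation past the limit unlocks inspection.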