Apple's iPhone computer vision has the potential to preserve privacy but also break it completely
Too many of these matches - there's a threshold - and Apple's systems will let Apple staff investigate.
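The standfirst describes a threshold mechanism: photos are matched against a database of known-image hashes, and only once an account's match count crosses a threshold does human review kick in. A deliberately simplified sketch of that gating logic - Apple's actual design involves NeuralHash perceptual hashing, private set intersection and threshold secret sharing, none of which is modeled here, and every name and value below is hypothetical:

```python
def flag_for_review(photo_hashes, known_hashes, threshold):
    """Return True only when enough photos match the known-hash set.

    Hypothetical illustration of threshold gating: a single match
    reveals nothing; human review is triggered only when the match
    count reaches the threshold.
    """
    match_count = sum(1 for h in photo_hashes if h in known_hashes)
    return match_count >= threshold

# Hypothetical hash database and threshold, purely for illustration.
KNOWN_HASHES = {"hash_a", "hash_b", "hash_c"}

flag_for_review(["hash_a", "hash_x"], KNOWN_HASHES, threshold=3)            # below threshold
flag_for_review(["hash_a", "hash_b", "hash_c"], KNOWN_HASHES, threshold=3)  # at threshold
```

The threshold is the whole privacy argument in miniature: set it too low and false positives expose innocent users to review; the questions the article asks about tweaking or subverting the system are questions about exactly this parameter.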
In a blog post "Recognizing People in Photos Through Private On-Device Machine Learning" last month, Apple plumped itself up and strutted its funky stuff, showing off how good its new person recognition process is.
It's awfully science-y: the post runs 3,500 words and reads like a very detailed paper on computer vision, one of the two tags Apple has given it.
We'd know better if Apple were to put out a 3,500-word paper discussing these issues - its capabilities, intentions and safeguards, and the metrics by which a privacy intervention as major as the new CSAM detection can be judged a success.
How many people will it catch? How many people could it endanger if different aspects of it were tweaked or subverted by state mandate or systematic failure? Talk to us, Apple.
Until Apple discusses privacy, AI and ecosystem design at the same depth it's happy to go into about how good its camera app is, we can't tell.
News URL
https://go.theregister.com/feed/www.theregister.com/2021/08/16/ai_vision_apple/