Rise of deepfake threats means biometric security measures won't be enough
Cyber attacks that use AI-generated deepfakes to bypass facial biometric security will lead a third of organizations to doubt the adequacy of identity verification and authentication tools as standalone protections.
Remote account recovery, for example, might rely on an image of the individual's face to restore access.
Because static images copied from social media and other sources could defeat such checks, security systems added liveness detection to verify that the request came from a live, present individual.
These approaches can now be duped by AI deepfakes and need to be supplemented by additional layers of security, Gartner VP analyst Akif Khan told The Register.
Security system developers are also trying to use AI - typically deep neural networks - to inspect presented images for signs that they are deepfakes.
Organizations should use both approaches to defend against deepfake threats to biometric security, he said.
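The layered approach Khan describes can be illustrated with a minimal sketch. The score names and thresholds below are hypothetical; real systems would derive these values from dedicated liveness-detection and deepfake-inspection models.

```python
def verify_identity(liveness_score: float, deepfake_score: float,
                    liveness_threshold: float = 0.9,
                    deepfake_threshold: float = 0.1) -> bool:
    """Accept a facial-biometric request only if BOTH layers agree:
    the liveness check passes AND a separate deepfake detector
    finds no strong manipulation signal (illustrative thresholds)."""
    passes_liveness = liveness_score >= liveness_threshold   # live-person check
    looks_authentic = deepfake_score <= deepfake_threshold   # deepfake inspection
    return passes_liveness and looks_authentic

# A genuine live capture passes both layers:
print(verify_identity(liveness_score=0.97, deepfake_score=0.03))  # True
# A deepfake that fools liveness detection is still rejected:
print(verify_identity(liveness_score=0.95, deepfake_score=0.80))  # False
```

The point of the sketch is that neither signal alone is trusted: a request that passes liveness detection but triggers the deepfake inspector is still rejected.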
News URL
https://go.theregister.com/feed/www.theregister.com/2024/02/01/deepfake_threat_biometrics/
Related news
- Eliminating AI Deepfake Threats: Is Your Identity Security AI-Proof? (source)
- Obsidian Security Warns of Rising SaaS Threats to Enterprises (source)
- AWS security essentials for managing compliance, data protection, and threat detection (source)
- Privileged Accounts, Hidden Threats: Why Privileged Access Security Must Be a Top Priority (source)