20% of Generative AI ‘Jailbreak’ Attacks Succeed, With 90% Exposing Sensitive Data

2024-10-09 16:29
On average, it takes adversaries just 42 seconds and five interactions to execute a GenAI jailbreak, according to Pillar Security.
News URL
https://www.techrepublic.com/article/genai-jailbreak-report-pillar-security/
Related news
- CrowdStrike Security Report: Generative AI Powers Social Engineering Attacks (source)
- How New AI Agents Will Transform Credential Stuffing Attacks (source)
- YouTube warns of AI-generated video of its CEO used in phishing attacks (source)
- MINJA sneak attack poisons AI models for other chatbot users (source)
- New ‘Rules File Backdoor’ Attack Lets Hackers Inject Malicious Code via AI Code Editors (source)
- ⚡ THN Weekly Recap: GitHub Supply Chain Attack, AI Malware, BYOVD Tactics, and More (source)
- AI-Powered SaaS Security: Keeping Pace with an Expanding Attack Surface (source)