Security News > 2023 > September > UK’s NCSC Warns Against Cybersecurity Attacks on AI

The National Cyber Security Centre provides details on prompt injection and data poisoning attacks so organizations using machine-learning models can mitigate the risks.
Large language models used in artificial intelligence, such as ChatGPT or Google Bard, are prone to different cybersecurity attacks, in particular prompt injection and data poisoning.
AIs are trained not to provide offensive or harmful content, unethical answers or confidential information; prompt injection attacks create an output that generates those unintended behaviors.
Prompt injection attacks work the same way as SQL injection attacks, which enable an attacker to manipulate text input to execute unintended queries on a database.
A less dangerous prompt injection attack consists of having the AI provide unethical content such as using bad or rude words, but it can also be used to bypass filters and create harmful content such as malware code.
Prompt injection attacks may also target the inner working of the AI and trigger vulnerabilities in its infrastructure itself.
News URL
https://www.techrepublic.com/article/uks-ncsc-warns-against-cybersecurity-attacks-on-ai/
Related news
- CrowdStrike Security Report: Generative AI Powers Social Engineering Attacks (source)
- How New AI Agents Will Transform Credential Stuffing Attacks (source)
- YouTube warns of AI-generated video of its CEO used in phishing attacks (source)
- Can AI-powered gamified simulations help cybersecurity teams keep up? (source)
- MINJA sneak attack poisons AI models for other chatbot users (source)
- New ‘Rules File Backdoor’ Attack Lets Hackers Inject Malicious Code via AI Code Editors (source)
- ⚡ THN Weekly Recap: GitHub Supply Chain Attack, AI Malware, BYOVD Tactics, and More (source)
- AI-Powered SaaS Security: Keeping Pace with an Expanding Attack Surface (source)
- EU invests €1.3 billion in AI and cybersecurity (source)
- 3 Ways the UK Government Plans to Tighten Cyber Security Rules with New Bill (source)