Security News > 2023 > September > UK’s NCSC Warns Against Cybersecurity Attacks on AI
The National Cyber Security Centre provides details on prompt injection and data poisoning attacks so organizations using machine-learning models can mitigate the risks.
Large language models used in artificial intelligence, such as ChatGPT or Google Bard, are prone to different cybersecurity attacks, in particular prompt injection and data poisoning.
AIs are trained not to provide offensive or harmful content, unethical answers or confidential information; prompt injection attacks create an output that generates those unintended behaviors.
Prompt injection attacks work the same way as SQL injection attacks, which enable an attacker to manipulate text input to execute unintended queries on a database.
A less dangerous prompt injection attack consists of having the AI provide unethical content such as using bad or rude words, but it can also be used to bypass filters and create harmful content such as malware code.
Prompt injection attacks may also target the inner working of the AI and trigger vulnerabilities in its infrastructure itself.
News URL
https://www.techrepublic.com/article/uks-ncsc-warns-against-cybersecurity-attacks-on-ai/
Related news
- Microsoft Fixes AI, Cloud, and ERP Security Flaws; One Exploited in Active Attacks (source)
- A Guide to Securing AI App Development: Join This Cybersecurity Webinar (source)
- Treat AI like a human: Redefining cybersecurity (source)
- Shape the future of UK cyber security (source)
- US sanctions Chinese cybersecurity company for firewall compromise, ransomware attacks (source)
- US Sanctions Chinese Cybersecurity Firm for 2020 Ransomware Attack (source)
- The sixth sense of cybersecurity: How AI spots threats before they strike (source)
- New AI Jailbreak Method 'Bad Likert Judge' Boosts Attack Success Rates by Over 60% (source)
- Cybersecurity in 2025: Global conflict, grown-up AI, and the wisdom of the crowd (source)
- Preventing the next ransomware attack with help from AI (source)