Security News > 2023 > September > UK’s NCSC Warns Against Cybersecurity Attacks on AI
The National Cyber Security Centre provides details on prompt injection and data poisoning attacks so organizations using machine-learning models can mitigate the risks.
Large language models used in artificial intelligence, such as ChatGPT or Google Bard, are prone to different cybersecurity attacks, in particular prompt injection and data poisoning.
AIs are trained not to provide offensive or harmful content, unethical answers or confidential information; prompt injection attacks create an output that generates those unintended behaviors.
Prompt injection attacks work the same way as SQL injection attacks, which enable an attacker to manipulate text input to execute unintended queries on a database.
A less dangerous prompt injection attack consists of having the AI provide unethical content such as using bad or rude words, but it can also be used to bypass filters and create harmful content such as malware code.
Prompt injection attacks may also target the inner working of the AI and trigger vulnerabilities in its infrastructure itself.
News URL
https://www.techrepublic.com/article/uks-ncsc-warns-against-cybersecurity-attacks-on-ai/
Related news
- One-Third of UK Teachers Lack Cybersecurity Training, While 34% Experience Security Incidents (source)
- UK nuclear site Sellafield fined $440,000 for cybersecurity shortfalls (source)
- 20% of Generative AI ‘Jailbreak’ Attacks Succeed, With 90% Exposing Sensitive Data (source)
- What lies ahead for AI in cybersecurity (source)
- Cybersecurity Awareness Lags as Global Workforce Engages in Risky AI Practices (source)
- 99% of UK Businesses Faced Cyber Attacks in the Last Year (source)
- From Misuse to Abuse: AI Risks and Attacks (source)
- AI-Assisted Attacks Top Cyber Threat For Third Consecutive Quarter, Gartner Finds (source)
- Google Cloud Cybersecurity Forecast 2025: AI, geopolitics, and cybercrime take centre stage (source)
- Using AI to drive cybersecurity risk scoring systems (source)