Cast a hex on ChatGPT to trick the AI into writing exploit code
2024-10-29 22:30
'It was like watching a robot going rogue,' says researcher

OpenAI's language model GPT-4o can be tricked into writing exploit code by encoding the malicious instructions in hexadecimal, which allows an attacker to jump the model's built-in security guardrails and abuse the AI for evil purposes, according to 0Din researcher Marco Figueroa.…
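The core of the trick is nothing exotic: hexadecimal is a trivially reversible encoding, so text that filters would flag in plain form can be smuggled through as a string of hex digits and decoded on the other side. A minimal Python sketch of the round trip (the sample string is a hypothetical placeholder, not the researcher's actual prompt):

```python
def to_hex(text: str) -> str:
    """Encode a UTF-8 string as a string of hexadecimal digits."""
    return text.encode("utf-8").hex()


def from_hex(hex_str: str) -> str:
    """Decode a hexadecimal digit string back into text."""
    return bytes.fromhex(hex_str).decode("utf-8")


# Hypothetical placeholder text, purely for illustration.
msg = "example instruction"
encoded = to_hex(msg)
print(encoded)                     # hex digits carry the same content
assert from_hex(encoded) == msg    # decoding recovers the original
```

Because the encoded form carries the same information as the plaintext, guardrails that only inspect the surface text can miss it, which is the gap the report says this jailbreak exploits.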
News URL
https://go.theregister.com/feed/www.theregister.com/2024/10/29/chatgpt_hex_encoded_jailbreak/