
Can ChatGPT bash together some data-stealing code? With the right prompts, sure
2023-04-04 22:00

Aaron Mulgrew, a Forcepoint staffer, has blogged about how he used ChatGPT to craft code that exfiltrates data from an infected machine.

Mulgrew says producing the tool took "only a few hours." His Tuesday write-up of the experiment can be found here, though ignore the stuff about zero days and the bot writing code that would take normal programmers days to do.

"Combing the snippets using a prompt was surprisingly the easiest part, as I simply needed to post the code snippets I had managed to get ChatGPT to generate and combine them together," he wrote.

Since most high-value documents worth stealing are likely to be larger than 1MB, Mulgrew asked ChatGPT to write code that splits a PDF into 100KB pieces and inserts each chunk into its own PNG, with the images then exfiltrated to the attacker's cloud storage.
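Stuffing file chunks into PNGs is classic steganography territory. For illustration only, here is a minimal Python sketch of such a chunk-and-embed step, assuming a simple least-significant-bit scheme with a blank carrier image and a 4-byte length header; the file names and parameters are our own choices, not Mulgrew's, and it deliberately contains no exfiltration logic.

from pathlib import Path
from PIL import Image  # pip install pillow

CHUNK_SIZE = 100 * 1024  # 100KB pieces, per the article

def embed_chunk(chunk, out_path, size=(640, 480)):
    # Prefix the chunk with its length so a decoder can recover it later.
    payload = len(chunk).to_bytes(4, "big") + chunk
    bits = "".join(f"{byte:08b}" for byte in payload)
    if len(bits) > size[0] * size[1] * 3:  # one bit per colour channel
        raise ValueError("chunk too large for carrier image")
    img = Image.new("RGB", size, "white")  # blank carrier for simplicity
    pixels = img.load()
    i = 0
    for y in range(size[1]):
        for x in range(size[0]):
            channels = list(pixels[x, y])
            for c in range(3):
                if i < len(bits):
                    # Overwrite the channel's lowest bit with one payload bit.
                    channels[c] = (channels[c] & ~1) | int(bits[i])
                    i += 1
            pixels[x, y] = tuple(channels)
    img.save(out_path, "PNG")  # PNG is lossless, so the hidden bits survive

def split_and_embed(document):
    data = Path(document).read_bytes()
    for n in range(0, len(data), CHUNK_SIZE):
        embed_chunk(data[n:n + CHUNK_SIZE], f"chunk_{n // CHUNK_SIZE}.png")

split_and_embed("report.pdf")  # hypothetical input document

A decoder would simply reverse the process: read the payload length from the first 32 least-significant bits of each image, extract that many bytes, and reassemble the chunks in order.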

With some tweaks - such as asking ChatGPT to delay the program's start by two minutes, which defeats sandbox-based scanners that only observe a sample briefly - and other massaging, such as obfuscating the code, he was eventually able to get the program through VirusTotal without any alarms going off, or so we're told.

ChatGPT recognizes direct commands such as "obfuscate the code to avoid detection" as unethical and blocks them, so would-be attackers have to get creative with their input prompts.


News URL

https://go.theregister.com/feed/www.theregister.com/2023/04/04/chatgpt_exfiltration_tool/