
AI hallucinates software packages and devs download them – even if potentially poisoned with malware
2024-03-28 07:01

According to Bar Lanyado, a security researcher at Lasso Security, one of the businesses fooled by AI into incorporating a hallucinated package is Alibaba, which at the time of writing still includes a pip command to download the Python package huggingface-cli in its GraphTranslator installation instructions.

Lanyado ran his experiment to explore whether these kinds of hallucinated software packages - package names invented by generative AI models, presumably during project development - persist over time, and to test whether those invented names could be co-opted to distribute malicious code by publishing actual packages under the names the AIs dreamed up.
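The article doesn't publish Lanyado's tooling, but the check at the core of such an experiment - is a model-suggested name actually registered? - can be sketched against PyPI's public JSON endpoint (a real API; the function names here are my own):

```python
import urllib.request
import urllib.error


def pypi_url(package_name: str) -> str:
    """Build the PyPI JSON API URL for a package."""
    return f"https://pypi.org/pypi/{package_name}/json"


def exists_on_pypi(package_name: str) -> bool:
    """Return True if the package name is registered on PyPI.

    A 404 from the JSON API means the name is unclaimed -- exactly
    the gap a squatter could fill by uploading a package under a
    hallucinated name.
    """
    try:
        with urllib.request.urlopen(pypi_url(package_name)) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise
```

Running `exists_on_pypi` over a batch of AI-recommended names would flag the unregistered ones, which is where a researcher (or attacker) could step in.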

Last year, through security firm Vulcan Cyber, Lanyado published research detailing how one might pose a coding question to an AI model like ChatGPT and receive an answer that recommends the use of a software library, package, or framework that doesn't exist.

"When an attacker runs such a campaign, he will ask the model for packages that solve a coding problem, then he will receive some packages that don't exist," Lanyado explained to The Register.

Generative AI models are already known to hallucinate nonexistent citations and facts; as it turns out, they will do the same for software packages.

For the attack to work, the AI model must repeat the hallucinated package names in its responses to other users, so that malware published under those names will be sought out and downloaded.
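One defensive measure, not described in the article but implied by the attack, is to screen AI-suggested package names against a vetted allowlist before running pip install. A minimal sketch, where the allowlist contents are hypothetical and would in practice come from an internal registry:

```python
# Sketch: screen AI-suggested package names before installing them.
# VETTED_PACKAGES is a hypothetical internal allowlist, not a real registry.
VETTED_PACKAGES = {"requests", "numpy", "huggingface-hub"}


def screen_suggestions(suggested: list[str]) -> tuple[list[str], list[str]]:
    """Split suggested package names into vetted and unverified lists."""
    vetted = [name for name in suggested if name.lower() in VETTED_PACKAGES]
    unverified = [name for name in suggested if name.lower() not in VETTED_PACKAGES]
    return vetted, unverified


# "huggingface-cli" is the hallucinated name from the article; the real
# Hugging Face client library is distributed on PyPI as "huggingface-hub".
vetted, unverified = screen_suggestions(["requests", "huggingface-cli"])
```

Anything landing in the unverified list would warrant a manual check against the package index before installation.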


News URL

https://go.theregister.com/feed/www.theregister.com/2024/03/28/ai_bots_hallucinate_software_packages/