In the rush to build AI apps, please, please don't leave security behind
2024-03-17 11:04

Code components available from public repositories can contain hidden backdoors or data exfiltrators, and pre-built models and datasets can be poisoned to cause apps to behave in unexpected, inappropriate ways.

Backdoored or malware-spiked libraries and models, if incorporated into shipped software, could leave users of those apps open to attack as well.
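
The loading step itself is a common weak point: many pre-built model files are pickle-based and can execute arbitrary code the moment they are deserialized. As a minimal defensive sketch, assuming a trusted digest was recorded out-of-band when the artifact was first vetted (the file name and digest below are placeholders, not from the article), an app can verify an artifact before touching it:

```python
import hashlib
from pathlib import Path

# Hypothetical pinned digest; record this when the artifact is first vetted,
# not from the same (potentially compromised) source you download from.
EXPECTED_SHA256 = "aaaaaaaa..."  # placeholder value

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so multi-gigabyte model files
    are never read into memory all at once."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

model_path = Path("downloaded_model.bin")  # hypothetical artifact
if sha256_of(model_path) != EXPECTED_SHA256:
    raise RuntimeError("artifact does not match pinned digest; refusing to load")
# Only deserialize after the check passes: pickle-based formats run code on load.
```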

"If you think of a pie chart of how you're gonna get hacked once you open up an AI department in your company or organization," Dan McInerney, lead AI security researcher at Protect AI, told The Register, "a tiny fraction of that pie is going to be model input attacks, which is what everyone talks about. And a giant portion is going to be attacking the supply chain - the tools you use to build the model themselves."

"So in theory, an attacker could have submitted any change to any repository and made it look like it came from Hugging Face, and a security update could have fooled them into accepting it. People would have just had backdoored models or insecure models in their repos and wouldn't know."

Beefing up security in the AI supply chain is tricky: with so many tools and models being built and released, it's hard for defenders to keep up.
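
One check that does scale is static inspection of artifacts before they are ever loaded. As a sketch of the idea (the file name is hypothetical), Python's standard pickletools module can walk a pickle's opcode stream without executing it and flag the opcodes that can trigger imports or calls on load, the same principle model-scanning tools build on:

```python
import pickletools

# Opcodes that let a pickle import modules or call objects when loaded,
# i.e. the hooks a backdoored model file would rely on.
SUSPICIOUS = {"GLOBAL", "STACK_GLOBAL", "REDUCE", "INST", "OBJ", "NEWOBJ", "NEWOBJ_EX"}

def audit_pickle(path: str) -> list[tuple[int, str, object]]:
    """Walk the opcode stream without executing it; return suspicious ops."""
    findings = []
    with open(path, "rb") as f:
        for opcode, arg, pos in pickletools.genops(f):
            if opcode.name in SUSPICIOUS:
                findings.append((pos, opcode.name, arg))
    return findings

for pos, name, arg in audit_pickle("downloaded_model.pkl"):  # hypothetical file
    print(f"offset {pos}: {name} {arg!r}")
```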


News URL

https://go.theregister.com/feed/www.theregister.com/2024/03/17/ai_supply_chain/