Security News > 2024 > December > Jailbreaking LLM-Controlled Robots

Jailbreaking LLM-Controlled Robots
2024-12-11 12:02

Surprising no one, it’s easy to trick an LLM-controlled robot into ignoring its safety instructions.


News URL

https://www.schneier.com/blog/archives/2024/12/jailbreaking-llm-controlled-robots.html