Security News > 2023 > December > Data Exfiltration Using Indirect Prompt Injection

Data Exfiltration Using Indirect Prompt Injection
2023-12-22 12:05

In Writer, users can enter a ChatGPT-like session to edit or create their documents.

In this chat session, the LLM can retrieve information from sources on the web to assist users in creation of their documents.

We show that attackers can prepare websites that, when a user adds them as a source, manipulate the LLM into sending private information to the attacker or perform other malicious activities.

The data theft can include documents the user has uploaded, their chat history or potentially specific private information the chat model can convince the user to divulge at the attacker's behest.


News URL

https://www.schneier.com/blog/archives/2023/12/data-exfiltration-using-indirect-prompt-injection.html