Hacker News
Ask HN: What would be the impact of an LLM output injection attack?
3 points by subw00f 3 hours ago | 1 comment
mavdol04 | 3 hours ago
The worst that could happen is having your credentials stolen. Since it's an architectural flaw of LLMs, the fix has to live at the tool level, so in my opinion the only real prevention is still sandboxing, or at least sandboxing the tools themselves.
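To illustrate the "sandboxing the tools themselves" idea, here is a minimal Python sketch (the names `ALLOWED_TOOLS` and `dispatch` are hypothetical, not any real framework's API): the model's output is treated as untrusted, and a tool call only executes if it passes an allowlist check plus per-tool argument validation.

```python
import json

# Hypothetical allowlist: tool name -> validator for its arguments.
# The LLM's output is untrusted, so every field is checked before dispatch.
ALLOWED_TOOLS = {
    "read_file": lambda args: isinstance(args.get("path"), str)
                              and not args["path"].startswith("/etc"),
    "search": lambda args: isinstance(args.get("query"), str)
                           and len(args["query"]) < 200,
}

def dispatch(llm_output: str):
    """Parse a tool call from model output; refuse anything off-allowlist."""
    try:
        call = json.loads(llm_output)
    except json.JSONDecodeError:
        return ("rejected", "not valid JSON")
    tool = call.get("tool")
    args = call.get("args", {})
    if tool not in ALLOWED_TOOLS:
        return ("rejected", f"unknown tool: {tool!r}")
    if not ALLOWED_TOOLS[tool](args):
        return ("rejected", f"arguments failed validation for {tool!r}")
    return ("accepted", (tool, args))

# An injected "tool call" smuggled into model output is rejected,
# while a well-formed call to an allowed tool passes.
print(dispatch('{"tool": "delete_all", "args": {}}')[0])                    # rejected
print(dispatch('{"tool": "read_file", "args": {"path": "notes.txt"}}')[0])  # accepted
```

This only narrows the blast radius of injected output; running the accepted tools inside an actual sandbox (container, seccomp, restricted credentials) is still the stronger layer the comment is pointing at.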