A resource for understanding prompt injection and why it matters.
🌐 Live site: https://promptinjection.wtf
Prompt injection is when AI systems can be tricked into ignoring their instructions. It's a fundamental security flaw that every major AI system currently has, and there's no clear fix.
This site aims to:
- Explain the problem in plain English
- Track news and developments
- Eventually: Build demonstrations that make the danger visceral
- Add news items or research papers via PR
- Improve explanations
- Share examples (responsibly)
Join the Discord for coordination
MIT - This is a community resource.
