Prompt Injection
Prompt Injection is a security vulnerability in AI systems, particularly large language models (LLMs). It occurs when an attacker manipulates model behavior by inserting malicious instructions into user input. As a result, the model may bypass intended restrictions, leak information,...