What Is a Prompt Injection Attack? [Examples & Prevention]
A prompt injection attack is a GenAI security threat where an attacker deliberately crafts and inputs deceptive text into a large language model (LLM) to manipulate its outputs.