Tag: prompt
All the articles with the tag "prompt".
Prompt Injection versus Jailbreaking Definitions
Updated: at 04:32 AMSimon Willison distinguishes between prompt injection and jailbreaking in attacks against applications using Large Language Models (LLMs) based on their methods of subverting the systems' safety filters.