Posts
All the articles I've posted.
Prompt Injection versus Jailbreaking Definitions
Updated: at 04:32 AM
Simon Willison distinguishes prompt injection (attacks against applications that combine untrusted input with a developer's prompt) from jailbreaking (attacks against an LLM's own safety filters).
Prompt 3
Updated: at 06:17 AM
Excitement about the release of Prompt 3, a new version of Panic's SSH app, despite reservations about its subscription price.
It's OK to Abandon Your Side-Project
Updated: at 04:59 AM
The article discusses the challenge of letting go of side projects that are no longer needed, and notes that building a project can still teach new concepts or skills even if it is eventually abandoned.