HostileShop: LLM Prompt Injection and Security Framework

Hackers

Hackers

7 followers

time
2 months ago
view
2 views

An overview of HostileShop, a tool for generating prompt injections and testing security vulnerabilities in LLM agents, including attack methods and jailbreak mutation techniques.

Loading comments...
affpapa
sigma-africa
sigma-asia
sigma-europe
GamesSportsStreams