promptmap: automatically tests prompt injection attacks on ChatGPT instances

promptmap Prompt injection is a type of security vulnerability that can be exploited to control the behavior of a ChatGPT instance. By injecting malicious prompts into the system, an attacker can force the ChatGPT...