Python prompt-injection

Open-source Python projects categorized as prompt-injection

Top 5 Python prompt-injection Projects

  • llm-guard

    The Security Toolkit for LLM Interactions

  • Project mention: llm-guard: The Security Toolkit for LLM Interactions | /r/blueteamsec | 2023-09-19
  • promptmap

    automatically tests prompt injection attacks on ChatGPT instances

  • Project mention: Promptmap – automatically tests prompt injection attacks on ChatGPT instances | news.ycombinator.com | 2023-07-17
  • InfluxDB

    Power Real-Time Data Analytics at Scale. Get real-time insights from all types of time series data with InfluxDB. Ingest, query, and analyze billions of data points in real-time with unbounded cardinality.

    InfluxDB logo
  • aegis

    Self-hardening firewall for large language models (by automorphic-ai)

  • Project mention: Show HN: Firewall for LLMs–Guard Against Prompt Injection, PII Leakage, Toxicity | news.ycombinator.com | 2023-06-28

    Hey HN,

    We're building Aegis, a firewall for LLMs: a guard against adversarial attacks, prompt injections, toxic language, PII leakage, etc.

    One of the primary concerns entwined with building LLM applications is the chance of attackers subverting the model’s original instructions via untrusted user input, which unlike in SQL injection attacks, can’t be easily sanitized. (See https://greshake.github.io/ for the mildest such instance.) Because the consequences are dire, we feel it’s better to err on the side of caution, with something mutli-pass like Aegis, which consists of a lexical similarity check, a semantic similarity check, and a final pass through an ML model.

    We'd love for you to check it out—see if you can prompt inject it!, and give any suggestions/thoughts on how we could improve it: https://github.com/automorphic-ai/aegis.

    If you want to play around with it without creating an account, try the playground: https://automorphic.ai/playground.

    If you're interested in or need help using Aegis, have ideas, or want to contribute, join our [Discord](https://discord.com/invite/E8y4NcNeBe), or feel free to reach out at [email protected]. Excited to hear your feedback!

    Repository: https://github.com/automorphic-ai/aegis

  • Prompt-Injection-Testing-Tool

    The Prompt Injection Testing Tool is a Python script designed to assess the security of your AI system's prompt handling against a predefined list of user prompts commonly used for injection attacks. This tool utilizes the OpenAI GPT-3.5 model to generate responses to system-user prompt pairs and outputs the results to a CSV file for analysis.

  • Project mention: Show HN: Prompt Injection Testing Tool – GitHub | news.ycombinator.com | 2024-03-20
  • raccoon

    Let Raccoon sample the unknown, safeguarding your AI's home. (by velocitatem)

  • SaaSHub

    SaaSHub - Software Alternatives and Reviews. SaaSHub helps you find the best software and product alternatives

    SaaSHub logo
NOTE: The open source projects on this list are ordered by number of github stars. The number of mentions indicates repo mentiontions in the last 12 Months or since we started tracking (Dec 2020).

Python prompt-injection related posts

  • Show HN: Firewall for LLMs–Guard Against Prompt Injection, PII Leakage, Toxicity

    1 project | news.ycombinator.com | 28 Jun 2023
  • We’ve built a free firewall for LLMs (Aegis) — Say goodbye to prompt injections, prompt leakage, and toxic language (100+ stars)

    1 project | /r/ChatGPTPro | 28 Jun 2023
  • Try your best prompts—especially prompt injections—against Aegis, our firewall for LLMs

    1 project | /r/GPT_jailbreaks | 28 Jun 2023

Index

What are some of the best open-source prompt-injection projects in Python? This list will help you:

Project Stars
1 llm-guard 886
2 promptmap 526
3 aegis 243
4 Prompt-Injection-Testing-Tool 19
5 raccoon 6

Sponsored
SaaSHub - Software Alternatives and Reviews
SaaSHub helps you find the best software and product alternatives
www.saashub.com