I'm building a consumer-facing prompt manager (https://promptfolder.com/ - no API yet), but I've looked at most of the other tools... unfortunately, I haven't seen anything that's a solid fit for what you're looking for.
You might want to contact the devs behind PromptPal (https://github.com/PromptPal/PromptPal). They seem to be some of the more technically advanced and fast-moving people building in this area... maybe they can hack something together for you.
AIPRM is one of the more popular prompt managers, but as of May they don't have a public API (https://forum.aiprm.com/t/use-aiprm-prompt-templates-api/382...).
Maybe you can try https://flows.network/, which supports environment variables when creating an app. You can also manage the environment variables later through their UI. If you open https://flows.network/flow/createByTemplate/Telegram-ChatGPT, you can see where to set up system_prompt.
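For context, flows on flows.network are written in Rust, and an app-level environment variable like the one above would typically be read at runtime. A minimal sketch of that pattern (the variable name `SYSTEM_PROMPT` and the fallback text are illustrative assumptions, not taken from their docs):

```rust
use std::env;

/// Read the system prompt from an environment variable, falling back to a
/// default when it is unset. The variable name is a hypothetical example.
fn system_prompt() -> String {
    env::var("SYSTEM_PROMPT")
        .unwrap_or_else(|_| "You are a helpful assistant.".to_string())
}

fn main() {
    // The prompt can then be passed along with each chat request.
    println!("Using system prompt: {}", system_prompt());
}
```

The appeal of this setup is that you can change the bot's persona from the platform UI without redeploying the flow.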
Related posts
- Rust boosts LLM app development: Make a serverless Japanese Learning bot in mins
- Create your own ChatGPT bot (in Rust) to review & summarize GitHub Pull Requests
- Create and deploy your own ChatGPT bot to review & summarize GitHub Pull Requests
- Rust API for OpenAI workflows. Enable ChatGPT on your own GitHub repo to review & summarize Pull Requests
- A ChatGPT Bot (in Rust) to Review and Summarize GitHub Pull Requests