aiconfig vs streamlit_apps

| | aiconfig | streamlit_apps |
|---|---|---|
| Mentions | 29 | 1 |
| Stars | 867 | 0 |
| Growth | 6.1% | - |
| Activity | 9.7 | 8.3 |
| Last Commit | 7 days ago | 5 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
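The exact activity formula is not published; a minimal sketch of one plausible recency-weighted scheme (exponential decay, with a hypothetical `half_life_days` parameter) looks like this:

```python
import math
import time

def activity_score(commit_timestamps, half_life_days=30.0, now=None):
    """Recency-weighted commit activity: each commit contributes a weight
    that decays exponentially with its age, so recent commits count more.
    Illustrative only; not the site's actual formula."""
    now = now if now is not None else time.time()
    half_life_secs = half_life_days * 86400
    return sum(
        math.exp(-math.log(2) * (now - t) / half_life_secs)
        for t in commit_timestamps
    )

# A commit made right now contributes ~1.0; one made a half-life ago ~0.5.
now = time.time()
print(activity_score([now, now - 30 * 86400], half_life_days=30.0, now=now))
```

Under this scheme, two projects with the same commit count can get very different scores depending on how recent those commits are, which matches the description above.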
aiconfig
-
VS Code: Prompt Editor for LLMs (GPT4, Llama, Mistral, etc.)
doesn't collect prompts and there's a way to disable telemetry as well - https://github.com/lastmile-ai/aiconfig/blob/8a5a59d47cef474...
-
Show HN: Gradio Notebook – Notebook UX for Any Generative AI in Hugging Face
You can! We artificially limited that in the gradio notebook component right now, but if you check out the core package it’s built on (aiconfig), it shows how to use OpenAI, Claude, and other model providers:
https://github.com/lastmile-ai/aiconfig
Here’s the currently supported list of models: https://aiconfig.lastmileai.dev/docs/overview/model-parsers
Let me know if we should enable non-HF models in the gradio notebook component by default as well — it should be a simple change
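The comment above describes aiconfig's core idea: prompts and model settings live in a JSON document rather than in code, and model parsers map each prompt to a provider. A minimal sketch of that document shape, assuming the field names (`schema_version`, `prompts`, `metadata.model`) from the aiconfig docs are as published:

```python
import json

# Sketch of an aiconfig-style JSON document with two prompts targeting
# different model providers. Field names follow the aiconfig schema as
# documented; treat them as illustrative, not normative.
config = {
    "name": "multi_model_demo",
    "schema_version": "latest",
    "metadata": {
        "models": {
            "gpt-4": {"temperature": 0.7},
        }
    },
    "prompts": [
        {
            "name": "summarize_gpt",
            "input": "Summarize this article: {{article}}",
            "metadata": {"model": "gpt-4"},
        },
        {
            "name": "summarize_claude",
            "input": "Summarize this article: {{article}}",
            "metadata": {"model": "claude-2"},
        },
    ],
}

print(json.dumps(config, indent=2))
```

At runtime such a file would be loaded and executed through aiconfig's runtime (e.g. loading the JSON and running a prompt by name), with each prompt routed to its provider via the corresponding model parser.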
-
Gradio Notebook – Generative AI Notebook Interface for Hugging Face Spaces
-
Prompt Routing with Zeroshot Technique - AIConfig
AIConfig Web | AIConfig Open Source
-
Trend Detection and Analysis with the AiConfig
One of the primary reasons for using AIConfig is its ease of use and the highly configurable nature of its prompts, including configurable model settings for the underlying LLMs.
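The configurability mentioned above largely comes from parameterized prompt templates: aiconfig prompts use `{{name}}` placeholders that are filled in at run time. A minimal sketch of that substitution step (the real implementation supports more than this; `render_template` is a hypothetical helper, not aiconfig's API):

```python
import re

def render_template(template: str, params: dict) -> str:
    """Fill {{name}} placeholders in a prompt template.
    Minimal sketch: no nesting, escaping, or template helpers."""
    def repl(match):
        key = match.group(1).strip()
        if key not in params:
            raise KeyError(f"missing template parameter: {key}")
        return str(params[key])
    return re.sub(r"\{\{([^{}]+)\}\}", repl, template)

prompt = "Detect trends in the {{domain}} data for {{period}}."
print(render_template(prompt, {"domain": "sales", "period": "Q3"}))
# → Detect trends in the sales data for Q3.
```

Because the template lives in the config file rather than in code, the same analysis prompt can be reused across domains and time periods by changing only the parameters.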
-
Playground for Generative AI
Check out prompt templates and the AIConfig framework here: https://github.com/lastmile-ai/aiconfig
-
Master Prompt Engineering with OpenAI 🧠
Please give our AIConfig repo a star to support the project! https://github.com/lastmile-ai/aiconfig
-
Show HN: Microagents: Agents Capable of Self-Editing Their Prompts / Python Code
Take a look at https://github.com/lastmile-ai/aiconfig for refactoring your prompt management - https://github.com/aymenfurter/microagents/tree/main/prompt_...
-
Master LLM Hallucinations 💭
Show your support by starring our project on GitHub! ⭐️ https://github.com/lastmile-ai/aiconfig
-
Harness the power of multiple LLMs
⭐️ We recently launched AIConfig as our first open-source project. Please support us with a star ⭐️ on GitHub! https://github.com/lastmile-ai/aiconfig/tree/main
streamlit_apps
-
Master Prompt Engineering with OpenAI 🧠
🔗 Prompt Templates (AIConfig) 🔗 AIConfig Repo
What are some alternatives?
openai-node - The official Node.js / Typescript library for the OpenAI API
llama-retrieval-plugin - LLaMa retrieval plugin script using OpenAI's retrieval plugin
speech-ai - A Python package that generates conversational speech from text using a combination of Generative AI models and various text-to-speech engines, enabling applications to produce dynamic and contextually aware spoken responses.
LLMStack - No-code multi-agent framework to build LLM Agents, workflows and applications with your data
Stable-Diffusion-Latent-Space-Explorer - Codebase for performing various experiments with Stable Diffusion, supported by the diffusers library.
microagents - Agents Capable of Self-Editing Their Prompts / Python Code
SlothAI - A simple, but deceptively fast document pipeline manager for AI. Runs Python on AppEngine.
llm-client-sdk - SDK for using LLMs
OpenAI_Agent_Swarm - HAAS = Hierarchical Autonomous Agent Swarm - "Resistance is futile!"
YiVal - Your Automatic Prompt Engineering Assistant for GenAI Applications
llmware - Unified framework for building enterprise RAG pipelines with small, specialized models