| | plunkylib | TalkWithYourFiles |
|---|---|---|
| Mentions | 7 | 7 |
| Stars | 12 | 86 |
| Growth | - | - |
| Activity | 5.1 | 7.9 |
| Last commit | 10 months ago | 6 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
plunkylib
- OpenAI will discontinue support for their Codex API
If the chat approach annoys you, you can use libraries like plunkylib, which uses a more convenient YAML and text-file syntax for coordinating queries. LangChain is another great library that can help abstract that away for you as well.
- GPTerm - Chatting with OpenAI's GPT-3 Models on your terminal
Really cool! Similarly, you can use https://github.com/Mattie/plunkylib in CLI mode as well (or in a notebook), storing your prompts in reusable/expandable text files.
- Plunkylib -- Friendly Python library/notebooks for using txt/yaml for convenient prompt reuse
- What framework(s) are you good folks using to make web-based interfaces for GPT3 (& others)?
FWIW, I made a Python library that at least helps you iterate on language-model parameters and prompts using YAML/flat text rather than hard-coding all of that. Nothing special, but it's great if you do a lot of adjusting of prompts/parameters for lots of different uses (and for different engines). https://github.com/Mattie/plunkylib It's definitely great for bots/engines (I already use it for a number of them).
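The pattern these comments describe -- keeping prompt text and engine parameters in flat files instead of in code -- can be sketched with the standard library alone. This is an illustrative sketch, not plunkylib's actual schema: the file names, keys, and `load_query` helper are made up, and JSON stands in for YAML to stay dependency-free.

```python
import json
import tempfile
from pathlib import Path
from string import Template

# Illustrative only: these file names and keys are NOT plunkylib's schema.
workdir = Path(tempfile.mkdtemp())

# The prompt text lives in a flat file, editable without touching code.
(workdir / "summarize.txt").write_text(
    "Summarize the following notes in $style style:\n$notes\n"
)

# Engine parameters live in a separate config file, so you can swap
# models or tweak sampling without a code change.
(workdir / "summarize.params.json").write_text(json.dumps({
    "model": "text-davinci-003",
    "temperature": 0.3,
    "max_tokens": 256,
}))

def load_query(name, **fields):
    """Hypothetical helper: combine an external prompt template with its
    parameter file and fill in the template fields."""
    prompt = Template((workdir / f"{name}.txt").read_text()).substitute(fields)
    params = json.loads((workdir / f"{name}.params.json").read_text())
    return prompt, params

prompt, params = load_query("summarize", style="bullet", notes="meeting notes")
print(prompt.splitlines()[0])
print(params["temperature"])
```

The payoff is the same one the comments point to: adjusting a prompt or a temperature becomes a text edit rather than a code change.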
- GPT3/DALL-E2 Discord bot with medium/long term memory!
Rather than hard-coding your params/tokens/prompts, I have a library called plunkylib that makes it easier to define all of that in txt/yaml files, so you can mix, match, and adjust them more easily than by changing the code itself. I noticed that things like your summary prompt (and some others) are in the code -- it might be better to move them into parameterized external files.
- Obsidian notes as core parameters + GPT3 = digital me
I released a library called Plunkylib that lets you work with the API more comfortably through text files; you may find it more pleasant to work with, or a good base for building your own interactive use cases.
TalkWithYourFiles
- Using Streamlit to upload multiple files to interact with Langchain
For example, I'm using st.file_uploader to manage uploading multiple files, and I'm processing them with LangChain in this project -- I don't require any directories.
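The point in that comment is that Streamlit's `st.file_uploader(..., accept_multiple_files=True)` hands you in-memory, file-like objects, so uploads can be processed directly with no directories involved. A minimal stdlib sketch of the same idea, with `io.BytesIO` standing in for Streamlit's uploaded-file objects and an illustrative `extract_texts` helper:

```python
import io

# Stand-ins for the in-memory, file-like objects that
# st.file_uploader(..., accept_multiple_files=True) returns.
uploaded = [
    io.BytesIO(b"first document text"),
    io.BytesIO(b"second document text"),
]

def extract_texts(files):
    """Decode each in-memory upload. A real app would branch on file
    type (PDF, txt, ...) before handing the text to a LangChain chain."""
    return [f.read().decode("utf-8") for f in files]

texts = extract_texts(uploaded)
print(len(texts))
```

Because everything stays in memory, the same code works whether the files came from an upload widget or a test fixture.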
- Talk With Your Files - Open Source LLM-GUI project with Langchain & Streamlit
- Feedback for my open source project? Talk With Your Files LLM-GUI app
What are some alternatives?
obsidian-gpt - Obsidian plugin for getting language model completions from GPT-3, ChatGPT, Cohere, and others
snowChat - Chat with your Snowflake database - text-to-SQL
cog-stable-diffusion - Diffusers Stable Diffusion as a Cog model
pansophy - knowledge graph with LLMs
Helix - Engineering Consciousness
repochat - Chatbot assistant enabling GitHub repository interaction using LLMs with Retrieval Augmented Generation
GPT3Discord - MOVED
semchunk - A fast and lightweight pure Python library for splitting text into semantically meaningful chunks.
streamlit-langchain-chatbot-bedrock-redis-memory - Build a Streamlit app with LangChain and Amazon Bedrock - Use ElastiCache Serverless Redis for chat history, deploy to EKS and manage permissions with EKS Pod Identity