LangChain_PDFChat_Oobabooga vs oobaboogas-webui-langchain_agent

| | LangChain_PDFChat_Oobabooga | oobaboogas-webui-langchain_agent |
|---|---|---|
| Mentions | 4 | 2 |
| Stars | 66 | 73 |
| Growth | - | - |
| Activity | 3.0 | 5.0 |
| Last Commit | about 1 year ago | 9 months ago |
| Language | Python | Python |
| License | - | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
LangChain_PDFChat_Oobabooga
- Ask PDF functionality?
  sebaxzero/LangChain_PDFChat_Oobabooga: oobabooga text-generation-webui implementation of wafflecomposite's langchain-ask-pdf-local (github.com)
- LangChain and a self-hosted LLaMA API
  Here you can find a way to use the Oobabooga API with LangChain: https://github.com/sebaxzero/LangChain_PDFChat_Oobabooga
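The core idea behind the linked repo is to call a locally hosted Oobabooga instance over HTTP instead of a hosted LLM. A minimal sketch, assuming the legacy blocking API of text-generation-webui; the endpoint path, payload fields, and response shape are assumptions and differ between webui versions:

```python
import json
import urllib.request

# Assumed endpoint of a locally running text-generation-webui started
# with its API extension enabled. The path below follows the legacy
# blocking API and varies across versions -- check your install.
API_URL = "http://localhost:5000/api/v1/generate"

def build_payload(prompt: str, max_new_tokens: int = 250) -> dict:
    """Build the JSON body the generate endpoint expects (assumed fields)."""
    return {
        "prompt": prompt,
        "max_new_tokens": max_new_tokens,
        "do_sample": True,
        "temperature": 0.7,
    }

def oobabooga_generate(prompt: str) -> str:
    """POST a prompt to the local API and return the generated text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The legacy API wraps the completion in results[0]["text"] (assumed).
    return body["results"][0]["text"]
```

Only the standard library is used here; a real project would typically use `requests` and read the endpoint from configuration.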
- Using a local LLM for large-scale text analysis
  Maybe turn your data into a PDF or text file, then feed it to https://github.com/imartinez/privateGPT, or use Oobabooga with https://github.com/sebaxzero/LangChain_PDFChat_Oobabooga
- LangChain run fully locally on GPU using Oobabooga
  I was doing some testing and managed to get a LangChain PDF chat bot working with the Oobabooga API, all running locally on my GPU, using the main code from langchain-ask-pdf-local with the webui class from oobaboogas-webui-langchain_agent. This is the result (100% not my code, I just copied and pasted it): PDFChat_Oobabooga.
oobaboogas-webui-langchain_agent
- LangChain run fully locally on GPU using Oobabooga (same post as above; it mentions both projects)
- Got Oobabooga to become an agent using LangChain and its own API
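The agent and PDF-chat setups described above boil down to exposing the local API as a LangChain-compatible LLM. Below is a standalone sketch with no langchain import so the mechanics are visible; in practice you would subclass `langchain.llms.base.LLM` and implement `_call`/`_llm_type`. The endpoint URL and response shape are assumptions based on the legacy webui API, and the injectable `transport` parameter is a hypothetical convenience for offline testing, not part of either project:

```python
import json
import urllib.request
from typing import Callable, Optional

class OobaboogaLLM:
    """Standalone stand-in for a LangChain custom-LLM wrapper."""

    def __init__(
        self,
        api_url: str = "http://localhost:5000/api/v1/generate",  # assumed legacy endpoint
        transport: Optional[Callable[[str, dict], dict]] = None,
    ):
        self.api_url = api_url
        # transport is injectable so the class can be exercised without a server
        self._transport = transport or self._http_post

    def _http_post(self, url: str, payload: dict) -> dict:
        req = urllib.request.Request(
            url,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    @property
    def _llm_type(self) -> str:
        return "oobabooga"

    def _call(self, prompt: str, stop=None) -> str:
        """The method LangChain invokes on a custom LLM to get a completion."""
        payload = {"prompt": prompt, "max_new_tokens": 250}
        body = self._transport(self.api_url, payload)
        # Legacy API: the completion lives in results[0]["text"] (assumed).
        text = body["results"][0]["text"]
        # Honor LangChain-style stop sequences by truncating the output.
        if stop:
            for token in stop:
                text = text.split(token)[0]
        return text
```

With the real `LLM` base class in place of this stand-in, the instance can be passed anywhere LangChain expects an `llm`, e.g. to a QA chain or an agent initializer.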
What are some alternatives?
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
langchain-ask-pdf-local - An AI-app that allows you to upload a PDF and ask questions about it. It uses StableVicuna 13B and runs locally.
private-gpt - Interact with your documents using the power of GPT, 100% privately, no data leaks
ctransformers - Python bindings for the Transformer models implemented in C/C++ using GGML library.
LocalAI - The free, open-source OpenAI alternative. Self-hosted, community-driven and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. It can generate text, audio, video and images, and has voice-cloning capabilities.
text-generation-inference - Large Language Model Text Generation Inference