langchain-ask-pdf-local vs LangChain_PDFChat_Oobabooga

| | langchain-ask-pdf-local | LangChain_PDFChat_Oobabooga |
|---|---|---|
| Mentions | 2 | 4 |
| Stars | 78 | 66 |
| Growth | - | - |
| Activity | 2.1 | 3.0 |
| Latest commit | about 1 year ago | about 1 year ago |
| Language | Python | Python |
| License | - | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
langchain-ask-pdf-local
Has anyone been able to train their own model on private data?
Now I'm using this repo https://github.com/wafflecomposite/langchain-ask-pdf-local, but with a wrapper around the oobabooga API (so I can benefit from the GPU power).
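The wrapper itself isn't shown in the thread; a minimal sketch of how a local HTTP completion endpoint could be hidden behind a simple callable follows. The endpoint path, payload keys, and response shape are illustrative assumptions, not oobabooga's actual API, and the transport function is injected so the class can be exercised without a running server:

```python
import json
import urllib.request


def post_json(url, payload):
    """POST a JSON payload and return the decoded JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


class WebUILLM:
    """Callable wrapper around a hypothetical local text-generation API.

    Endpoint path and payload shape are made up for illustration;
    real APIs differ.
    """

    def __init__(self, base_url, transport=post_json):
        self.base_url = base_url
        self.transport = transport  # injected for testability

    def __call__(self, prompt, max_new_tokens=200):
        payload = {"prompt": prompt, "max_new_tokens": max_new_tokens}
        result = self.transport(self.base_url + "/api/v1/generate", payload)
        return result["results"][0]["text"]


if __name__ == "__main__":
    # Fake transport standing in for a local server.
    def fake_transport(url, payload):
        return {"results": [{"text": "echo: " + payload["prompt"]}]}

    llm = WebUILLM("http://127.0.0.1:5000", transport=fake_transport)
    print(llm("hello"))  # echo: hello
```

Injecting the transport keeps the GPU-backed server out of the unit-test path: any object that maps a (url, payload) pair to a response dict can stand in for the real endpoint.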
langchain all run locally with gpu using oobabooga
I was doing some testing and managed to use a LangChain PDF chatbot with the oobabooga API, all running locally on my GPU, using the main code from langchain-ask-pdf-local with the webui class from oobaboogas-webui-langchain_agent. This is the result (100% not my code, I just copied and pasted it): PDFChat_Oobabooga.
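The pipeline the post describes follows the usual ask-pdf recipe: extract the PDF text, split it into overlapping chunks, embed the chunks, and answer questions against the closest ones. A minimal sketch of the splitting step, where the chunk size and overlap values are illustrative rather than the repo's actual settings:

```python
def split_text(text, chunk_size=1000, overlap=200):
    """Split text into fixed-size chunks that overlap, so sentences cut
    at a boundary still appear whole in at least one chunk."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks


if __name__ == "__main__":
    doc = "x" * 2500
    parts = split_text(doc)
    print(len(parts), [len(p) for p in parts])  # 3 [1000, 1000, 900]
```

The overlap trades a little storage for recall: a fact that straddles a chunk boundary would otherwise be split across two embeddings and match neither.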
LangChain_PDFChat_Oobabooga
Ask PDF functionality?
sebaxzero/LangChain_PDFChat_Oobabooga: oobabooga text-generation-webui implementation of wafflecomposite's langchain-ask-pdf-local (github.com)
Langchain and self hosted LLaMA hosted API
Here you can find a way to use the oobabooga API with LangChain: https://github.com/sebaxzero/LangChain_PDFChat_Oobabooga
Using a local LLM for large-scale text analysis
Maybe turn your data into a PDF or text file, then feed it to https://github.com/imartinez/privateGPT, or use oobabooga with https://github.com/sebaxzero/LangChain_PDFChat_Oobabooga
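Under the hood, tools like privateGPT answer questions over your documents by retrieving the stored chunks most similar to the query and handing them to the LLM. A toy sketch of that retrieval step, using bag-of-words cosine similarity as a stand-in for real embeddings:

```python
import math
from collections import Counter


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def top_chunks(query, chunks, k=2):
    """Return the k chunks most similar to the query."""
    qv = Counter(query.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: cosine(qv, Counter(c.lower().split())),
        reverse=True,
    )
    return scored[:k]


if __name__ == "__main__":
    chunks = [
        "LangChain loads and splits PDF documents",
        "oobabooga serves local language models over an API",
        "bananas are rich in potassium",
    ]
    print(top_chunks("which tool serves local models", chunks, k=1))
```

Real systems replace the word-count vectors with dense sentence embeddings and a vector store, but the ranking logic is the same: score every chunk against the query and keep the top k.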
langchain all run locally with gpu using oobabooga
I was doing some testing and managed to use a LangChain PDF chatbot with the oobabooga API, all running locally on my GPU, using the main code from langchain-ask-pdf-local with the webui class from oobaboogas-webui-langchain_agent. This is the result (100% not my code, I just copied and pasted it): PDFChat_Oobabooga.
What are some alternatives?
realtime-bakllava - llama.cpp with the BakLLaVA model, describing what it sees
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
llama-node - Believe in AI democratization. llama for Node.js, backed by llama-rs, llama.cpp and rwkv.cpp; works locally on your laptop CPU. Supports llama/alpaca/gpt4all/vicuna/rwkv models.
private-gpt - Interact with your documents using the power of GPT, 100% privately, no data leaks
InternChat - InternGPT / InternChat allows you to interact with ChatGPT by clicking, dragging and drawing using a pointing device. [Moved to: https://github.com/OpenGVLab/InternGPT]
ctransformers - Python bindings for Transformer models implemented in C/C++ using the GGML library.
InternGPT - InternGPT (iGPT) is an open source demo platform where you can easily showcase your AI models. It now supports DragGAN, ChatGPT, ImageBind, multimodal chat like GPT-4, SAM, interactive image editing, etc. Try it at igpt.opengvlab.com (an online demo system supporting DragGAN, ChatGPT, ImageBind, and SAM).
oobaboogas-webui-langchain_agent - Creates a LangChain agent that uses the WebUI's API and Wikipedia to work
code-llama-for-vscode - Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.
LocalAI - The free, open source OpenAI alternative. Self-hosted, community-driven and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. It can generate Text, Audio, Video and Images, and also offers voice cloning capabilities.
text-generation-inference - Large Language Model Text Generation Inference