-
chat-with-doc
A web application that enables users to upload documents and engage in conversational analysis with self-hosted large language models (LLMs) served by Ollama.
-
jan
Jan is an open source alternative to ChatGPT that runs 100% offline on your computer. Multiple engine support (llama.cpp, TensorRT-LLM)
-
text-generation-webui
A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
-
llm_steer-oobabooga
Steer LLM outputs towards a certain topic/subject and enhance response capabilities using activation engineering by adding steering vectors, now in oobabooga text generation webui!
Ollama is really easy.
  brew install ollama
  brew services start ollama
  ollama pull mistral
Ollama has an HTTP API that provides a consistent interface for prompting, regardless of which model you're running.
https://github.com/ollama/ollama/blob/main/docs/api.md#reque...
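As a sketch of what a call to that API looks like, assuming the default port 11434 and the mistral model pulled above (helper names here are illustrative, not from the Ollama docs):

```python
import json
import urllib.request

def build_generate_request(prompt, model="mistral"):
    """Request body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of
    newline-delimited streaming chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="mistral", host="http://localhost:11434"):
    """POST the prompt to a locally running Ollama server, return the text."""
    body = json.dumps(build_generate_request(prompt, model)).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Swapping `model` for anything else you've pulled (`llama3`, `phi3`, ...) needs no other changes, which is the "consistent interface" point above.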
Some of the tools offer a path to doing tool use (fetching URLs and doing things with them) or RAG (searching your documents). I think Oobabooga https://github.com/oobabooga/text-generation-webui offers the latter through plugins.
Our tool, https://github.com/transformerlab/transformerlab-app also supports the latter (document search) using local llms.
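A document-search ("RAG") setup like the ones mentioned above boils down to three steps: chunk your documents, retrieve the chunks most relevant to the question, and stuff them into the prompt sent to the local LLM. A minimal sketch, using naive word-overlap scoring as a stand-in for real embedding search (function names are illustrative, not any of these tools' actual APIs):

```python
from collections import Counter

def score(query, chunk):
    """Crude relevance score: count of words shared between query and chunk."""
    q = Counter(query.lower().split())
    c = Counter(chunk.lower().split())
    return sum((q & c).values())

def retrieve(query, chunks, k=1):
    """Return the k chunks that best match the query."""
    return sorted(chunks, key=lambda ch: score(query, ch), reverse=True)[:k]

def build_prompt(query, chunks, k=1):
    """Assemble a grounded prompt from the retrieved context."""
    context = "\n".join(retrieve(query, chunks, k))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

A real implementation would replace `score` with cosine similarity over embedding vectors, but the retrieve-then-prompt shape is the same.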
If you're able to purchase a separate GPU, the most popular option is to get an NVIDIA RTX3090 or RTX4090.
Apple's M2 and M3 Macs are becoming a viable option because of MLX https://github.com/ml-explore/mlx . If you are getting an M-series Mac for LLMs, I'd recommend getting something with 24GB or more of RAM.
I agree with you re: extensions. I know something of the extension ecosystem for it, as I built one: https://github.com/Hellisotherpeople/llm_steer-oobabooga
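For anyone curious what "adding steering vectors" means mechanically: during a forward pass, you add a fixed direction vector (scaled by a strength factor) to the hidden states at a chosen layer, nudging generations toward whatever concept that vector encodes. A toy sketch with plain lists standing in for the model's hidden-state tensors; this illustrates the general activation-addition idea, not llm_steer's actual API:

```python
def apply_steering(hidden_states, steering_vector, strength=2.0):
    """Add a scaled steering vector to every token's hidden state.

    hidden_states: list of per-token vectors (seq_len x hidden_dim)
    steering_vector: a direction of length hidden_dim, typically derived
    from the difference between activations on contrasting prompts
    """
    return [
        [h + strength * s for h, s in zip(token, steering_vector)]
        for token in hidden_states
    ]
```

In a real model this runs inside a forward hook on a specific transformer layer; extensions like llm_steer handle that wiring for you.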
Oobabooga is the closest thing we have to maximalism. It exposes by far the largest number of parameters/settings/backends of any comparable tool.
My main point is that the world yearns for a proper "Photoshop for text", and no one has even tried to build one (oobabooga comes closest). The VC-backed competitors are not even close to the mark on what they should be doing here.
Paywalled article: https://towardsdatascience.com/how-to-build-a-local-open-sou...
Source code: https://github.com/leoneversberg/llm-chatbot-rag