dockerLLM Alternatives
Similar projects and alternatives to dockerLLM
- text-generation-webui: A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), and Llama models.
- open_llama: OpenLLaMA, a permissively licensed open-source reproduction of Meta AI's LLaMA 7B, trained on the RedPajama dataset.
dockerLLM reviews and mentions
- Local VS Cloud?
  I use Runpod now. TheBloke provides some templates that make it easy to start: https://github.com/TheBlokeAI/dockerLLM/
- Free LLM API
  Or use TheBloke's Local LLMs One-Click UI template on Runpod: https://github.com/TheBlokeAI/dockerLLM/blob/main/README_Runpod_LocalLLMsUIandAPI.md
- oobabooga update broke loading u/The-Bloke Hugging Face models?
-
"Samantha-33B-SuperHOT-8K-GPTQ", now that's a great name for a true model.
The one thing I have published is my Docker files for producing my two Runpod templates, which let people try GGML and GPTQ models on Runpod pods with full GPU acceleration (ExLlama and AutoGPTQ). They can be found at https://github.com/TheBlokeAI/dockerLLM/.
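The templates described above package a web UI plus GPU-accelerated GGML/GPTQ loaders into a single container. As a minimal sketch of how such an image might be launched outside Runpod (the image tag, port, and model path below are illustrative assumptions, not the actual template's values):

```shell
# Hypothetical invocation of a dockerLLM-style image.
# --gpus all passes every host GPU into the container (requires the
# NVIDIA Container Toolkit on the host); 7860 is Gradio's default port,
# which text-generation-webui serves its UI on.
docker run --gpus all \
  -p 7860:7860 \
  -v "$PWD/models:/workspace/models" \
  ghcr.io/example/local-llms-oneclick:latest
```

On Runpod itself none of this is typed by hand: the template bakes the image, ports, and volume mounts into the pod configuration, which is what makes it "one-click".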
- OpenLLaMA 13B Released
https://www.runpod.io/console/templates
This is the readme for the one I mentioned: https://github.com/TheBlokeAI/dockerLLM/blob/main/README_Run...
> can I use Colab/Huggingface GPUs?
You use these templates on the Runpod platform itself. There's no free-tier equivalent like you have with Colab/HF, but currently you can rent an RTX 4090 for $0.69/hr, so it's pretty affordable.
Stats
TheBlokeAI/dockerLLM is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of dockerLLM is Shell.