- text-generation-webui — A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), and Llama models.
- alpaca-electron — The simplest way to run Alpaca (and other LLaMA-based local LLMs) on your own computer.
I found this project, which I believe some people are trying to get working with Turbo Alpaca: https://github.com/oobabooga/text-generation-webui. It actually supports AMD ROCm 5.4.2 (basically CUDA for AMD GPUs).
I recently tried Alpaca Electron with the 7B model. I am surprised how well it runs on my own hardware, with very little CPU and RAM consumption.
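If you want to put a number on that resource usage rather than eyeball it, here is a minimal sketch using Python's standard `resource` module to read a process's own peak memory and CPU time (Linux/macOS only; the `snapshot` helper is just an illustrative name, not part of any of the tools above):

```python
import resource

def snapshot():
    """Return (peak_rss, cpu_seconds) for the current process."""
    ru = resource.getrusage(resource.RUSAGE_SELF)
    # Note: ru_maxrss is kibibytes on Linux but bytes on macOS.
    return ru.ru_maxrss, ru.ru_utime + ru.ru_stime

peak_rss, cpu_seconds = snapshot()
print(f"peak RSS: {peak_rss}, CPU time: {cpu_seconds:.2f}s")
```

Running something like this alongside (or wrapped around) a local model session gives a rough baseline to compare models and runtimes with.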