OK, looks like the LLM guide for total beginners boils down to "run this installer": https://github.com/oobabooga/text-generation-webui/releases/...

That's the only thing that worked for me, and it was easy and worked instantly.
Copy-paste the llama.cpp install steps into ChatGPT and ask it to explain them simply, including any prerequisite steps you might need. If you get stuck, paste in the error messages and ask again.
https://github.com/ggerganov/llama.cpp
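For reference, the build steps from that repo have historically looked roughly like this (a sketch, not a guarantee: newer versions of llama.cpp have moved to a CMake-based build and renamed some binaries, and the model path below is just a placeholder you'd replace with your own GGUF file):

```shell
# Rough sketch of the classic llama.cpp build (assumes git, make,
# and a C/C++ toolchain are already installed)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Run inference with a local GGUF model file -- the path and prompt
# here are placeholder examples, not files shipped with the repo
./main -m models/your-model.gguf -p "Hello"
```

If any of these commands fail, the error output is exactly the kind of thing worth pasting into ChatGPT as suggested above.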