- text-generation-webui: A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), and Llama models.
- koboldcpp-rocm: AI inferencing at the edge. A simple one-file way to run various GGML models with KoboldAI's UI, with AMD ROCm offloading.
Llama.cpp has its own flake, and I've had good luck getting it to build on both Arch (with ROCm) and macOS (with Metal). Just clone the repo, cd into the directory, and run `nix build`. Then you can use the `llama` and `llama-server` binaries in `result/bin`.
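A minimal sketch of the steps described above, assuming the upstream ggerganov/llama.cpp repository and a Nix installation with flakes enabled (the `--extra-experimental-features` flags are only needed if flakes aren't already enabled in your nix.conf):

```shell
# Clone the repo, which ships its own flake.nix
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# Build the default flake output; the result/ symlink points at the store path
nix build --extra-experimental-features 'nix-command flakes'

# The built binaries land under result/bin
ls result/bin
```

Note that the flake's default package builds for the current platform; backend-specific variants (ROCm on Linux, Metal on macOS) depend on how the flake exposes its outputs, so check `nix flake show` for the available packages.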
I'm maintaining (whenever I find the time) a gpt4all flake for NixOS: https://github.com/polygon/gpt4all-nix/