- text-generation-webui: A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
- serge: A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy-to-use API.
- AlpacaChat: A Swift library that runs Alpaca prediction locally to implement a ChatGPT-like app on Apple platform devices.
For people on laptops or desktops, there's already another tool called Dalai that runs the LLaMA and Alpaca models (up to 65B) on CPU, including on M1 MacBooks and other weaker Mac, Windows, and Linux machines. And Oobabooga can run them on Nvidia GPUs. r/LocalLlama has more info on all this.
Thanks to 4-bit quantization, you can already run Alpaca 7B (and presumably LLaMA 7B) on an iPhone with AlpacaChat, though it's currently quite slow.
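The reason 4-bit quantization makes this possible is memory: 7B parameters at fp16 need roughly 13-14 GB, but at 4 bits per weight they fit in about 3.5 GB, which is within reach of a recent iPhone. As a rough illustration of the idea (a minimal sketch of groupwise symmetric quantization, not the exact scheme AlpacaChat or llama.cpp uses; their on-disk formats and block sizes differ):

```python
import numpy as np

def quantize_4bit(weights, group_size=32):
    """Groupwise symmetric 4-bit quantization: each group of weights
    is mapped to integers in [-7, 7] with one fp16 scale per group."""
    w = weights.reshape(-1, group_size)
    scale = np.abs(w).max(axis=1, keepdims=True) / 7.0
    q = np.clip(np.round(w / scale), -7, 7).astype(np.int8)
    return q, scale.astype(np.float16)

def dequantize_4bit(q, scale):
    # Reconstruct approximate fp32 weights from ints + per-group scales.
    return (q.astype(np.float32) * scale).reshape(-1)

rng = np.random.default_rng(0)
w = rng.standard_normal(4096).astype(np.float32)  # toy weight vector
q, s = quantize_4bit(w)
w_hat = dequantize_4bit(q, s)
max_err = np.abs(w - w_hat).max()  # small per-weight rounding error
```

Each weight now costs 4 bits plus a small amortized overhead for the group scales, so storage drops by roughly 4x versus fp16 at the price of a bounded rounding error per weight.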
ofc, and it takes about 5 minutes to install (depending on your internet): https://github.com/AUTOMATIC1111/stable-diffusion-webui