- Open-Assistant: a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
- text-generation-webui: a Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), and Llama models.
- petals: 🌸 run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading.
- serge: a web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy-to-use API.
https://open-assistant.io/ is one of many open-source LLMs. Some of them can even run locally on your home computer.
The good news is that people have figured out how to run these models on AMD and macOS. I don't know how the performance is or what the limitations are, but you can test it yourself if you have such hardware: https://github.com/oobabooga/text-generation-webui
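For anyone who wants to try it, getting text-generation-webui running looks roughly like this. This is a minimal sketch of the manual-install route; the repo also ships one-click installers, and the exact requirements file and flags depend on your hardware, so treat the specifics as assumptions and check the project's README:

```shell
# Hedged sketch: manual install of text-generation-webui.
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui
pip install -r requirements.txt   # hardware-specific requirements files exist; pick the one matching your GPU/CPU
python server.py                  # launches the Gradio web UI, by default at http://localhost:7860
```

Downloaded models go in the `models/` directory; quantized GGUF files loaded via llama.cpp are what usually make CPU-only or Apple Silicon machines practical.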
You're right about GPT, but don't be fooled: there have been very promising models that run locally!
Might wanna look into Petals.