-
You could run Motorhead in Docker: https://github.com/getmetal/motorhead
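Once the container is up (Motorhead also needs a Redis instance; see the repo's README for the exact docker invocation), you talk to it over plain HTTP. A minimal sketch in Python, assuming the server is listening on localhost:8080 and exposes the sessions/{id}/memory endpoints described in the README:

```python
import requests

# Assumes a Motorhead container is reachable on localhost:8080.
# Port and endpoint paths are taken from the project's README; adjust if yours differ.
BASE = "http://localhost:8080"
SESSION = "demo-session"  # arbitrary session id, just for illustration

# Store a couple of chat messages in this session's memory
requests.post(
    f"{BASE}/sessions/{SESSION}/memory",
    json={"messages": [
        {"role": "Human", "content": "My favourite colour is teal."},
        {"role": "AI", "content": "Noted!"},
    ]},
    timeout=10,
).raise_for_status()

# Read the memory back for the same session
resp = requests.get(f"{BASE}/sessions/{SESSION}/memory", timeout=10)
resp.raise_for_status()
print(resp.json())
```

The session id is whatever key identifies your conversation; your app picks it and reuses it on each call.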
-
Maybe llama.cpp is what you need. It doesn't even need a GPU and can run on a mobile device.
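For example, with the llama-cpp-python bindings you can run a quantized model entirely on the CPU. A rough sketch; the model path below is a placeholder for whatever LLaMA-family file you've converted/quantized for llama.cpp:

```python
# CPU-only sketch using the llama-cpp-python bindings (pip install llama-cpp-python).
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-7b-q4.gguf",  # hypothetical local file, substitute your own
    n_ctx=2048,    # context window
    n_threads=4,   # CPU threads; no GPU required
)

out = llm("Q: What does llama.cpp do? A:", max_tokens=64, stop=["\n"])
print(out["choices"][0]["text"])
```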
-
You also need an LLM to do this. Please check this out to pick one from the LLaMA family. Other works like llama.onnx, alpaca-native, and the LLaMA models on Hugging Face are also worth checking.
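If you go the Hugging Face route, loading a checkpoint with transformers is only a few lines. A sketch with a placeholder model id (substitute whichever LLaMA-family model you picked and have access to):

```python
# Sketch of loading a LLaMA-family checkpoint from Hugging Face with transformers.
# "openlm-research/open_llama_3b" is only an example id, not a recommendation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openlm-research/open_llama_3b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```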
Related posts
-
Motorhead is a memory and information retrieval server for LLMs
-
PostgresML
-
[P] pgml-chat: A command-line tool for deploying low-latency knowledge-based chatbots
-
Python SDK for PostgresML with scalable LLM embedding memory and text generation
-
[P] Python SDK for PostgresML w/ scalable LLM embedding memory and text generation