A llama.cpp-based drop-in replacement for OpenAI's GPT endpoints, allowing GPT-powered apps to run on local llama.cpp models instead of OpenAI's API.
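Because the server mimics OpenAI's endpoints, an existing GPT client only needs its base URL changed. Here is a minimal sketch of the request payload such a client would send to a local instance; the host/port (`http://localhost:8000`) and model name are assumptions and should be adjusted to your setup:

```python
import json

# Assumed local endpoint -- gpt-llama.cpp's actual port depends on how you start it.
LOCAL_API_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(messages, model="llama-7b"):
    """Build the same JSON body an OpenAI chat-completion client would send.

    The model name is a placeholder; a local server typically maps it to
    whatever GGUF/GGML model file it was launched with.
    """
    return {
        "model": model,
        "messages": messages,
        "temperature": 0.7,
    }

payload = build_chat_request([{"role": "user", "content": "Hello!"}])
# Sending it is a single HTTP POST, e.g.:
#   requests.post(LOCAL_API_URL, json=payload)
print(json.dumps(payload, indent=2))
```

Since the payload shape is identical to OpenAI's, no application-side code changes are needed beyond pointing the client at the local URL.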
Why do you think https://github.com/oobabooga/text-generation-webui is a good alternative to gpt-llama.cpp?