text-generation-webui

A Gradio web UI for Large Language Models with support for multiple inference backends. (by oobabooga)

Text-generation-webui Alternatives

Similar projects and alternatives to text-generation-webui

  1. SaaSHub

    SaaSHub - Software Alternatives and Reviews. SaaSHub helps you find the best software and product alternatives

  2. llama.cpp

    LLM inference in C/C++

  3. ollama

    Get up and running with Llama 3.3, Phi 4, Gemma 2, and other large language models.

  4. Open-Assistant

    OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.

  5. koboldcpp

    Run GGUF models easily with a KoboldAI UI. One File. Zero Install.

  6. KoboldAI-Client

    For GGUF support, see KoboldCPP: https://github.com/LostRuins/koboldcpp

  7. llama

    Inference code for Llama models

  8. gpt4all

    GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.

  9. private-gpt

    Interact with your documents using the power of GPT, 100% privately, no data leaks

  10. stanford_alpaca

    Code and documentation to train Stanford's Alpaca models, and generate the data.

  11. alpaca-lora

    Instruct-tune LLaMA on consumer hardware

  12. alpaca.cpp

    (Discontinued) Locally run an Instruction-Tuned Chat-Style LLM

  13. mlc-llm

    Universal LLM Deployment Engine with ML Compilation

  14. FastChat

    An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.

  15. SillyTavern

    LLM Frontend for Power Users.

  16. GPTQ-for-LLaMa

    4 bits quantization of LLaMA using GPTQ

  17. exllama

    A more memory-efficient rewrite of the HF transformers implementation of Llama for use with quantized weights.

  18. dalai

    The simplest way to run LLaMA on your local machine

  19. llama-cpp-python

    Python bindings for llama.cpp

  20. serge

    A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.

NOTE: The number of mentions on this list indicates how often each project appears in common posts, plus user-suggested alternatives. Hence, a higher number generally means a better text-generation-webui alternative or higher similarity.

text-generation-webui discussion

Two user reviews, both posted about 7 months ago, rate it ★★★★★ (10/10).

text-generation-webui reviews and mentions

Posts with mentions or reviews of text-generation-webui. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-12-29.
  • I Run LLMs Locally
    5 projects | news.ycombinator.com | 29 Dec 2024
    Still nothing better than oobabooga (https://github.com/oobabooga/text-generation-webui) in terms of a maximalist/"Pro"/"Prosumer" LLM UI/UX, à la Blender, Photoshop, Final Cut Pro, etc.

    Embarrassing, and any VCs reading this can contact me to talk about how to fix that. LM Studio is today the closest competition (but not close enough), and Adobe or Microsoft could do it if they fired the current folks who prevent this from happening.

    If you're not using Oobabooga, you're likely not playing with the settings on models, and if you're not playing with your models' settings, you're hardly even scratching the surface of their total capabilities.

  • Yi-Coder: A Small but Mighty LLM for Code
    5 projects | news.ycombinator.com | 5 Sep 2024
    I understand your situation. It sounds super simple to me now, but I remember having to spend at least a week trying to get the concepts and figure out what prerequisite knowledge I would need, somewhere on a continuum between just using ChatGPT and learning the relevant vector math. Fortunately it is much closer to the ChatGPT side. I don't like ollama per se (because I can't reuse its models, since it compresses them into its own format), but it's still a very good place to start. Any interface that lets you download models as GGUF from Hugging Face will do just fine (see the GGUF loading sketch after this list). Don't be turned off by the roleplaying/waifu-sounding frontend names. They are all fine. This is what I mostly prefer: https://github.com/oobabooga/text-generation-webui
  • XTC: An LLM sampler that boosts creativity, breaks writing clichés
    1 project | news.ycombinator.com | 18 Aug 2024
  • Codestral Mamba
    15 projects | news.ycombinator.com | 16 Jul 2024
    Why do people recommend this instead of the much better oobabooga text-gen-webui?

    https://github.com/oobabooga/text-generation-webui

    It's like you hate settings, features, and access to many backends!

  • Why I made TabbyAPI
    4 projects | dev.to | 12 Jul 2024
    The issue is running the model. Exl2 is part of the ExllamaV2 library, but to run a model, a user needs an API server. The only option out there was using text-generation-webui (TGW), a program that bundled every loader out there into a Gradio web UI. Gradio is a common “building-block” UI framework for Python development and is often used for AI applications (a minimal Gradio sketch appears after this list). This setup was good for a while, until it wasn’t.
  • Take control! Run ChatGPT and Github Copilot yourself!
    3 projects | dev.to | 31 May 2024
    What I described here is the workflow I have found works best for me. There are multiple ways to run open-source models locally that are worth mentioning, like Oobabooga WebUI or LM Studio; however, I didn't find them as seamless or as good a fit for my workflow.
  • Ask HN: What is the current (Apr. 2024) gold standard of running an LLM locally?
    11 projects | news.ycombinator.com | 1 Apr 2024
    Some of the tools offer a path to doing tool use (fetching URLs and doing things with them) or RAG (searching your documents). I think Oobabooga https://github.com/oobabooga/text-generation-webui offers the latter through plugins.

    Our tool, https://github.com/transformerlab/transformerlab-app, also supports the latter (document search) using local LLMs.

  • Ask HN: How to get started with local language models?
    6 projects | news.ycombinator.com | 17 Mar 2024
    You can use the webui: https://github.com/oobabooga/text-generation-webui

    Once you get a version up and running, make a copy before you update it; updates have broken my working version several times and caused headaches.

    A decent explanation of the parameters, short of reading arXiv papers: https://github.com/oobabooga/text-generation-webui/wiki/03-%...

    An AI news website:

  • text-generation-webui VS LibreChat - a user suggested alternative
    2 projects | 29 Feb 2024
  • Show HN: I made an app to use local AI as daily driver
    31 projects | news.ycombinator.com | 27 Feb 2024
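
The "download a GGUF from Hugging Face and point an interface at it" workflow from the Yi-Coder comment above boils down to a few lines of Python. Below is a minimal sketch using llama-cpp-python (item 19 in the alternatives list); the model filename and generation parameters are placeholder assumptions, not anything text-generation-webui itself prescribes.

    # Minimal sketch, assuming llama-cpp-python is installed and a GGUF file
    # has already been downloaded from Hugging Face. The filename is hypothetical.
    from llama_cpp import Llama

    llm = Llama(model_path="./models/mistral-7b-instruct.Q4_K_M.gguf", n_ctx=4096)

    result = llm(
        "Q: What does the GGUF file format store? A:",
        max_tokens=128,
        stop=["Q:"],  # stop before the model invents a new question
    )
    print(result["choices"][0]["text"])

Any other GGUF-capable frontend (koboldcpp, text-generation-webui's llama.cpp loader, and so on) follows the same pattern: download a GGUF file, load it, generate.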
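
The TabbyAPI post above describes text-generation-webui as every loader bundled into a Gradio web UI. As a rough illustration of that Gradio "building-block" pattern (not the project's actual code), a toy text-generation front end can be this small; generate() here is a hypothetical stand-in for a real model backend.

    # Minimal sketch of a Gradio text UI, assuming only that the gradio package is installed.
    import gradio as gr

    def generate(prompt: str) -> str:
        # A real UI would call a model loader here (llama.cpp, ExLlamaV2, Transformers, ...).
        return f"(model output for: {prompt!r})"

    demo = gr.Interface(fn=generate, inputs="text", outputs="text",
                        title="Toy text-generation UI")

    if __name__ == "__main__":
        demo.launch()  # serves the UI locally, by default at http://127.0.0.1:7860

text-generation-webui layers model loaders, sampler settings, chat management, and extensions on top of these same primitives.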

Stats

Basic text-generation-webui repo stats
  Mentions: 882
  Stars: 41,649
  Activity: 9.8
  Last commit: 2 days ago

