exllama
A more memory-efficient rewrite of the HF transformers implementation of Llama for use with quantized weights. (by 0cc4m)
magi_llm_gui
A Qt GUI for large language models (by shinomakoi)
| | exllama | magi_llm_gui |
|---|---|---|
| Mentions | 6 | 4 |
| Stars | 7 | 39 |
| Growth | - | - |
| Activity | 9.0 | 8.7 |
| Last Commit | 7 months ago | 7 months ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
The number of mentions indicates the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
exllama
Posts with mentions or reviews of exllama.
We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-06-27.
- exlamma? Questions about stuff 4bit and things
But you also need to manually install https://github.com/0cc4m/exllama/releases/tag/0.0.5 by opening the KoboldAI command prompt and running the "pip install" command followed by the whl file you downloaded.
- Anyone tried this promising sounding release? WizardLM-33B-V1.0-Uncensored-SUPERHOT-8K
- How is ExLlama so good? Can it be used with a more feature rich UI?
- Damn, I was so satisfied with my 3080 with 10GB of VRAM until I found this subreddit.
- EXLlama test on 2x4090, Windows 11 and Ryzen 7 7800X3D
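The KoboldAI install step quoted above can be sketched roughly as below. This is a hedged example: the wheel filename is illustrative only (the real asset name depends on your Python version and platform), so download the actual `.whl` from the 0cc4m/exllama 0.0.5 release page first and substitute its name.

```shell
# Assumed/illustrative wheel name -- not the actual release asset.
WHEEL="exllama-0.0.5-cp38-cp38-win_amd64.whl"

# Print the pip command you would run inside the KoboldAI command prompt
# (shown as a dry run here rather than actually installing anything).
echo "pip install $WHEEL"
```

Running the printed command from the KoboldAI command prompt installs the wheel into KoboldAI's own Python environment, which is why the post says to use that prompt rather than a system-wide shell.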
magi_llm_gui
Posts with mentions or reviews of magi_llm_gui.
We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-07-04.
- What is the best text web ui currently?
Other than Ooba, this is my fav (and works with a TON of model architectures) -> https://github.com/shinomakoi/magi_llm_gui
- How is ExLlama so good? Can it be used with a more feature rich UI?
- What's an alternative to oobabooga?
Magi LLM GUI - https://github.com/shinomakoi/magi_llm_gui
- Maji LLM: A Qt Desktop GUI for local language models. Works with Oobabooga's WebUI API and llama.cpp
What are some alternatives?
When comparing exllama and magi_llm_gui you can also consider the following projects:
- KoboldAI
- lollms-webui - Lord of Large Language Models Web User Interface
- exllama - A more memory-efficient rewrite of the HF transformers implementation of Llama for use with quantized weights.
- text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
- gpt4all - gpt4all: run open-source LLMs anywhere
- SillyTavern-Extras - Extensions API for SillyTavern.
- llama.cpp - LLM inference in C/C++
- SillyTavern - LLM Frontend for Power Users.
- simple-proxy-for-tavern