| | SillyTavern-extras | koboldcpp |
|---|---|---|
| Mentions | 6 | 180 |
| Stars | 126 | 4,133 |
| Growth | - | - |
| Activity | 10.0 | 10.0 |
| Latest Commit | about 1 year ago | 6 days ago |
| Language | Python | C++ |
| License | The Unlicense | GNU Affero General Public License v3.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
SillyTavern-extras
- We should have a detailed guide to install the extras or the new Smart Context (ChromaDB) module
- How do I use the extras on SillyTavern?
  You may want to clone the SillyTavern Extras repo and follow the instructions there.
- How can I install Stable Diffusion (or Waifu Diffusion, if possible) into SillyTavern?
  You need to install and run SillyTavern Extras: https://github.com/Cohee1207/SillyTavern-extras
- Anyone hosting a local LLM server?
  I'm running an essentially "headless" setup with SillyTavern, SillyTavern-extras, ooba, and vladmandic's Stable Diffusion fork on a WSL2 VM (Ubuntu 22.04), where:
- Wizard-Vicuna-13B-Uncensored is seriously impressive.
- I'm not sure if this is the right place to post this. But I would like to create a companion and host them on my PC.
koboldcpp
- Any Online Communities on Local/Home AI?
- Koboldcpp-1.62.1 adds support for Command-R+
- Show HN: I made an app to use local AI as daily driver
- Easiest way to show my model to my mom?
  FYI, this is the easiest way to host on the horde: https://github.com/LostRuins/koboldcpp
- IT Veteran... why am I struggling with all of this?
- What do you use to run your models?
- ByteDance AI researcher suggests that an open-source model more powerful than Gemini will be released soon
- i need some help guys
- [Guide] How to install KoboldAI on Android via Termux (Update 04-12-2023)
  For more information on koboldcpp, see this guide: https://github.com/LostRuins/koboldcpp/wiki
- SillyTavern 1.10.10 has been released
  Out of curiosity, is there a specific reason for this? The most popular fork, KoboldCpp, is in active development, was the first to adopt the Min P sampler, and even distinguishes itself with the context shift feature. Just wondering what this means for the future. Thanks!
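Several of the threads above involve hosting koboldcpp locally and pointing a frontend like SillyTavern at it. As a minimal sketch, assuming a koboldcpp instance already running on the default port 5001 and exposing the KoboldAI-compatible `/api/v1/generate` endpoint, a prompt can be sent from Python with only the standard library (the host, port, and sampler values here are illustrative, not canonical):

```python
import json
from urllib import request

# Assumed local endpoint: koboldcpp's KoboldAI-compatible API,
# default port 5001. Adjust to match your own setup.
KOBOLDCPP_URL = "http://localhost:5001/api/v1/generate"

def build_generate_payload(prompt, max_length=80, temperature=0.7):
    """Build a minimal JSON payload for the generate endpoint."""
    return {
        "prompt": prompt,
        "max_length": max_length,
        "temperature": temperature,
    }

def generate(prompt):
    """POST the prompt to a locally running koboldcpp instance."""
    data = json.dumps(build_generate_payload(prompt)).encode("utf-8")
    req = request.Request(
        KOBOLDCPP_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    # The KoboldAI API wraps completions as {"results": [{"text": ...}]}
    return body["results"][0]["text"]

if __name__ == "__main__":
    print(generate("Once upon a time"))
```

SillyTavern itself does the same thing under the hood when configured with a KoboldAI-type API connection, which is why pointing it at `http://localhost:5001` is usually all the setup required.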
What are some alternatives?
SillyTavern - LLM Frontend for Power Users.
docker - Docker - the open-source application container engine
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
SillyTavern-Extras - Extensions API for SillyTavern.
TavernAI - Atmospheric adventure chat for AI language models (KoboldAI, NovelAI, Pygmalion, OpenAI chatgpt, gpt-4)
SillyTavern - LLM Frontend for Power Users. [Moved to: https://github.com/SillyTavern/SillyTavern]
KoboldAI - KoboldAI is generative AI software optimized for fictional use, but capable of much more!
simple-proxy-for-tavern
ChatRWKV - ChatRWKV is like ChatGPT but powered by RWKV (100% RNN) language model, and open source.
SillyTavernSimpleLauncher - A launcher that lets you install, update, back up, and uninstall SillyTavern and SillyTavernExtras