KoboldAI-Client Alternatives
Similar projects and alternatives to KoboldAI-Client
- text-generation-webui: A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), and Llama models.
- Open-Assistant: OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and can retrieve information dynamically to do so.
- koboldcpp: A simple one-file way to run various GGML and GGUF models with KoboldAI's UI.
- transformers: 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
- petals: 🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading.
- TavernAI: Atmospheric adventure chat for AI language models (KoboldAI, NovelAI, Pygmalion, OpenAI ChatGPT, GPT-4).
- mesh-transformer-jax: Model-parallel transformers in JAX and Haiku.
- chaiNNer: A node-based image processing GUI aimed at making chaining image processing tasks easy and customizable. Born as an AI upscaling application, chaiNNer has grown into an extremely flexible and powerful programmatic image processing application.
- Clover-Edition (discontinued): State-of-the-art AI plays dungeon master to your adventures.
- gpt-neo_dungeon: Colab notebooks to run a basic AI Dungeon clone using gpt-neo-2.7B.
- KoboldAI-Horde-Bridge (discontinued): Turns KoboldAI into a crowdsourced distributed cluster.
- mesh-transformer-jax (VE-FORBRYDERNE fork): Fork of kingoflolz/mesh-transformer-jax with memory-usage optimizations and support for GPT-Neo, GPT-NeoX, BLOOM, OPT, and fairseq dense LM. Primarily used by KoboldAI and mtj-softtuner.
KoboldAI-Client reviews and mentions
- No idea what I'm doing help
- ChatGPT users drop for the first time as people turn to uncensored chatbots
You can use KoboldAI to run an LLM locally. There are hundreds or thousands of models on Hugging Face. Some uncensored ones are Pygmalion AI (chatbot), Erebus (story-writing AI), or Vicuna (general purpose).
- Question regarding model compatibility for Alpaca Turbo
Then there are graphical user interfaces like text-generation-webui and gpt4all for general-purpose chat. There are also KoboldAI and SillyTavern, which focus more on storytelling and roleplay and have tools to improve that.
- Running Multiple AI Models Sequentially for a Conversation on a Single GPU
And finally, the folks from KoboldAI do some interesting stuff with pseudocode and soft prompts that might also be relevant.
- Summoning Life-Size Characters to Your Room: New Update for my Mixed Reality App!
- Feels like the censorship has gotten tighter recently, just me?
- Difficulties installing Pygmalion 13b
Do you believe the problem could be that my KoboldAI is outdated? I did download the one from henk717 at https://github.com/KoboldAI/KoboldAI-Client but it was a little while ago.
- Training code and dataset for ProfitsBot_V0 experiments
I'm not too familiar with Kobold, but it looks like you would have to write an inputModifier and outputModifier, https://github.com/KoboldAI/KoboldAI-Client/pull/4
- ADOBE being ADOBE...
This is a crowdsourced distributed cluster of Image generation workers and text generation workers.
- How to run Pygmalion: useful links
The main branch of KoboldAI (https://github.com/KoboldAI/KoboldAI-Client) doesn't yet have support for 4-bit models. That's a problem for people who have under 16 GB of VRAM. I use a branch with 4-bit support: https://github.com/0cc4m/KoboldAI. Instructions are available there, but basically you'll need to get both the original model https://huggingface.co/PygmalionAI/pygmalion-6b and the 4-bit version https://huggingface.co/mayaeary/pygmalion-6b-4bit-128g. Put the 4-bit safetensors file into the full model's folder and rename it to "4bit-128g.safetensors".
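The steps from the comment above could be sketched as shell commands. This is a rough, unverified outline using the repository URLs given in the comment; the exact layout of the quantized repo is an assumption, so check its file listing before copying.

```shell
# Sketch of the setup described above. Assumes git and git-lfs are installed.

# 1. Get the 4-bit-capable KoboldAI branch
git clone https://github.com/0cc4m/KoboldAI

# 2. Get the original full model
git clone https://huggingface.co/PygmalionAI/pygmalion-6b

# 3. Get the 4-bit quantized weights
git clone https://huggingface.co/mayaeary/pygmalion-6b-4bit-128g

# 4. Copy the 4-bit safetensors file into the full model's folder under the
#    filename the 4-bit branch expects (glob used because the exact source
#    filename inside the quantized repo is an assumption)
cp pygmalion-6b-4bit-128g/*.safetensors pygmalion-6b/4bit-128g.safetensors
```

After this, loading the pygmalion-6b folder from the 0cc4m branch should pick up the 4-bit weights, per the instructions in that repo.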
Stats
KoboldAI/KoboldAI-Client is an open-source project licensed under the GNU Affero General Public License v3.0, an OSI-approved license.
The primary programming language of KoboldAI-Client is Python.
Popular Comparisons
- KoboldAI-Client VS TavernAI
- KoboldAI-Client VS text-generation-webui
- KoboldAI-Client VS Open-Assistant
- KoboldAI-Client VS KoboldAI
- KoboldAI-Client VS Clover-Edition
- KoboldAI-Client VS stable-diffusion-webui
- KoboldAI-Client VS llama
- KoboldAI-Client VS gpt-neo_dungeon
- KoboldAI-Client VS koboldcpp
- KoboldAI-Client VS InvokeAI