KoboldAI-Client vs mesh-transformer-jax

| | KoboldAI-Client | mesh-transformer-jax |
|---|---|---|
| Mentions | 185 | 52 |
| Stars | 3,369 | 6,213 |
| Growth | - | - |
| Activity | 6.3 | 0.0 |
| Last commit | 2 months ago | over 1 year ago |
| Language | Python | Python |
| License | GNU Affero General Public License v3.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
KoboldAI-Client
- No idea what I'm doing help
- ChatGPT users drop for the first time as people turn to uncensored chatbots
  You can use KoboldAI to run an LLM locally. There are hundreds or thousands of models on Hugging Face. Some uncensored ones are Pygmalion AI (chatbot), Erebus (story-writing AI), and Vicuna (general purpose). (A minimal sketch of calling a locally hosted KoboldAI instance follows this list.)
- Tips for using Kobold with Venus? I am pretty new at everything.
  GPT-J 6B is a pretty weak and outdated model. Nerys 13B would probably give you better replies, but it leans more toward SFW content. Erebus was their best model for erotic roleplay, but they removed it because it went against Google's TOS. You can check out their documentation here.
- I can't do this y'all
  If you do have that kind of hardware, the next step would be looking for a model to run. I came across Kobold's models. Their main GitHub page is here: https://github.com/KoboldAI/KoboldAI-Client
- Question regarding model compatibility for Alpaca Turbo
  Then there are graphical user interfaces like text-generation-webui and gpt4all for general-purpose chat. There are also KoboldAI and SillyTavern, which focus more on storytelling and roleplay and have tools to improve that.
- Running Multiple AI Models Sequentially for a Conversation on a Single GPU
  And finally, the folks from KoboldAI do some interesting stuff with pseudocode and soft prompts that might also be relevant.
- Summoning Life-Size Characters to Your Room: New Update for my Mixed Reality App!
- Feels like the censorship has gotten tighter recently, just me?
- How to get a KoboldAI URL API key!
  Click this link: https://github.com/KoboldAI/KoboldAI-Client/tree/main
- Difficulties installing Pygmalion 13b
  Do you believe the problem could be that my KoboldAI is outdated? I did download the one from henk717 at https://github.com/KoboldAI/KoboldAI-Client, but it was a little while ago.
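One comment above mentions running an LLM locally through KoboldAI. As a rough illustration, the sketch below sends a prompt to a locally running KoboldAI instance over its built-in web API. The base URL, endpoint path, request fields, and response shape are assumptions based on the KoboldAI web API as commonly described; check the API documentation served by your own install before relying on them.

```python
# Hedged sketch: querying a locally hosted KoboldAI instance over its web API.
# The URL, endpoint, and field names below are assumptions; verify them
# against the API docs exposed by your own KoboldAI installation.
import requests

KOBOLD_URL = "http://localhost:5000"  # default local address (assumption)

payload = {
    "prompt": "You are standing in an open field west of a white house.",
    "max_length": 80,     # number of tokens to generate
    "temperature": 0.7,   # sampling temperature
}

resp = requests.post(f"{KOBOLD_URL}/api/v1/generate", json=payload, timeout=120)
resp.raise_for_status()

# The response is expected to contain a list of generated continuations.
for result in resp.json().get("results", []):
    print(result.get("text", ""))
```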
mesh-transformer-jax
- Large Language Models: Comparing Gen2/Gen3 Models (GPT-3, GPT-J, MT5 and More)
  GPT-J is an LLM case study with two goals: training an LLM on a data source containing unique material, and using the Mesh Transformer JAX training framework to achieve high training efficiency through parallelization (a toy parallel-training sketch follows this list). There is no research paper about GPT-J, but its GitHub pages provide the model, different checkpoints, and the complete training source code.
- [R] Parallel Attention and Feed-Forward Net Design for Pre-training and Inference on Transformers
  This idea has already been proposed in ViT-22B and GPT-J-6B.
- Show HN: Finetune LLaMA-7B on commodity GPUs using your own text
- [D] An Instruct Version Of GPT-J Using Stanford Alpaca's Dataset
  Sure. Here's the repo I used for the fine-tuning: https://github.com/kingoflolz/mesh-transformer-jax. I used 5 epochs, and apart from that I kept the default parameters in the repo.
- Boss wants me to use ChatGPT for work, but I refuse to input my personal phone number. Any advice?
- Let's build GPT: from scratch, in code, spelled out by Andrej Karpathy
  You can skip to step 4 using something like GPT-J, as far as I understand: https://github.com/kingoflolz/mesh-transformer-jax#links
  The pretrained model is already available (see the loading sketch after this list).
- Best coding model?
  The GitHub repo suggests you may be able to change the number of checkpoints to make it run on a GPU.
- Ask HN: What language models can I fine-tune at home?
- selfhosted / open-source ChatGPT alternative?
  GPT-J, which uses mesh-transformer-jax: https://github.com/kingoflolz/mesh-transformer-jax
- GPT-J, an open-source alternative to GPT-3
  They hinted at it in the screenshot, but the goods are linked from the https://6b.eleuther.ai page: https://github.com/kingoflolz/mesh-transformer-jax#gpt-j-6b (Apache 2)
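As a companion to the case-study comment above about training efficiency through parallelization, the toy sketch below shows the general JAX pattern of replicating a training step across devices and averaging gradients. It is only an illustration of JAX-style parallel training, not the actual sharding scheme used in mesh-transformer-jax (which splits the model itself across TPU cores); the tiny linear model, batch shapes, and learning rate are stand-ins.

```python
# Toy sketch of data-parallel training with jax.pmap (illustration only;
# mesh-transformer-jax shards the model itself across TPU cores).
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    # A tiny linear model standing in for a transformer block.
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

def train_step(params, x, y):
    loss, grads = jax.value_and_grad(loss_fn)(params, x, y)
    # Average gradients across devices so every replica takes the same step.
    grads = jax.lax.pmean(grads, axis_name="devices")
    params = jax.tree_util.tree_map(lambda p, g: p - 1e-3 * g, params, grads)
    return params, loss

p_train_step = jax.pmap(train_step, axis_name="devices")

n_dev = jax.local_device_count()
params = {"w": jnp.zeros((8, 1)), "b": jnp.zeros((1,))}
# Replicate the parameters and shard the batch across available devices.
replicated = jax.device_put_replicated(params, jax.local_devices())
x = jnp.ones((n_dev, 4, 8))  # [devices, per-device batch, features]
y = jnp.ones((n_dev, 4, 1))

replicated, loss = p_train_step(replicated, x, y)
print(loss)  # one loss value per device
```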
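And for the "pretrained model is already available" point above, here is a minimal sketch of loading the published GPT-J-6B weights through the Hugging Face transformers library instead of training from scratch. It assumes the transformers and torch packages are installed and that enough memory is available (the full-precision checkpoint needs roughly 24 GB); the prompt is just an example.

```python
# Minimal sketch: run the pretrained GPT-J-6B weights via Hugging Face
# transformers instead of training anything yourself (assumes `transformers`
# and `torch` are installed, with ~24 GB of RAM/VRAM for full precision).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

inputs = tokenizer("In a shocking finding, scientists discovered", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```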
What are some alternatives?
TavernAI - Atmospheric adventure chat for AI language models (KoboldAI, NovelAI, Pygmalion, OpenAI ChatGPT, GPT-4)
DeepSpeed - DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
tensorflow - An Open Source Machine Learning Framework for Everyone
Open-Assistant - OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
gpt-neo - An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
KoboldAI - KoboldAI is generative AI software optimized for fictional use, but capable of much more!
jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
Clover-Edition - State of the art AI plays dungeon master to your adventures.
alpaca-lora - Instruct-tune LLaMA on consumer hardware
stable-diffusion-webui - Stable Diffusion web UI
Finetune_LLMs - Repo for fine-tuning Causal LLMs