dbrx
Code examples and resources for DBRX, a large language model developed by Databricks (by databricks)
mixtral-offloading
Run Mixtral-8x7B models in Colab or consumer desktops (by dvmazur)
| | dbrx | mixtral-offloading |
|---|---|---|
| Mentions | 4 | 3 |
| Stars | 2,407 | 2,238 |
| Growth | 94.6% | - |
| Activity | 5.9 | 8.6 |
| Latest commit | 8 days ago | about 1 month ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 or later | MIT License |
Mentions - the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative measure of how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
dbrx
Posts with mentions or reviews of dbrx. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-08.
- Hello OLMo: An Open LLM
One thing I wanted to add and call attention to is the importance of licensing in open models. This is often overlooked when we blindly accept the vague branding of models as "open", but I am noticing that many open-weight models are actually using encumbered proprietary licenses rather than standard, OSI-approved open source licenses (https://opensource.org/licenses). As an example, Databricks's DBRX model has a proprietary license that forces adherence to their highly restrictive Acceptable Use Policy by referencing a live website hosting their AUP (https://github.com/databricks/dbrx/blob/main/LICENSE), which means that as they change their AUP, you may be further restricted in the future. Meta's Llama is similar (https://github.com/meta-llama/llama/blob/main/LICENSE). I'm not sure who can depend on these models given this flaw.
- DBRX: A New Open LLM
Sorry, I forgot to link the repository and missed the edit window by the time I realized.
[1] https://github.com/databricks/dbrx
mixtral-offloading
Posts with mentions or reviews of mixtral-offloading. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-03-27.
- DBRX: A New Open LLM
Waiting for mixed quantization with HQQ and MoE offloading [1]. With that I was able to run Mixtral-8x7B on my RTX 3080 with 10 GB of VRAM... This should work for DBRX too and should shave off a ton of the VRAM requirement. (A hedged sketch of this quantize-and-offload idea follows the list below.)
1. https://github.com/dvmazur/mixtral-offloading?tab=readme-ov-...
- Mixtral in Colab
- Run Mixtral-8x7B models in Colab or consumer desktops
What are some alternatives?
When comparing dbrx and mixtral-offloading you can also consider the following projects:
llama - Inference code for Llama models
lightning-mlflow-hf - Use QLoRA to tune LLM in PyTorch-Lightning w/ Huggingface + MLflow
xTuring - Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our discord community: https://discord.gg/TgHXuSJEk6
OLMo - Modeling, training, eval, and inference code for OLMo
makeMoE - From-scratch implementation of a sparse mixture-of-experts language model, inspired by Andrej Karpathy's makemore :)