| | dbrx | makeMoE |
|---|---|---|
| Mentions | 4 | 3 |
| Stars | 2,455 | 538 |
| Growth | 3.7% | - |
| Activity | 5.8 | 9.0 |
| Last commit | about 2 months ago | 2 months ago |
| Language | Python | Jupyter Notebook |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
dbrx
- Hello OLMo: An Open LLM
One thing I wanted to add and call attention to is the importance of licensing in open models. This is often overlooked when we blindly accept the vague branding of models as “open”, but I am noticing that many open-weight models are actually using encumbered proprietary licenses rather than standard, OSI-approved open source licenses (https://opensource.org/licenses). As an example, Databricks's DBRX model has a proprietary license that forces adherence to their highly restrictive Acceptable Use Policy by referencing a live website hosting their AUP (https://github.com/databricks/dbrx/blob/main/LICENSE), which means that as they change their AUP, you may be further restricted in the future. Meta's Llama is similar (https://github.com/meta-llama/llama/blob/main/LICENSE). I'm not sure who can depend on these models given this flaw.
- DBRX: A New Open LLM
Sorry, I forgot to link the repository and missed the edit window by the time I realized.
[1] https://github.com/databricks/dbrx
makeMoE
- DBRX: A New Open LLM
I created this repo, and the linked blog post will help in understanding this: https://github.com/AviSoori1x/makeMoE
- FLaNK AI Weekly 25 March 2025
- Implementation of a mixture-of-experts language model in a single file of PyTorch
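The core idea behind a sparse mixture-of-experts layer like the one makeMoE implements is simple: a gating network scores all experts for each input, only the top-k experts actually run, and their outputs are mixed by the (renormalized) gate weights. A minimal dependency-free sketch of that routing step is below; the names `Expert`, `moe_forward`, and all dimensions are illustrative, not taken from the repo:

```python
import math
import random

random.seed(0)

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

class Expert:
    """A toy 'expert': a single random linear map on a feature vector."""
    def __init__(self, dim):
        self.w = [[random.gauss(0, 0.1) for _ in range(dim)] for _ in range(dim)]

    def __call__(self, x):
        return [sum(row[i] * x[i] for i in range(len(x))) for row in self.w]

def moe_forward(x, experts, gate_w, k=2):
    """Sparse MoE layer: route x to its top-k experts, mix by gate weights."""
    # Gating network: one logit per expert, turned into probabilities.
    logits = [sum(gw[i] * x[i] for i in range(len(x))) for gw in gate_w]
    probs = softmax(logits)
    # Only the k highest-scoring experts are evaluated (the "sparse" part).
    topk = sorted(range(len(experts)), key=lambda j: probs[j], reverse=True)[:k]
    norm = sum(probs[j] for j in topk)  # renormalize over the chosen experts
    out = [0.0] * len(x)
    for j in topk:
        y = experts[j](x)
        out = [o + (probs[j] / norm) * yi for o, yi in zip(out, y)]
    return out, topk

dim, n_experts = 4, 4
experts = [Expert(dim) for _ in range(n_experts)]
gate_w = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_experts)]
y, chosen = moe_forward([1.0, -0.5, 0.3, 0.7], experts, gate_w, k=2)
print(len(y), len(chosen))  # → 4 2: output dim preserved, only 2 experts ran
```

In a real implementation (as in makeMoE) the experts are feed-forward networks, the gate is a learned linear layer, and routing is done per token in a batch, but the top-k selection and weighted combination are the same mechanism.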
What are some alternatives?
OLMo - Modeling, training, eval, and inference code for OLMo
mergekit - Tools for merging pretrained large language models.
llama - Inference code for Llama models
spring-ai - An Application Framework for AI Engineering
tabby-backend-py - Tabby (self-hosted AI coding assistant) server in 20 lines of python
FeatUp - Official code for "FeatUp: A Model-Agnostic Framework for Features at Any Resolution", ICLR 2024
xTuring - Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our discord community: https://discord.gg/TgHXuSJEk6
examples - This repository will contain examples of use cases that utilize the Decodable streaming solution
mixtral-offloading - Run Mixtral-8x7B models in Colab or consumer desktops