mixture-of-experts
A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models (by lucidrains)
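The repository follows the sparsely-gated, top-2 routed MoE design from the GShard line of work (see the mention below linking to arXiv:2006.16668). As a rough orientation only, here is a minimal PyTorch sketch of that idea; the class name, arguments, and the omission of capacity limits and the load-balancing auxiliary loss are illustrative assumptions, not the library's actual API.

```python
# Minimal sketch of a top-2 sparsely-gated MoE layer (illustrative, not the repo's API).
import torch
import torch.nn as nn

class TopTwoMoE(nn.Module):
    def __init__(self, dim, num_experts=8, hidden_dim=None):
        super().__init__()
        hidden_dim = hidden_dim or dim * 4
        # router: produces one logit per expert for every token
        self.gate = nn.Linear(dim, num_experts, bias=False)
        # each expert is a small feed-forward network
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, hidden_dim), nn.GELU(), nn.Linear(hidden_dim, dim))
            for _ in range(num_experts)
        ])

    def forward(self, x):
        # x: (batch, seq, dim) -> flatten tokens for routing
        tokens = x.reshape(-1, x.shape[-1])
        probs = self.gate(tokens).softmax(dim=-1)        # (tokens, experts)
        top2_probs, top2_idx = probs.topk(2, dim=-1)     # route each token to its two best experts
        out = torch.zeros_like(tokens)
        for k in range(2):
            for e, expert in enumerate(self.experts):
                mask = top2_idx[:, k] == e               # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += top2_probs[mask, k:k+1] * expert(tokens[mask])
        return out.reshape_as(x)

moe = TopTwoMoE(dim=512, num_experts=8)
y = moe(torch.randn(2, 16, 512))  # output has the same shape as the input
```

A full implementation additionally enforces a per-expert capacity and adds a load-balancing auxiliary loss so that tokens spread evenly across experts; both are left out of this sketch.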
st-moe-pytorch
Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch (by lucidrains)
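ST-MoE's main stability addition over earlier sparse MoE designs is the router z-loss, an auxiliary term that penalizes large router logits. Below is a minimal sketch of that loss following the formula in the ST-MoE paper; the function name and the coefficient are assumptions for illustration and do not reflect the repository's exact code.

```python
# Sketch of the router z-loss from the ST-MoE paper (illustrative names/coefficient).
import torch

def router_z_loss(router_logits: torch.Tensor) -> torch.Tensor:
    """Mean over tokens of (logsumexp over expert logits) squared."""
    z = torch.logsumexp(router_logits, dim=-1)  # (num_tokens,)
    return (z ** 2).mean()

logits = torch.randn(1024, 8)        # router logits for 1024 tokens over 8 experts
aux = 1e-3 * router_z_loss(logits)   # small coefficient, added to the main training loss
```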
| | mixture-of-experts | st-moe-pytorch |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 525 | 223 |
| Growth | - | - |
| Activity | 4.1 | 7.8 |
| Last commit | 8 months ago | 2 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we track.
mixture-of-experts
Posts with mentions or reviews of mixture-of-experts. We have used some of these posts to build our list of alternatives and similar projects.
- How to Go beyond Data Parallelism and Model Parallelism: Talking from GShard [R]
Code for https://arxiv.org/abs/2006.16668 found: https://github.com/lucidrains/mixture-of-experts
st-moe-pytorch
Posts with mentions or reviews of st-moe-pytorch. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-05.
What are some alternatives?
When comparing mixture-of-experts and st-moe-pytorch, you can also consider the following projects:
uformer-pytorch - Implementation of Uformer, Attention-based Unet, in Pytorch
OpenMoE - A family of open-sourced Mixture-of-Experts (MoE) Large Language Models
conformer - Implementation of the convolutional module from the Conformer paper, for use in Transformers
spaCy - 💫 Industrial-strength Natural Language Processing (NLP) in Python