Mixture-of-experts Alternatives
Similar projects and alternatives to mixture-of-experts, matched on shared topics and primary language
-
conformer
Implementation of the convolutional module from the Conformer paper, for use in Transformers (by lucidrains)
-
st-moe-pytorch
Implementation of ST-MoE, the latest incarnation of MoE after years of research at Google Brain, in PyTorch
-
tab-transformer-pytorch
Implementation of TabTransformer, an attention network for tabular data, in PyTorch
-
enformer-pytorch
Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch
mixture-of-experts reviews and mentions
-
How to Go beyond Data Parallelism and Model Parallelism: Lessons from GShard
Code for https://arxiv.org/abs/2006.16668: https://github.com/lucidrains/mixture-of-experts
Stats
lucidrains/mixture-of-experts is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of mixture-of-experts is Python.
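To illustrate the core idea these projects share, here is a minimal pure-Python sketch of top-1 mixture-of-experts routing: a gate scores each expert, the highest-scoring expert processes the token, and the output is scaled by the gate probability. This is an assumption-laden toy for intuition only; the function and variable names are hypothetical and it does not reflect the actual API of lucidrains/mixture-of-experts.

```python
# Toy sketch of top-1 MoE routing (illustrative only; not the library's API).
import math

def softmax(logits):
    # numerically stable softmax over a list of floats
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(token, gate_weights, experts):
    """Route one token vector to the expert with the highest gate score.

    token:        list[float], the input vector
    gate_weights: one weight vector per expert (dot-product gating)
    experts:      list of callables, each mapping a vector to a vector
    """
    logits = [sum(w * x for w, x in zip(wv, token)) for wv in gate_weights]
    probs = softmax(logits)
    best = max(range(len(probs)), key=lambda i: probs[i])
    out = experts[best](token)
    # scale by the gate probability so the gate stays differentiable
    # in a real autograd setup
    return [probs[best] * y for y in out], best

# toy usage: two "experts" that transform the input differently
experts = [lambda v: [2.0 * x for x in v], lambda v: [-1.0 * x for x in v]]
gate_weights = [[1.0, 0.0], [0.0, 1.0]]
out, chosen = moe_forward([3.0, 1.0], gate_weights, experts)
```

Real implementations such as ST-MoE add top-k routing, capacity limits, and auxiliary load-balancing losses on top of this basic gating step.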