Top 10 Python Mixture-of-Experts Projects
- DeepSpeed
  DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
- optillm
  Project mention: Show HN: DeepThink Plugin – Bring Gemini 2.5's parallel reasoning to open models | news.ycombinator.com | 2025-06-18
  Increases inference time but significantly improves answer quality.
  Link: https://github.com/codelion/optillm/tree/main/optillm/plugin...
  (A generic parallel-sampling sketch of this trade-off appears after this list.)
- hivemind
  Decentralized deep learning in PyTorch. Built to train models on thousands of volunteers across the world. (A minimal usage sketch appears after this list.)
- mixture-of-experts
  PyTorch re-implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. (https://arxiv.org/abs/1701.06538). A sketch of the sparsely-gated layer appears after this list.
- mixture-of-experts
  A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models (by lucidrains).
- st-moe-pytorch
  Implementation of ST-MoE, the latest incarnation of mixture-of-experts after years of research at Google Brain, in PyTorch.
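The optillm mention above describes an inference-time trade-off: run several reasoning passes in parallel, spend more compute, and get a more reliable answer. The sketch below is only a generic illustration of that idea (sample N candidates and keep the most common answer); it is not optillm's or the DeepThink plugin's actual implementation, and `generate_answer` is a hypothetical stand-in for whatever model call you use.

```python
from collections import Counter
import random

def generate_answer(prompt: str, temperature: float = 0.8) -> str:
    """Hypothetical stand-in for an LLM call; replace with your own client."""
    # Simulate a noisy model that usually, but not always, returns the right answer.
    return random.choices(["42", "41", "43"], weights=[0.6, 0.2, 0.2])[0]

def parallel_reasoning(prompt: str, n: int = 8) -> str:
    """Sample n independent answers and return the most common one.

    Inference cost grows roughly n-fold, but agreement across samples tends to
    filter out one-off reasoning mistakes.
    """
    answers = [generate_answer(prompt) for _ in range(n)]
    most_common, _count = Counter(answers).most_common(1)[0]
    return most_common

print(parallel_reasoning("What is 6 * 7?"))
```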
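The hivemind entry refers to collaborative training over the internet: each peer wraps its local optimizer, and peers discover one another through a DHT. The sketch below loosely follows the quickstart in hivemind's README; treat the argument names as assumptions, since they may differ across hivemind versions, and the toy model and run id are made up for illustration.

```python
import torch
import torch.nn as nn
import hivemind  # pip install hivemind

# A tiny local model and optimizer; each volunteer peer runs this same script.
model = nn.Linear(16, 2)
local_opt = torch.optim.SGD(model.parameters(), lr=0.1)

# Join (or start) the peer-to-peer DHT used for discovery and averaging.
# NOTE: the arguments below follow hivemind's published quickstart, but are an
# assumption here and may vary between versions.
dht = hivemind.DHT(start=True)
print("initial_peers for others to join:", [str(a) for a in dht.get_visible_maddrs()])

opt = hivemind.Optimizer(
    dht=dht,                  # peer-to-peer network handle
    run_id="demo_run",        # all peers with the same run_id train together
    optimizer=local_opt,      # the wrapped local optimizer
    batch_size_per_step=32,   # samples contributed per opt.step() call
    target_batch_size=1024,   # collective batch size before parameters are averaged
)

# Ordinary training loop; averaging with other peers happens inside opt.step().
for _ in range(10):
    x = torch.randn(32, 16)
    loss = model(x).pow(2).mean()
    loss.backward()
    opt.step()
    opt.zero_grad()
```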
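Both mixture-of-experts repositories above implement the sparsely-gated layer described in Shazeer et al. (2017): a small gating network scores every expert per token, keeps only the top-k, and combines the selected experts' outputs with the normalized gate weights. The sketch below is a minimal, self-contained illustration of that idea in plain PyTorch; it is not the API of either repository, and the names (`TopKMoE`, `num_experts`, `k`) are illustrative only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal sparsely-gated mixture-of-experts layer (illustrative only)."""

    def __init__(self, dim: int, num_experts: int = 8, k: int = 2, hidden: int = 512):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(dim, num_experts, bias=False)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, dim) -> flatten tokens for routing
        tokens = x.reshape(-1, x.shape[-1])
        scores = self.gate(tokens)                    # (tokens, num_experts)
        top_w, top_idx = scores.topk(self.k, dim=-1)  # keep the k best experts per token
        top_w = F.softmax(top_w, dim=-1)              # normalize gate weights over those k

        out = torch.zeros_like(tokens)
        for j, expert in enumerate(self.experts):
            for slot in range(self.k):
                mask = top_idx[:, slot] == j          # tokens routed to expert j in this slot
                if mask.any():
                    out[mask] += top_w[mask, slot:slot + 1] * expert(tokens[mask])
        return out.reshape_as(x)

# Example: route a batch of token embeddings through the layer.
moe = TopKMoE(dim=64, num_experts=8, k=2)
y = moe(torch.randn(4, 16, 64))  # (batch=4, seq=16, dim=64)
print(y.shape)                   # torch.Size([4, 16, 64])
```

Because only k of the experts run for any given token, the parameter count grows with the number of experts while the per-token compute stays roughly constant, which is the point the paper and these repositories make.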
Python mixture-of-experts related posts
- Hertz-dev, the first open-source base model for conversational audio
- Would anyone be interested in contributing to some group projects?
- [Rumor] Potential GPT-4 architecture description
- Hive mind: Train deep learning models on thousands of volunteers across the world
- Could a model not be trained by a decentralized network? Like SETI@home, or kinda-sorta like Bitcoin. Petals accomplishes this somewhat, but if raw computer power is the only barrier to open source, I'd be happy to try organizing decentralized computing efforts
- Orca (built on llama13b) looks like the new sheriff in town
- Do you think that AI research will slow down to a halt because of regulation?
Index
What are some of the best open-source mixture-of-experts projects in Python? This list will help you:
| # | Project | Stars |
|---|---------|-------|
| 1 | DeepSpeed | 39,055 |
| 2 | optillm | 2,550 |
| 3 | mixtral-offloading | 2,303 |
| 4 | hivemind | 2,209 |
| 5 | MoE-LLaVA | 2,182 |
| 6 | mixture-of-experts | 1,089 |
| 7 | mixture-of-experts | 763 |
| 8 | mergoo | 481 |
| 9 | st-moe-pytorch | 339 |
| 10 | attention-models | 4 |