Top 5 Python mixture-of-expert Projects
-
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Project mention: Using --deepspeed requires lots of manual tweaking | reddit.com/r/Oobabooga | 2023-05-11
Filed a discussion item on the deepspeed project: https://github.com/microsoft/DeepSpeed/discussions/3531
-
hivemind
Decentralized deep learning in PyTorch. Built to train models across thousands of volunteer machines around the world.
Project mention: Do you think that AI research will slow down to a halt because of regulation? | reddit.com/r/singularity | 2023-05-21
Not if we rise to meet that challenge. Here are a few tools that facilitate AI research in the face of an advanced persistent threat: Hivemind, a distributed PyTorch framework.
-
-
mixture-of-experts
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
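The layer from Shazeer et al.'s paper routes each input to only the top-k experts chosen by a learned gate, so most experts are skipped per example. A minimal plain-Python sketch of that gating idea (simplified for illustration; this is not code from the repository above):

```python
import math

def top_k_gating(gate_logits, k):
    """Keep the k largest gate logits, softmax over them, zero the rest."""
    top = sorted(range(len(gate_logits)), key=lambda i: gate_logits[i], reverse=True)[:k]
    exps = {i: math.exp(gate_logits[i]) for i in top}
    total = sum(exps.values())
    return [exps[i] / total if i in exps else 0.0 for i in range(len(gate_logits))]

def moe_forward(x, experts, gate_weights, k=2):
    """Route input x through the k highest-scoring experts and mix their outputs."""
    # Gate logits: one linear score per expert (gate_weights is one weight row per expert).
    logits = [sum(w * xi for w, xi in zip(ws, x)) for ws in gate_weights]
    gates = top_k_gating(logits, k)
    out = [0.0] * len(x)
    for gate, expert in zip(gates, experts):
        if gate > 0.0:  # sparsity: experts with zero gate are never evaluated
            y = expert(x)
            out = [o + gate * yi for o, yi in zip(out, y)]
    return out
```

The real layer adds noise to the logits for load balancing and runs experts as neural networks in a batch, but the routing structure is the same.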
-
tutel
Tutel MoE: an optimized mixture-of-experts implementation from Microsoft.
mixture-of-experts
A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models (by lucidrains)
Python mixture-of-experts related posts
- Do you think that AI research will slow down to a halt because of regulation?
- [D] Google "We Have No Moat, And Neither Does OpenAI": Leaked Internal Google Document Claims Open Source AI Will Outcompete Google and OpenAI
- Run 100B+ language models at home, BitTorrent‑style
- We are building an AI that will be managed and trained by a DAO. It'll use GolemNetwork in order to process its models.
- CAI chose the path of failure. I'd like to offer a unique skill I have to help Pygmalion
- GPT-4 Will Be 500x Smaller Than People Think - Here Is Why
- Ask HN: Can you crowdfund the compute for GPT?
Index
What are some of the best open-source mixture-of-expert projects in Python? This list will help you:
| # | Project | Stars |
|---|---------|-------|
| 1 | DeepSpeed | 25,088 |
| 2 | hivemind | 1,502 |
| 3 | mixture-of-experts | 442 |
| 4 | tutel | 411 |
| 5 | mixture-of-experts | 261 |