Python mixture-of-experts

Open-source Python projects categorized as mixture-of-experts

Top 9 Python mixture-of-experts Projects

  • DeepSpeed

    DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.

  • Project mention: Can we discuss MLOps, Deployment, Optimizations, and Speed? | /r/LocalLLaMA | 2023-12-06

    DeepSpeed can handle parallelism concerns, and can even offload data/model state to RAM, or even NVMe (!?). I'm surprised I don't see this project used more.
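
    The offloading the commenter mentions is driven by DeepSpeed's ZeRO configuration. A minimal sketch of what such a config might look like (field names follow DeepSpeed's ZeRO offload settings; the NVMe path and batch size are illustrative assumptions):

    ```python
    # Sketch of a DeepSpeed config enabling ZeRO stage 3 with parameter and
    # optimizer-state offload. "device" may be "cpu" (RAM) or "nvme".
    ds_config = {
        "train_batch_size": 8,
        "zero_optimization": {
            "stage": 3,
            "offload_param": {"device": "nvme", "nvme_path": "/local_nvme"},
            "offload_optimizer": {"device": "nvme", "nvme_path": "/local_nvme"},
        },
        "bf16": {"enabled": True},
    }
    # This dict would then be passed to deepspeed.initialize(...)
    ```

    Offloading trades GPU memory for PCIe/NVMe bandwidth, which is why it lets large models run on modest cards at some speed cost.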

  • LLaMA-Factory

    Unify Efficient Fine-Tuning of 100+ LLMs

  • Project mention: Show HN: GPU Prices on eBay | | 2024-02-23

    Depends what model you want to train, and how well you want your computer to keep working while you're doing it.

    If you're interested in large language models there's a table of vram requirements for fine-tuning at [1] which says you could do the most basic type of fine-tuning on a 7B parameter model with 8GB VRAM.

    You'll find that training takes quite a long time, and as a lot of the GPU power is going on training, your computer's responsiveness will suffer - even basic things like scrolling in your web browser or changing tabs uses the GPU, after all.

    Spend a bit more and you'll probably have a better time.
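
    The "7B model in 8 GB VRAM" figure can be sanity-checked with back-of-envelope arithmetic. A rough sketch (the flat 2 GB overhead allowance is an assumption; real usage depends on sequence length, batch size, and implementation):

    ```python
    def rough_finetune_vram_gb(n_params_billions, weight_bits, overhead_gb=2.0):
        """Back-of-envelope VRAM estimate: quantized base weights plus a flat
        allowance for adapter weights, activations, and framework overhead.
        Illustrative only, not a substitute for the project's own tables."""
        weights_gb = n_params_billions * 1e9 * (weight_bits / 8) / 1024**3
        return weights_gb + overhead_gb

    # A 7B model with 4-bit weights: ~3.3 GB of weights plus overhead,
    # about 5.3 GB total -- comfortably under the quoted 8 GB budget.
    estimate = rough_finetune_vram_gb(7, 4)
    ```

    The same arithmetic shows why full 16-bit fine-tuning of the same model (14 GB of weights alone, before optimizer state) is out of reach for consumer cards.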


  • mixtral-offloading

    Run Mixtral-8x7B models in Colab or consumer desktops

  • Project mention: DBRX: A New Open LLM | | 2024-03-27

    Waiting for Mixed Quantization with MQQ and MoE Offloading [1]. With that I was able to run Mistral 8x7B on my 10 GB VRAM rtx3080... This should work for DBRX and should shave off a ton of VRAM requirement.
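
    The core idea behind MoE offloading is that only a few experts fire per token, so the rest can live in host RAM while a small cache of recently used experts stays in VRAM. A toy sketch of that caching policy (plain LRU; this is not the project's actual API):

    ```python
    from collections import OrderedDict

    class ExpertCache:
        """Toy LRU cache standing in for 'keep a few experts on the GPU,
        evict the least recently used one on a miss'. Illustrative only."""
        def __init__(self, capacity, load_fn):
            self.capacity = capacity      # how many experts fit in VRAM
            self.load_fn = load_fn        # e.g. copies weights host -> device
            self.cache = OrderedDict()

        def get(self, expert_id):
            if expert_id in self.cache:
                self.cache.move_to_end(expert_id)   # mark as recently used
                return self.cache[expert_id]
            if len(self.cache) >= self.capacity:
                self.cache.popitem(last=False)      # evict LRU expert
            self.cache[expert_id] = self.load_fn(expert_id)
            return self.cache[expert_id]

    # With 8 experts but VRAM for only 3, most lookups still hit the cache
    # because routing tends to reuse the same few experts on nearby tokens.
    cache = ExpertCache(capacity=3, load_fn=lambda i: f"expert-{i}-weights")
    ```

    Combining such offloading with aggressive quantization of the cached experts is what makes an 8x7B model fit in 10 GB.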


  • hivemind

    Decentralized deep learning in PyTorch. Built to train models on thousands of volunteers across the world.

  • Project mention: You can now train a 70B language model at home | | 2024-03-07

  • mixture-of-experts

    PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al.

  • Project mention: [Rumor] Potential GPT-4 architecture description | /r/LocalLLaMA | 2023-06-20
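
    The layer this repo re-implements works by scoring every expert, keeping only the top-k, and mixing their outputs by a renormalized softmax. A minimal scalar-input sketch of that gating (a toy; the paper gates vectors and adds noise for load balancing):

    ```python
    import math

    def sparse_moe(x, experts, gate_weights, k=2):
        """Minimal sketch of sparsely-gated MoE for one scalar input:
        score every expert, keep the top-k, softmax over the survivors,
        and return the gate-weighted sum of the selected outputs."""
        logits = [w * x for w in gate_weights]          # one logit per expert
        top = sorted(range(len(experts)),
                     key=lambda i: logits[i], reverse=True)[:k]
        z = [math.exp(logits[i]) for i in top]
        gates = [v / sum(z) for v in z]                 # softmax over top-k only
        return sum(g * experts[i](x) for g, i in zip(gates, top))

    # Only two of four experts are evaluated per input; the rest cost nothing.
    experts = [lambda x, m=m: m * x for m in (1.0, 2.0, 3.0, 4.0)]
    y = sparse_moe(1.0, experts, gate_weights=[0.1, 0.2, 0.3, 0.4], k=2)
    ```

    This is the trick that lets parameter count grow with the number of experts while per-token compute stays roughly constant.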
  • tutel

    Tutel MoE: An Optimized Mixture-of-Experts Implementation

  • mixture-of-experts

    A Pytorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models (by lucidrains)

  • st-moe-pytorch

    Implementation of ST-Moe, the latest incarnation of MoE after years of research at Brain, in Pytorch

  • Project mention: will the point meet in 2024? | /r/LocalLLaMA | 2023-12-05
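
    One detail the ST-MoE paper is known for is an auxiliary "router z-loss" that penalizes large router logits to stabilize training: the batch mean of logsumexp(logits) squared. A sketch from the published formula (not from this repo's code):

    ```python
    import math

    def router_z_loss(logit_batch):
        """Batch mean of logsumexp(logits)**2 over router logits, per the
        ST-MoE paper's z-loss. Illustrative pure-Python version."""
        total = 0.0
        for logits in logit_batch:
            lse = math.log(sum(math.exp(v) for v in logits))
            total += lse ** 2
        return total / len(logit_batch)

    # Larger logits incur a sharply larger penalty, nudging the router
    # to keep its pre-softmax values small.
    small = router_z_loss([[0.1, -0.1]])
    big = router_z_loss([[5.0, -5.0]])
    ```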
  • mergoo

    A library for easily merging multiple LLM experts, and efficiently train the merged LLM.

  • Project mention: A Library to build MoE from HF models | | 2024-04-08

NOTE: The open source projects on this list are ordered by number of GitHub stars. The number of mentions indicates repo mentions in the last 12 months or since we started tracking (Dec 2020). The latest post mention was on 2024-04-08.

What are some of the best open-source mixture-of-experts projects in Python? This list will help you:

 #  Project              Stars
 1  DeepSpeed           32,447
 2  LLaMA-Factory       16,319
 3  mixtral-offloading   2,226
 4  hivemind             1,833
 5  mixture-of-experts     818
 6  tutel                  648
 7  mixture-of-experts     513
 8  st-moe-pytorch         213
 9  mergoo                 127
