gpt-neox VS Megatron-DeepSpeed

Compare gpt-neox and Megatron-DeepSpeed and see how they differ.

gpt-neox

An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library. (by EleutherAI)

Megatron-DeepSpeed

Ongoing research training transformer language models at scale, including: BERT & GPT-2 (by bigscience-workshop)
              gpt-neox              Megatron-DeepSpeed
Mentions      52                    1
Stars         6,569                 1,242
Growth        2.2%                  5.6%
Activity      8.9                   2.4
Last commit   4 days ago            about 1 month ago
Language      Python                Python
License       Apache License 2.0    GNU General Public License v3.0 or later
Mentions indicates the total number of mentions that we've tracked, plus the number of user-suggested alternatives.
Stars is the number of stars a project has on GitHub. Growth is the month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
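The site does not publish the exact formula behind the activity number, but the description above (a relative score in which recent commits weigh more than older ones) is consistent with a recency-weighted sum over commits. The sketch below is purely hypothetical: the exponential decay, the 30-day half-life, and the `activity_score` helper are all assumptions made for illustration, not the site's actual metric.

```python
# Hypothetical sketch of a recency-weighted activity score: each commit
# contributes a weight that halves every `half_life_days`, so recent
# commits dominate the total. Illustrative only; not the real formula.
import math

def activity_score(commit_ages_days, half_life_days=30.0):
    """Sum of per-commit weights that halve every `half_life_days`."""
    decay = math.log(2) / half_life_days
    return sum(math.exp(-decay * age) for age in commit_ages_days)

# Five recent commits outscore five commits spread over the past year:
recent = activity_score([1, 2, 3, 5, 8])           # ~4.6
stale  = activity_score([60, 120, 180, 240, 300])  # ~0.3
```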

gpt-neox

Posts with mentions or reviews of gpt-neox. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-02-26.

Megatron-DeepSpeed

Posts with mentions or reviews of Megatron-DeepSpeed. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-02-17.
  • [D] DeepSpeed vs PyTorch native API
    3 projects | /r/MachineLearning | 17 Feb 2022
    Both EleutherAI's gpt-neox and the BigScience project use DeepSpeed under the hood, probably because DeepSpeed still remains the best component for training large models. So whether DeepSpeed is still your answer really depends on your scale, and on whether you can get away with the native PyTorch alternatives.
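To make the "DeepSpeed under the hood" point concrete, here is a minimal, illustrative sketch of handing a PyTorch model to DeepSpeed's training engine. The toy model, batch size, and ZeRO stage are hypothetical choices, not taken from either repository, and the snippet assumes a recent DeepSpeed release (where `deepspeed.initialize` accepts a config dict) launched on GPUs via the `deepspeed` launcher.

```python
# Minimal sketch: wrapping a toy PyTorch model with DeepSpeed.
# Run with the DeepSpeed launcher, e.g. `deepspeed train.py`.
import torch
import deepspeed

model = torch.nn.Sequential(
    torch.nn.Linear(512, 2048),
    torch.nn.GELU(),
    torch.nn.Linear(2048, 512),
)

ds_config = {
    "train_micro_batch_size_per_gpu": 8,
    "optimizer": {"type": "AdamW", "params": {"lr": 1e-4}},
    "zero_optimization": {"stage": 2},  # shard optimizer state + gradients
    "fp16": {"enabled": True},
}

# initialize returns (engine, optimizer, dataloader, lr_scheduler); the
# engine owns distributed setup, ZeRO sharding, and mixed precision.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

for step in range(10):
    x = torch.randn(8, 512, device=engine.device, dtype=torch.half)
    loss = engine(x).float().pow(2).mean()  # dummy loss for illustration
    engine.backward(loss)  # engine handles fp16 loss scaling internally
    engine.step()          # optimizer step + gradient zeroing
```

The native PyTorch alternative the post alludes to would be something like torch.distributed.fsdp.FullyShardedDataParallel, which shards parameters and gradients without an external engine.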

What are some alternatives?

When comparing gpt-neox and Megatron-DeepSpeed you can also consider the following projects:

fairseq - Facebook AI Research Sequence-to-Sequence Toolkit written in Python.

fairscale - PyTorch extensions for high performance and large scale training.

gpt-neo - An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.

DeepSpeed - DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.

YaLM-100B - Pretrained language model with 100B parameters

open-ai - OpenAI PHP SDK: a widely downloaded, forked, and community-supported PHP SDK for OpenAI GPT-3 and DALL-E, usable from Laravel, Symfony, Yii, CakePHP, or any PHP framework. It also supports ChatGPT-like streaming.

lm-evaluation-harness - A framework for few-shot evaluation of language models.

haystack - LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search, or conversational agent chatbots.

PyTorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration

Megatron-DeepSpeed - Ongoing research training transformer language models at scale, including: BERT & GPT-2

kiri - Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.

gpt-2 - Code for the paper "Language Models are Unsupervised Multitask Learners"