gpt-neox VS lm-evaluation-harness

Compare gpt-neox vs lm-evaluation-harness and see how they differ.

gpt-neox

An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library. (by EleutherAI)

lm-evaluation-harness

A framework for few-shot evaluation of language models. (by EleutherAI)
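As a rough sketch of what "few-shot evaluation" looks like in practice, recent releases of lm-evaluation-harness ship an `lm_eval` command-line tool that can score a Hugging Face model on a benchmark task. The model and task names below are illustrative choices, not recommendations from this comparison:

```shell
# Assumption: a recent lm-evaluation-harness release that exposes the lm_eval CLI
pip install lm-eval

# Evaluate an EleutherAI model on the LAMBADA task.
# --num_fewshot sets how many in-context examples are prepended to each prompt
# (0 here, i.e. zero-shot); raise it for few-shot evaluation.
lm_eval --model hf \
    --model_args pretrained=EleutherAI/pythia-160m \
    --tasks lambada_openai \
    --num_fewshot 0 \
    --batch_size 8
```

The harness prints per-task metrics (e.g. accuracy and perplexity for LAMBADA) when the run finishes; swapping `--tasks` lets the same model be scored across many benchmarks with one interface.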
                 gpt-neox             lm-evaluation-harness
Mentions         52                   34
Stars            6,556                4,848
Growth           2.0%                 15.6%
Activity         9.0                  9.9
Latest commit    2 days ago           4 days ago
Language         Python               Python
License          Apache License 2.0   MIT License
Mentions - the total number of mentions we have tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative measure of how actively a project is being developed; recent commits are weighted more heavily than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

gpt-neox

Posts with mentions or reviews of gpt-neox. We have used some of these posts to build our list of alternatives and similar projects. The most recent mention was on 2024-02-26.

lm-evaluation-harness

Posts with mentions or reviews of lm-evaluation-harness. We have used some of these posts to build our list of alternatives and similar projects. The most recent mention was on 2024-04-09.

What are some alternatives?

When comparing gpt-neox and lm-evaluation-harness you can also consider the following projects:

fairseq - Facebook AI Research Sequence-to-Sequence Toolkit written in Python.

BIG-bench - Beyond the Imitation Game collaborative benchmark for measuring and extrapolating the capabilities of language models

gpt-neo - An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.

aitextgen - A robust Python tool for text-based AI training and generation using GPT-2.

DeepSpeed - DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.

YaLM-100B - Pretrained language model with 100B parameters

StableLM - StableLM: Stability AI Language Models

open-ai - A widely used, community-supported PHP SDK for OpenAI GPT-3 and DALL-E, compatible with Laravel, Symfony, Yii, CakePHP, or any other PHP framework; it also supports ChatGPT-style streaming.

transformers - 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

Megatron-DeepSpeed - Ongoing research on training transformer language models at scale, including BERT and GPT-2.

koboldcpp - A simple one-file way to run various GGML and GGUF models with KoboldAI's UI