x-transformers VS minGPT

Compare x-transformers vs minGPT and see how they differ.

x-transformers

A simple but complete full-attention transformer with a set of promising experimental features from various papers (by lucidrains)
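x-transformers wraps a standard encoder/decoder stack and exposes its paper-derived options as keyword arguments. Below is a minimal sketch of a decoder-only (GPT-style) model, based on the usage pattern shown in the x-transformers README; the dimensions are illustrative and exact argument defaults may vary between versions.

```python
import torch
from x_transformers import TransformerWrapper, Decoder

# Decoder-only language model; sizes here are placeholders, not recommendations.
model = TransformerWrapper(
    num_tokens = 20000,        # vocabulary size
    max_seq_len = 1024,        # maximum sequence length
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        rotary_pos_emb = True  # one of the experimental options the library exposes
    )
)

tokens = torch.randint(0, 20000, (1, 1024))
logits = model(tokens)         # shape: (1, 1024, 20000)
```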

minGPT

A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training (by karpathy)
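minGPT keeps the model definition, trainer, and BPE tokenizer in a handful of small files. A minimal sketch of instantiating a GPT-2-sized model, following the configuration style shown in the minGPT README (field names are from that README; treat the values as placeholders):

```python
from mingpt.model import GPT

# Build a config object, then construct the model from it.
model_config = GPT.get_default_config()
model_config.model_type = 'gpt2'
model_config.vocab_size = 50257   # OpenAI BPE vocabulary size
model_config.block_size = 1024    # maximum context length
model = GPT(model_config)
```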
                 x-transformers   minGPT
Mentions         10               35
Stars            4,147            18,875
Growth           -                -
Activity         8.7              0.0
Latest commit    3 days ago       4 days ago
Language         Python           Python
License          MIT License      MIT License
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

x-transformers

Posts with mentions or reviews of x-transformers. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-12-26.

minGPT

Posts with mentions or reviews of minGPT. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-29.

What are some alternatives?

When comparing x-transformers and minGPT, you can also consider the following projects:

EasyOCR - Ready-to-use OCR with 80+ supported languages and all popular writing scripts, including Latin, Chinese, Arabic, Devanagari, Cyrillic, etc.

nanoGPT - The simplest, fastest repository for training/finetuning medium-sized GPTs.

TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification

gpt-2 - Code for the paper "Language Models are Unsupervised Multitask Learners"

flamingo-pytorch - Implementation of 🦩 Flamingo, state-of-the-art few-shot visual question answering attention net out of Deepmind, in Pytorch

simpletransformers - Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI

DALLE-pytorch - Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch

Pytorch-Simple-Transformer - A simple transformer implementation without difficult syntax and extra bells and whistles.

memory-efficient-attention-pytorch - Implementation of a memory efficient multi-head attention as proposed in the paper, "Self-attention Does Not Need O(n²) Memory"

nn-zero-to-hero - Neural Networks: Zero to Hero

performer-pytorch - An implementation of Performer, a linear attention-based transformer, in Pytorch

huggingface_hub - The official Python client for the Hugging Face Hub.