Fast-Transformer vs x-transformers

Compare Fast-Transformer and x-transformers to see how they differ.


An implementation of Fastformer: Additive Attention Can Be All You Need, a Transformer Variant in TensorFlow (by Rishit-dagli)
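The core idea of Fastformer is to replace pairwise dot-product attention with additive attention, which summarizes the queries (and then the query-modulated keys) into single global vectors, bringing the cost down from O(n²·d) to O(n·d) in sequence length. The following is a minimal single-head numpy sketch of that mechanism, not the repo's actual API; the `w_q`/`w_k` scoring vectors and the omission of the final linear projection and residual are simplifications for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def additive_attention(q, k, v, w_q, w_k):
    """Simplified single-head Fastformer-style additive attention.

    q, k, v: (n, d) query/key/value matrices
    w_q, w_k: (d,) learned scoring vectors (hypothetical names)
    Runs in O(n*d) rather than the O(n^2*d) of pairwise attention.
    """
    d = q.shape[-1]
    # Pool all queries into one global query vector via additive scores.
    alpha = softmax(q @ w_q / np.sqrt(d))   # (n,) attention over positions
    global_q = alpha @ q                    # (d,) global query
    # Modulate each key by the global query (element-wise product).
    p = global_q * k                        # (n, d)
    beta = softmax(p @ w_k / np.sqrt(d))    # (n,)
    global_k = beta @ p                     # (d,) global key
    # Modulate values by the global key to produce the output.
    return global_k * v                     # (n, d)

rng = np.random.default_rng(0)
n, d = 8, 16
out = additive_attention(rng.normal(size=(n, d)), rng.normal(size=(n, d)),
                         rng.normal(size=(n, d)),
                         rng.normal(size=d), rng.normal(size=d))
print(out.shape)  # (8, 16)
```

Because every position interacts only with the two global vectors, no n×n score matrix is ever materialized.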


A simple but complete full-attention transformer with a set of promising experimental features from various papers (by lucidrains)
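For contrast, x-transformers is built around standard full (softmax) attention, where every position attends to every other position. A minimal numpy sketch of scaled dot-product attention, the baseline that Fastformer's additive attention approximates more cheaply:

```python
import numpy as np

def full_attention(q, k, v):
    """Standard scaled dot-product (full) attention: O(n^2) in sequence length."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                    # (n, n) pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                               # (n, d) weighted values

rng = np.random.default_rng(0)
n, d = 8, 16
out = full_attention(rng.normal(size=(n, d)), rng.normal(size=(n, d)),
                     rng.normal(size=(n, d)))
print(out.shape)  # (8, 16)
```

The (n, n) weight matrix is what makes full attention quadratic in memory and compute; x-transformers layers on top of this many experimental tweaks (rotary embeddings, gating, normalization variants) from recent papers.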
                     Fast-Transformer      x-transformers
Mentions             4                     9
Stars                146                   3,612
Growth               -                     -
Activity             3.2                   8.0
Latest commit        almost 2 years ago    8 days ago
Language             Jupyter Notebook      Python
License              Apache License 2.0    MIT License
Mentions - the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.


Posts with mentions or reviews of Fast-Transformer. We have used some of these posts to build our list of alternatives and similar projects.

We haven't tracked posts mentioning Fast-Transformer yet.
Tracking mentions began in Dec 2020.


Posts with mentions or reviews of x-transformers. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-12-26.

What are some alternatives?

When comparing Fast-Transformer and x-transformers you can also consider the following projects:

EasyOCR - Ready-to-use OCR with 80+ supported languages and all popular writing scripts, including Latin, Chinese, Arabic, Devanagari, Cyrillic, and more.

reformer-pytorch - Reformer, the efficient Transformer, in PyTorch

TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification

Perceiver - Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow

flamingo-pytorch - Implementation of 🦩 Flamingo, state-of-the-art few-shot visual question answering attention net out of DeepMind, in PyTorch

DALLE-pytorch - Implementation / replication of DALL-E, OpenAI's text-to-image Transformer, in PyTorch

memory-efficient-attention-pytorch - Implementation of a memory efficient multi-head attention as proposed in the paper, "Self-attention Does Not Need O(n²) Memory"

performer-pytorch - An implementation of Performer, a linear attention-based Transformer, in PyTorch

Conformer - An implementation of Conformer: Convolution-augmented Transformer for Speech Recognition, a Transformer Variant in TensorFlow/Keras

PaLM-pytorch - Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways, in PyTorch

SpecBAS - An enhanced Sinclair BASIC interpreter for modern PCs

euporie - Jupyter notebooks in the terminal