Transformer-in-Transformer vs Fast-Transformer

Compare Transformer-in-Transformer and Fast-Transformer to see how they differ.

Transformer-in-Transformer

A TensorFlow implementation of Transformer in Transformer (TNT) for image classification, which applies attention inside local patches (by Rishit-dagli)
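To make the "attention inside local patches" idea concrete, here is a minimal TensorFlow sketch of the TNT mechanism. This is an illustration of the concept, not the repo's actual API; all shapes and layer names below are assumptions chosen for the example. An inner transformer block attends among pixel-level sub-tokens within each patch, and its output is folded back into the outer, patch-level attention.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical sizes for the sketch.
batch, num_patches, pixels_per_patch = 8, 16, 16
pixel_dim, patch_dim = 24, 96

# Hypothetical inputs: pixel embeddings inside each patch, plus patch embeddings.
pixel_tokens = tf.random.normal((batch, num_patches, pixels_per_patch, pixel_dim))
patch_tokens = tf.random.normal((batch, num_patches, patch_dim))

# Inner attention: runs independently inside each local patch.
inner_attn = layers.MultiHeadAttention(num_heads=4, key_dim=pixel_dim)
flat = tf.reshape(pixel_tokens, (-1, pixels_per_patch, pixel_dim))  # (B*P, p, d)
inner_out = inner_attn(flat, flat)                                  # attention within a patch
inner_out = tf.reshape(inner_out, (batch, num_patches, pixels_per_patch, pixel_dim))

# Fold the inner result back into the patch embedding, then attend across patches.
patch_update = layers.Dense(patch_dim)(
    tf.reshape(inner_out, (batch, num_patches, pixels_per_patch * pixel_dim)))
outer_attn = layers.MultiHeadAttention(num_heads=4, key_dim=patch_dim)
patch_tokens = outer_attn(patch_tokens + patch_update, patch_tokens + patch_update)
print(patch_tokens.shape)  # (8, 16, 96)
```

The design point is that the inner attention captures fine-grained structure within a patch that a plain Vision Transformer would flatten away, while the outer attention still models global relations between patches.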

Fast-Transformer

A TensorFlow implementation of the Transformer variant from Fastformer: Additive Attention Can Be All You Need (by Rishit-dagli)
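Fastformer replaces pairwise self-attention with additive attention: a learned scorer pools all queries into a single global query, which is mixed element-wise into the keys to produce a global key, so the cost is linear rather than quadratic in sequence length. Here is a minimal single-head TensorFlow sketch of that idea; it illustrates the mechanism and is not the repo's actual API, and the layer names are assumptions for the example.

```python
import tensorflow as tf
from tensorflow.keras import layers

batch, seq_len, dim = 8, 128, 64
x = tf.random.normal((batch, seq_len, dim))

to_q, to_k, to_v = layers.Dense(dim), layers.Dense(dim), layers.Dense(dim)
w_q, w_k = layers.Dense(1), layers.Dense(1)  # additive-attention scorers
out_proj = layers.Dense(dim)

q, k, v = to_q(x), to_k(x), to_v(x)

# Global query: additive attention pools all query vectors into one.
alpha = tf.nn.softmax(w_q(q) / dim ** 0.5, axis=1)          # (B, N, 1)
global_q = tf.reduce_sum(alpha * q, axis=1, keepdims=True)  # (B, 1, D)

# Mix the global query into every key, then pool into a global key.
p = global_q * k
beta = tf.nn.softmax(w_k(p) / dim ** 0.5, axis=1)
global_k = tf.reduce_sum(beta * p, axis=1, keepdims=True)   # (B, 1, D)

# Element-wise interaction with the values, plus a query residual.
r = out_proj(global_k * v) + q
print(r.shape)  # (8, 128, 64)
```

Every step is a pooling or element-wise product over the sequence, so the whole block runs in O(N·d) time, which is the point of the "additive attention" formulation.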
             Transformer-in-Transformer   Fast-Transformer
Mentions     4                            4
Stars        41                           146
Growth       -                            -
Activity     0.0                          0.0
Last commit  about 2 years ago            about 2 years ago
Language     Jupyter Notebook             Jupyter Notebook
License      Apache License 2.0           Apache License 2.0
The number of mentions indicates the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative measure of how actively a project is being developed; recent commits are weighted more heavily than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

Transformer-in-Transformer

Posts with mentions or reviews of Transformer-in-Transformer. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-12-06.

Fast-Transformer

Posts with mentions or reviews of Fast-Transformer. We have used some of these posts to build our list of alternatives and similar projects.

What are some alternatives?

When comparing Transformer-in-Transformer and Fast-Transformer you can also consider the following projects:

poolformer - PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)

reformer-pytorch - Reformer, the efficient Transformer, in PyTorch

LongNet - Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"

Perceiver - Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow

AvatarGAN - Generate Cartoon Images using Generative Adversarial Network

Conformer - An implementation of Conformer: Convolution-augmented Transformer for Speech Recognition, a Transformer Variant in TensorFlow/Keras

swarms - Orchestrate swarms of agents from any framework (OpenAI, LangChain, etc.) for real-world workflow automation. Join our community: https://discord.gg/DbjBMJTSWD

machine-learning-experiments - 🤖 Interactive Machine Learning experiments: 🏋️models training + 🎨models demo

principia - The Principia Rewrite

TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification

planckforth - Bootstrapping a Forth interpreter from hand-written tiny ELF binary. Just for fun.

embedding-encoder - Scikit-Learn compatible transformer that turns categorical variables into dense entity embeddings.