| | Perceiver | Fast-Transformer |
|---|---|---|
| Mentions | 7 | 4 |
| Stars | 86 | 148 |
| Growth | - | - |
| Activity | 2.6 | 0.0 |
| Latest commit | over 3 years ago | over 2 years ago |
| Language | Python | Jupyter Notebook |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Perceiver
- I implemented Deepmind's new Perceiver Model
- I Implemented Deepmind's Perceiver Model
[P] I implemented DeepMind's "Perceiver" in PyTorch
Great one, I implemented the Perceiver model too in TensorFlow: https://github.com/Rishit-dagli/Perceiver
- Deepmind's New Perceiver Model
[P] Implementing Perceiver: General perception with Iterative Attention in TensorFlow
The project: https://github.com/Rishit-dagli/Perceiver
- Perceiver, General Perception with Iterative Attention
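The posts above all implement DeepMind's Perceiver, whose core idea is that a small, fixed-size latent array cross-attends over a large input array, so attention cost scales linearly in the input length rather than quadratically. A minimal, untrained NumPy sketch of that single cross-attention step (the linked repos use TensorFlow/PyTorch with learned projection matrices; `softmax` and `cross_attention` here are simplified stand-ins):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(latents, inputs):
    """N latents attend over M input 'bytes': cost is O(N*M), not O(M*M).

    In the real model, queries come from the latents and keys/values from
    the inputs via learned linear maps; here we use the raw arrays.
    """
    scores = latents @ inputs.T / np.sqrt(latents.shape[-1])
    weights = softmax(scores, axis=-1)   # each latent's attention over inputs
    return weights @ inputs              # latents updated with input content

rng = np.random.default_rng(0)
latents = rng.normal(size=(8, 16))       # N=8 latents, dimension d=16
inputs = rng.normal(size=(1024, 16))     # M=1024 input elements
out = cross_attention(latents, inputs)
print(out.shape)  # (8, 16): output size is fixed by the latent array
```

Because the output keeps the latent array's shape regardless of input length, the Perceiver can stack many such layers over very long byte arrays (images, audio, point clouds) without quadratic blow-up.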
Fast-Transformer
What are some alternatives?
- Swin-Transformer-Object-Detection - This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" on Object Detection and Instance Segmentation.
- reformer-pytorch - Reformer, the efficient Transformer, in Pytorch
- performer-pytorch - An implementation of Performer, a linear attention-based transformer, in Pytorch
- Conformer - An implementation of Conformer: Convolution-augmented Transformer for Speech Recognition, a Transformer variant in TensorFlow/Keras
- TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
- Transformer-in-Transformer - An implementation of Transformer in Transformer in TensorFlow for image classification, attention inside local patches
- deepmind-perceiver - My implementation of DeepMind's Perceiver
- gato - Unofficial Gato: A Generalist Agent
- machine-learning-experiments - 🤖 Interactive Machine Learning experiments: 🏋️ model training + 🎨 model demos
- conformer - Implementation of the convolutional module from the Conformer paper, for use in Transformers
- embedding-encoder - Scikit-Learn compatible transformer that turns categorical variables into dense entity embeddings.