Fast-Transformer vs Perceiver

| | Fast-Transformer | Perceiver |
|---|---|---|
| Mentions | 4 | 7 |
| Stars | 150 | 86 |
| Growth | - | - |
| Activity | 0.0 | 2.6 |
| Latest commit | almost 3 years ago | over 3 years ago |
| Language | Jupyter Notebook | Python |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Fast-Transformer
Perceiver
- I implemented Deepmind's new Perceiver Model
- I Implemented Deepmind's Perceiver Model
- [P] I implemented DeepMind's "Perceiver" in PyTorch
  "Great one, I implemented the Perceiver model too in TensorFlow: https://github.com/Rishit-dagli/Perceiver"
- Deepmind's New Perceiver Model
- [P] Implementing Perceiver: General perception with Iterative Attention in TensorFlow
  "The project: https://github.com/Rishit-dagli/Perceiver"
- Perceiver, General Perception with Iterative Attention
What are some alternatives?
reformer-pytorch - Reformer, the efficient Transformer, in Pytorch
Swin-Transformer-Object-Detection - This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" on Object Detection and Instance Segmentation.
Conformer - An implementation of Conformer: Convolution-augmented Transformer for Speech Recognition, a Transformer Variant in TensorFlow/Keras
performer-pytorch - An implementation of Performer, a linear attention-based transformer, in Pytorch
TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
deepmind-perceiver - My implementation of DeepMind's Perceiver
Transformer-in-Transformer - An Implementation of Transformer in Transformer in TensorFlow for image classification, attention inside local patches
machine-learning-experiments - 🤖 Interactive Machine Learning experiments: 🏋️models training + 🎨models demo
gato - Unofficial Gato: A Generalist Agent
language-planner - Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
conformer - Implementation of the convolutional module from the Conformer paper, for use in Transformers