Perceiver vs conformer

| | Perceiver | conformer |
|---|---|---|
| Mentions | 7 | 1 |
| Stars | 85 | 333 |
| Growth | - | - |
| Activity | 2.6 | 3.1 |
| Latest Commit | about 3 years ago | 12 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Perceiver
- I implemented Deepmind's new Perceiver Model
- I Implemented Deepmind's Perceiver Model
- [P] I implemented DeepMind's "Perceiver" in PyTorch
Great one, I implemented the Perceiver model too in TensorFlow: https://github.com/Rishit-dagli/Perceiver
- Deepmind's New Perceiver Model
- [P] Implementing Perceiver: General perception with Iterative Attention in TensorFlow
The project: https://github.com/Rishit-dagli/Perceiver
- Perceiver, General Perception with Iterative Attention
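The posts above all reference the same core idea from "Perceiver: General Perception with Iterative Attention": a small, fixed-size latent array repeatedly cross-attends to a large input byte array, so attention cost scales with (num_latents × num_inputs) rather than quadratically in the input size. A minimal NumPy sketch of that loop, with all dimensions and weights purely illustrative (real implementations use multi-head attention, layer norms, and learned projections):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attend(q, k, v):
    # Scaled dot-product attention: (Nq, d) x (Nk, d) -> (Nq, d)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

d = 16            # feature width (illustrative)
num_latents = 8   # small latent bottleneck
num_inputs = 256  # large input byte array

latents = rng.normal(size=(num_latents, d))
inputs = rng.normal(size=(num_inputs, d))

# Random projection weights, standing in for learned ones.
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))

for _ in range(4):  # "iterative attention": repeat the cross-attend
    # Latents query the full input; cost is num_latents x num_inputs.
    latents = latents + attend(latents @ Wq, inputs @ Wk, inputs @ Wv)
    # Self-attention among the (few) latents is cheap.
    latents = latents + attend(latents, latents, latents)

print(latents.shape)  # latent array stays small regardless of input size
```

The point of the design: the latent array's size is a hyperparameter independent of the input, which is what lets the Perceiver handle images, audio, and point clouds with the same architecture.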
conformer
- [N] Conformer-1 - A state-of-the-art speech recognition model trained on 650K hours of data
Found relevant code at https://github.com/lucidrains/conformer + all code implementations here
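The linked repo implements the Conformer block from Gulati et al. (2020): a "macaron" sandwich of two half-step feed-forward modules around self-attention and a depthwise convolution module. A toy single-head NumPy sketch with random weights, omitting the GLU gating, batch norm, and relative positional encoding that real implementations include:

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 32, 16  # sequence length and feature width (illustrative)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def layer_norm(x, eps=1e-5):
    return (x - x.mean(-1, keepdims=True)) / np.sqrt(x.var(-1, keepdims=True) + eps)

def ffn(x, W1, W2):
    return np.maximum(x @ W1, 0.0) @ W2  # ReLU feed-forward

def self_attention(x):
    scores = x @ x.T / np.sqrt(x.shape[-1])
    return softmax(scores) @ x

def depthwise_conv(x, kernel):
    # One 1-D kernel per channel, 'same' padding.
    k = kernel.shape[0]
    xp = np.pad(x, ((k // 2, k // 2), (0, 0)))
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        out[t] = (xp[t:t + k] * kernel).sum(axis=0)
    return out

# Random weights, standing in for learned ones.
W1 = rng.normal(size=(d, 4 * d)) * 0.1
W2 = rng.normal(size=(4 * d, d)) * 0.1
kernel = rng.normal(size=(7, d)) * 0.1  # depthwise kernel, size 7 per channel

x = rng.normal(size=(T, d))
x = x + 0.5 * ffn(layer_norm(x), W1, W2)       # first half-step FFN
x = x + self_attention(layer_norm(x))          # attention (global context)
x = x + depthwise_conv(layer_norm(x), kernel)  # convolution (local context)
x = x + 0.5 * ffn(layer_norm(x), W1, W2)       # second half-step FFN
y = layer_norm(x)                              # final layer norm

print(y.shape)
```

The convolution module is what distinguishes the Conformer from a plain Transformer encoder for speech: attention captures global context while the depthwise convolution captures local acoustic patterns.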
What are some alternatives?
Swin-Transformer-Object-Detection - This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" on Object Detection and Instance Segmentation.
routing-transformer - Fully featured implementation of Routing Transformer
performer-pytorch - An implementation of Performer, a linear attention-based transformer, in Pytorch
enformer-pytorch - Implementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch
Fast-Transformer - An implementation of Fastformer: Additive Attention Can Be All You Need, a Transformer Variant in TensorFlow
tab-transformer-pytorch - Implementation of TabTransformer, attention network for tabular data, in Pytorch
TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
uformer-pytorch - Implementation of Uformer, Attention-based Unet, in Pytorch
gato - Unofficial Gato: A Generalist Agent
Nystromformer - An implementation of the Nyströmformer, using Nystrom method to approximate standard self attention
deepmind-perceiver - My implementation of DeepMind's Perceiver
mixture-of-experts - A Pytorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models