| | deep-implicit-attention | TimeSformer-pytorch |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 63 | 694 |
| Growth | - | - |
| Activity | 0.0 | 0.0 |
| Last commit | over 3 years ago | over 3 years ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub.
Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed, with recent commits weighted more heavily than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
deep-implicit-attention
[P] Deep Implicit Attention: A Mean-Field Theory Perspective on Attention Mechanisms
Code: https://github.com/mcbal/deep-implicit-attention
TimeSformer-pytorch
What are some alternatives?
performer-pytorch - An implementation of Performer, a linear attention-based transformer, in Pytorch
DALLE-pytorch - Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
soundstorm-pytorch - Implementation of SoundStorm, Efficient Parallel Audio Generation from Google Deepmind, in Pytorch
CoCa-pytorch - Implementation of CoCa, Contrastive Captioners are Image-Text Foundation Models, in Pytorch
spin-model-transformers - Physics-inspired transformer modules based on mean-field dynamics of vector-spin models in JAX
x-transformers - A concise but complete full-attention transformer with a set of promising experimental features from various papers
afem - Implementation of approximate free-energy minimization in PyTorch
Perceiver - Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow
perceiver-pytorch - Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch