- deep-implicit-attention VS performer-pytorch
- deep-implicit-attention VS soundstorm-pytorch
- deep-implicit-attention VS TimeSformer-pytorch
- deep-implicit-attention VS spin-model-transformers
- deep-implicit-attention VS afem
- deep-implicit-attention VS x-transformers
- deep-implicit-attention VS DALLE-pytorch
- deep-implicit-attention VS vit-pytorch
Deep-implicit-attention Alternatives
Similar projects and alternatives to deep-implicit-attention
- performer-pytorch
- soundstorm-pytorch: Implementation of SoundStorm, Efficient Parallel Audio Generation from Google Deepmind, in Pytorch
- TimeSformer-pytorch: Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
- spin-model-transformers: Physics-inspired transformer modules based on mean-field dynamics of vector-spin models in JAX
- afem
- x-transformers: A concise but complete full-attention transformer with a set of promising experimental features from various papers (see the usage sketch after this list)
- DALLE-pytorch: Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
- vit-pytorch: Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch (see the usage sketch below)
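As a quick illustration of the vit-pytorch entry, here is a usage sketch following the pattern in that project's README; the constructor arguments shown are illustrative and may differ across library versions.

```python
import torch
from vit_pytorch import ViT

# Build a small Vision Transformer classifier; hyperparameters are
# illustrative, not tuned.
model = ViT(
    image_size = 256,   # input images are 256 x 256
    patch_size = 32,    # split into 32 x 32 patches
    num_classes = 1000,
    dim = 1024,
    depth = 6,
    heads = 16,
    mlp_dim = 2048,
)

img = torch.randn(1, 3, 256, 256)   # dummy batch of one RGB image
logits = model(img)                 # shape: (1, 1000)
```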
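Similarly, a minimal sketch for the x-transformers entry, based on the wrapper-plus-decoder pattern shown in that project's README; argument names may vary between releases.

```python
import torch
from x_transformers import TransformerWrapper, Decoder

# A small decoder-only language model over a 20k-token vocabulary.
model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8
    )
)

tokens = torch.randint(0, 20000, (1, 1024))  # dummy token ids
logits = model(tokens)                       # shape: (1, 1024, 20000)
```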
deep-implicit-attention discussion
deep-implicit-attention reviews and mentions
- [P] Deep Implicit Attention: A Mean-Field Theory Perspective on Attention Mechanisms
  Code: https://github.com/mcbal/deep-implicit-attention
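The post above frames attention as the solution of self-consistent mean-field equations. The snippet below is a minimal, illustrative sketch of that idea written for this summary only; it is not the API of the linked repository, and the class and argument names (ImplicitAttentionSketch, num_iters) are assumptions. It computes the module output as the fixed point of an attention-like update via naive iteration, whereas the actual project relies on equilibrium solvers and implicit differentiation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ImplicitAttentionSketch(nn.Module):
    """Illustrative only: output z solves z = f(z, x) for an attention-like f."""

    def __init__(self, dim, num_iters=20):
        super().__init__()
        self.to_q = nn.Linear(dim, dim, bias=False)
        self.to_k = nn.Linear(dim, dim, bias=False)
        self.to_v = nn.Linear(dim, dim, bias=False)
        self.num_iters = num_iters
        self.scale = dim ** -0.5

    def update(self, z, x):
        # One mean-field-style update: the current estimate z attends over the
        # inputs x, and the result is mixed back with the external input.
        q, k, v = self.to_q(z), self.to_k(x), self.to_v(x)
        attn = F.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        return torch.tanh(attn @ v + x)

    def forward(self, x):
        # Naive fixed-point iteration; a real implicit/equilibrium model would
        # use a root-finding solver and implicit gradients instead.
        z = torch.zeros_like(x)
        for _ in range(self.num_iters):
            z = self.update(z, x)
        return z

x = torch.randn(1, 16, 64)            # (batch, tokens, dim)
out = ImplicitAttentionSketch(64)(x)  # (1, 16, 64)
```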
Stats
mcbal/deep-implicit-attention is an open-source project licensed under the MIT License, an OSI-approved license. Its primary programming language is Python.