Deep-implicit-attention Alternatives
Similar projects and alternatives to deep-implicit-attention
- TimeSformer-pytorch — Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
- soundstorm-pytorch — Implementation of SoundStorm, efficient parallel audio generation from Google DeepMind, in PyTorch
- x-transformers — A simple but complete full-attention transformer with a set of promising experimental features from various papers
- DALLE-pytorch — Implementation/replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch
- spin-model-transformers — Physics-inspired transformer modules based on mean-field dynamics of vector-spin models, in JAX
deep-implicit-attention reviews and mentions
- [P] Deep Implicit Attention: A Mean-Field Theory Perspective on Attention Mechanisms. Code: https://github.com/mcbal/deep-implicit-attention
Stats
mcbal/deep-implicit-attention is an open-source project licensed under the MIT License, an OSI-approved license.
The primary programming language of deep-implicit-attention is Python.