gateloop-transformer
Implementation of GateLoop Transformer in Pytorch and Jax (by lucidrains)
liquid_time_constant_networks
Code Repository for Liquid Time-Constant Networks (LTCs) (by raminmh)
| | gateloop-transformer | liquid_time_constant_networks |
|---|---|---|
| Mentions | 1 | 7 |
| Stars | 82 | 1,300 |
| Growth | - | - |
| Activity | 8.9 | 1.4 |
| Last commit | 5 months ago | 12 days ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Mentions is the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
gateloop-transformer
Posts with mentions or reviews of gateloop-transformer. We have used some of these posts to build our list of alternatives and similar projects.
- GateLoop: Data-Controlled Linear Recurrence for Sequence Modeling
  Lucidrains has a re-implementation of this (https://github.com/lucidrains/gateloop-transformer) but was unable to beat the transformer baseline at an equal number of parameters.
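The "data-controlled linear recurrence" at the heart of GateLoop can be sketched as a gated state-space scan: the state is a matrix updated by a rank-1 outer product of keys and values, decayed by an input-dependent gate. The sketch below is a simplified real-valued version (the paper uses complex-valued gates); all names are illustrative, not the repo's API.

```python
import numpy as np

def gateloop_scan(q, k, v, a):
    """Simplified real-valued GateLoop-style recurrence (a sketch, not the
    paper's exact complex-valued formulation).

    q, k: (T, d_k); v: (T, d_v); a: (T, d_k) data-controlled forget gates
    in (0, 1). The matrix state S follows S_t = a_t * S_{t-1} + k_t v_t^T,
    with output y_t = q_t @ S_t.
    """
    T, d_k = k.shape
    d_v = v.shape[1]
    S = np.zeros((d_k, d_v))
    ys = np.empty((T, d_v))
    for t in range(T):
        # gate decays each state row, then a rank-1 update writes the new pair
        S = a[t][:, None] * S + np.outer(k[t], v[t])
        ys[t] = q[t] @ S
    return ys
```

With all gates fixed at 1 this reduces to causal linear attention, y_t = Σ_{s≤t} (q_t·k_s) v_s; the data-controlled gate is what distinguishes GateLoop from that baseline.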
liquid_time_constant_networks
Posts with mentions or reviews of liquid_time_constant_networks. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-07-10.
- Liquid AI, a new MIT spinoff, wants to build a new type of AI
- Code Repository for Liquid Time-Constant Networks (LTCs)
- LNNs - Liquid Neural Networks: Seeking general advice, papers, implementations
  Here's the paper for anyone who wants to save some googling: Liquid Time-constant Networks, https://arxiv.org/abs/2006.04439
- MIT solved a century-old differential equation to break 'liquid' AI's computational bottleneck
  Liquid neural networks paper
- Neural networks know what they're doing
- Could someone explain to me how Ramin Hasani derived the Fused Euler method?
  Pics are from Liquid Time-Constant Networks
- [R] Liquid Time-constant Networks: "..Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates.."
  Abstract: We introduce a new class of time-continuous recurrent neural network models. Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates. The resulting models represent dynamical systems with varying (i.e., liquid) time-constants coupled to their hidden state, with outputs being computed by numerical differential equation solvers. These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural ordinary differential equations, and give rise to improved performance on time-series prediction tasks. To demonstrate these properties, we first take a theoretical approach to find bounds over their dynamics and compute their expressive power by the trajectory length measure in latent trajectory space. We then conduct a series of time-series prediction experiments to manifest the approximation capability of Liquid Time-Constant Networks (LTCs) compared to classical and modern RNNs. Code and data are available at this https URL
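The fused Euler method asked about above is the LTC paper's semi-implicit update: the decay terms are treated implicitly, which moves them into the denominator and keeps the hidden state stable and bounded, as the abstract claims. A minimal single-cell sketch, assuming a sigmoid nonlinearity f(x, I) and illustrative parameter names (W, U, b, tau, A) rather than the repo's exact API:

```python
import numpy as np

def ltc_fused_euler_step(x, I, W, U, b, tau, A, dt=0.1):
    """One fused (semi-implicit) Euler step of a Liquid Time-Constant cell.

    Sketch of the LTC update
        x_{t+dt} = (x_t + dt * f(x_t, I_t) * A) / (1 + dt * (1/tau + f(x_t, I_t)))
    for the ODE dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A, where f is a
    bounded nonlinearity. Treating the decay implicitly puts it in the
    denominator, so the state cannot blow up even for stiff dynamics.
    """
    f = 1.0 / (1.0 + np.exp(-(W @ x + U @ I + b)))  # sigmoid gate f(x, I) in (0, 1)
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))
```

Because f is bounded in (0, 1) and the denominator always exceeds 1, iterating this step with arbitrary inputs keeps the hidden state finite, unlike an explicit Euler step on the same ODE.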
What are some alternatives?
When comparing gateloop-transformer and liquid_time_constant_networks you can also consider the following projects:
DALLE2-pytorch - Implementation of DALL-E 2, OpenAI's updated text-to-image synthesis neural network, in Pytorch
CfC - Closed-form Continuous-time Neural Networks
pytorch-lightning - The lightweight PyTorch wrapper for high-performance AI research. Scale your models, not the boilerplate. [Moved to: https://github.com/PyTorchLightning/pytorch-lightning]
sru - Training RNNs as Fast as CNNs (https://arxiv.org/abs/1709.02755)
spaCy - 💫 Industrial-strength Natural Language Processing (NLP) in Python
pytorch-lightning - Build high-performance AI models with PyTorch Lightning (organized PyTorch). Deploy models with Lightning Apps (organized Python to build end-to-end ML systems). [Moved to: https://github.com/Lightning-AI/lightning]