sru vs liquid_time_constant_networks

| | sru | liquid_time_constant_networks |
|---|---|---|
| Mentions | 1 | 7 |
| Stars | 2,098 | 1,277 |
| Growth | 0.0% | - |
| Activity | 0.0 | 3.0 |
| Latest Commit | over 2 years ago | 9 months ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we are tracking.
sru

- [D] Language Models for Smaller Languages
  "This one? Looks like it's easy to use, which is nice!"
liquid_time_constant_networks

- Liquid AI, a new MIT spinoff, wants to build a new type of AI
- Code Repository for Liquid Time-Constant Networks (LTCs)
- LNNs - Liquid Neural Networks: Seeking general advice, papers, implementations
  "Here's the paper for anyone that wants to save some googling. Liquid Time-constant Networks: https://arxiv.org/abs/2006.04439"
- MIT solved a century-old differential equation to break 'liquid' AI's computational bottleneck
  "Liquid neural networks paper"
- Neural networks know what they're doing
- Could someone explain to me how Ramin Hasani derived the Fused Euler method?
  "Pics are from Liquid Time-Constant Networks"
- [R] Liquid Time-constant Networks: "...Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates..."
Abstract: We introduce a new class of time-continuous recurrent neural network models. Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates. The resulting models represent dynamical systems with varying (i.e., liquid) time-constants coupled to their hidden state, with outputs being computed by numerical differential equation solvers. These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural ordinary differential equations, and give rise to improved performance on time-series prediction tasks. To demonstrate these properties, we first take a theoretical approach to find bounds over their dynamics and compute their expressive power by the trajectory length measure in latent trajectory space. We then conduct a series of time-series prediction experiments to manifest the approximation capability of Liquid Time-Constant Networks (LTCs) compared to classical and modern RNNs. Code and data are available at this https URL
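The "Fused Euler" question in the thread above points at the core mechanism the abstract describes: the LTC paper folds the state-dependent (liquid) time constant into a single semi-implicit Euler update, x(t+Δt) = (x(t) + Δt·f·A) / (1 + Δt·(1/τ + f)). Below is a minimal PyTorch sketch of that update under stated assumptions; the class name, layer shapes, sigmoid gate, and parameter names are illustrative, not the repository's actual implementation.

```python
import torch
import torch.nn as nn


class LTCCellSketch(nn.Module):
    """One fused-Euler step of a liquid time-constant cell (sketch, not the reference code)."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # f(x, I): a bounded nonlinearity over input and hidden state;
        # a sigmoid over a single linear map is an assumption made for brevity.
        self.gate = nn.Linear(input_size + hidden_size, hidden_size)
        # Learned per-neuron target bias A and base time constant tau (kept positive via exp).
        self.A = nn.Parameter(torch.randn(hidden_size))
        self.log_tau = nn.Parameter(torch.zeros(hidden_size))

    def forward(self, x: torch.Tensor, h: torch.Tensor, dt: float = 0.1) -> torch.Tensor:
        # f modulates both the effective time constant and the target state,
        # which is what makes the time constant "liquid".
        f = torch.sigmoid(self.gate(torch.cat([x, h], dim=-1)))
        tau = torch.exp(self.log_tau)
        # Fused (semi-implicit) Euler step:
        #   h' = (h + dt * f * A) / (1 + dt * (1/tau + f))
        return (h + dt * f * self.A) / (1.0 + dt * (1.0 / tau + f))


# Toy usage: unroll the cell over a random sequence.
cell = LTCCellSketch(input_size=4, hidden_size=8)
h = torch.zeros(1, 8)
for x_t in torch.randn(10, 1, 4):
    h = cell(x_t, h)
print(h.shape)  # torch.Size([1, 8])
```

The denominator exceeds 1 whenever dt, tau, and f are positive, which is why this semi-implicit form yields the stable, bounded trajectories the abstract claims, where a plain explicit Euler step on the same stiff ODE could diverge.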
What are some alternatives?
text - Models, data loaders and abstractions for language processing, powered by PyTorch
CfC - Closed-form Continuous-time Neural Networks
best-of-ml-python - 🏆 A ranked list of awesome machine learning Python libraries. Updated weekly.
gateloop-transformer - Implementation of GateLoop Transformer in Pytorch and Jax
attention-is-all-you-need-pytorch - A PyTorch implementation of the Transformer model in "Attention is All You Need".
datasets - 🤗 The largest hub of ready-to-use datasets for ML models with fast, easy-to-use and efficient data manipulation tools
chicksexer - A Python package for gender classification.
allennlp - An open-source NLP research library, built on PyTorch.
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.