ML-Optimizers-JAX vs flaxOptimizers
| | ML-Optimizers-JAX | flaxOptimizers |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 40 | 28 |
| Growth | - | - |
| Activity | 4.5 | 0.0 |
| Last Commit | almost 3 years ago | over 2 years ago |
| Language | Python | Python |
| License | - | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
ML-Optimizers-JAX
ML Optimizers from scratch using JAX
GitHub link (includes a link to a Kaggle notebook to run it directly) - shreyansh26/ML-Optimizers-JAX
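To give a flavor of what writing an optimizer from scratch in JAX looks like, here is a minimal, hypothetical sketch of gradient descent with momentum built directly on `jax.grad` and pytree utilities. The function and variable names (`loss_fn`, `momentum_step`) are illustrative and are not taken from the repository.

```python
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    # Toy linear-regression loss used only to drive the example.
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@jax.jit
def momentum_step(params, velocity, x, y, lr=1e-2, beta=0.9):
    grads = jax.grad(loss_fn)(params, x, y)
    # v <- beta * v + g ; p <- p - lr * v, applied leaf-wise over the pytrees.
    velocity = jax.tree_util.tree_map(lambda v, g: beta * v + g, velocity, grads)
    params = jax.tree_util.tree_map(lambda p, v: p - lr * v, params, velocity)
    return params, velocity

# Usage: initialize parameters and a zero velocity pytree, then iterate.
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (64, 3))
y = x @ jnp.array([1.0, -2.0, 0.5]) + 0.1
params = {"w": jnp.zeros(3), "b": jnp.zeros(())}
velocity = jax.tree_util.tree_map(jnp.zeros_like, params)
for _ in range(100):
    params, velocity = momentum_step(params, velocity, x, y)
```

The optimizer state (here just `velocity`) is carried explicitly as a pytree alongside the parameters, which is the pattern the repository's from-scratch implementations follow more generally.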
flaxOptimizers
[P] Implementation of MADGRAD optimization algorithm for Tensorflow
For those who are interested, I have a Flax implementation of MADGRAD in flaxOptimizers (here). The optimizer is solid and a refreshing departure from Adam-derived optimizers. One big caveat, however, is that you will need to tune your hyperparameters, as they are likely to be orders of magnitude different from Adam's values.
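For reference, here is a minimal from-scratch sketch of the MADGRAD update rule in JAX, following the published algorithm (Defazio & Jelassi, 2021). This is not the flaxOptimizers API; the `madgrad_init`/`madgrad_update` names and state layout are hypothetical.

```python
import jax
import jax.numpy as jnp

def madgrad_init(params):
    # Dual-averaging state: gradient sum, squared-gradient sum, initial params.
    zeros = jax.tree_util.tree_map(jnp.zeros_like, params)
    return {"step": 0, "s": zeros, "v": zeros, "x0": params}

def madgrad_update(params, grads, state, lr=1e-2, momentum=0.9, eps=1e-6):
    k = state["step"]
    lamb = lr * jnp.sqrt(k + 1.0)  # lambda_k = lr * sqrt(k + 1)
    s = jax.tree_util.tree_map(lambda s_, g: s_ + lamb * g, state["s"], grads)
    v = jax.tree_util.tree_map(lambda v_, g: v_ + lamb * g * g, state["v"], grads)

    def step(x, x0, s_, v_):
        z = x0 - s_ / (jnp.cbrt(v_) + eps)          # dual-averaging iterate
        return momentum * x + (1.0 - momentum) * z  # interpolate toward z

    new_params = jax.tree_util.tree_map(step, params, state["x0"], s, v)
    return new_params, {"step": k + 1, "s": s, "v": v, "x0": state["x0"]}
```

Note that the denominator uses the cube root of the accumulated squared gradients rather than the square root used by Adam, which is one reason good MADGRAD learning rates can sit far from the values that work for Adam.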
What are some alternatives?
RAdam - On the Variance of the Adaptive Learning Rate and Beyond
opytimizer - 🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
DemonRangerOptimizer - Quasi Hyperbolic Rectified DEMON Adam/Amsgrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging and decorrelated Weight Decay
dm-haiku - JAX-based neural network library
trax - Trax — Deep Learning with Clear Code and Speed
AdasOptimizer - ADAS is short for Adaptive Step Size. Unlike optimizers that only normalize the derivative, it fine-tunes the step size itself, aiming to make step-size scheduling obsolete and achieve state-of-the-art training performance.
dnn_from_scratch - A high-level deep learning library for Convolutional Neural Networks, GANs, and more, made from scratch (NumPy/CuPy implementation).
yaglm - A python package for penalized generalized linear models that supports fitting and model selection for structured, adaptive and non-convex penalties.