AdasOptimizer vs ML-Optimizers-JAX

| | AdasOptimizer | ML-Optimizers-JAX |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 85 | 40 |
| Growth | - | - |
| Activity | 5.8 | 4.5 |
| Last commit | over 3 years ago | almost 3 years ago |
| Language | C++ | Python |
| License | MIT License | - |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
AdasOptimizer
[R] AdasOptimizer Update: Cifar-100+MobileNetV2, Adas generalizes 15% better and 9x faster than Adam
I think so too; I was comfortable posting this because the same code was used for each optimizer. The training script is at https://github.com/YanaiEliyahu/AdasOptimizer/blob/master/misc/cifar-100-mobilenetv2/model_with_training.py.txt. If you care to find what I did wrong, go for it.
- An optimizer that obsoletes step-size scheduling, reaching 100% accuracy on MNIST's training set in 11 epochs
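The headline claim is that the optimizer adapts its own step size, removing the need for hand-tuned schedules. As a rough illustration of that general idea only (this is hypergradient descent in the style of Baydin et al., not the actual AdasOptimizer update rule), here is a minimal JAX sketch; the names `hypergradient_sgd_step`, `loss`, and `hyper_lr` are all hypothetical for this example:

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Toy least-squares objective, just to have something to optimize.
    return jnp.mean((x @ w - y) ** 2)

grad_fn = jax.grad(loss)

def hypergradient_sgd_step(w, lr, prev_grad, x, y, hyper_lr=1e-4):
    # For plain SGD the hypergradient d(loss)/d(lr) is -<g_t, g_{t-1}>,
    # so descending on it raises lr when successive gradients agree and
    # lowers it when they oppose. NOT the AdasOptimizer rule, only the
    # generic "learn the learning rate" idea it relates to.
    g = grad_fn(w, x, y)
    lr = lr + hyper_lr * jnp.dot(g, prev_grad)  # adapt the step size
    w = w - lr * g                              # ordinary SGD step
    return w, lr, g

# Usage: recover a known weight vector without tuning a schedule.
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 4))
w_true = jnp.array([1.0, -2.0, 0.5, 3.0])
y = x @ w_true
w, lr, g = jnp.zeros(4), 1e-2, jnp.zeros(4)
for _ in range(200):
    w, lr, g = hypergradient_sgd_step(w, lr, g, x, y)
```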
ML-Optimizers-JAX
ML Optimizers from scratch using JAX
GitHub link (includes a link to a Kaggle notebook to run it directly): shreyansh26/ML-Optimizers-JAX
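For a sense of what "from scratch" means here, below is a minimal Adam step written directly in JAX, following the standard update from Kingma & Ba (2015); the function names `adam_init`/`adam_update` and the toy quadratic are illustrative and not taken from the repo:

```python
import jax
import jax.numpy as jnp

def adam_init(params):
    # First and second moment estimates start at zero.
    return jnp.zeros_like(params), jnp.zeros_like(params)

def adam_update(params, grads, m, v, t, lr=1e-3,
                beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam step: exponential moving averages of the gradient and
    # its square, bias-corrected, then a scaled parameter update.
    m = beta1 * m + (1 - beta1) * grads          # 1st moment (momentum)
    v = beta2 * v + (1 - beta2) * grads ** 2     # 2nd moment (RMS term)
    m_hat = m / (1 - beta1 ** t)                 # bias correction
    v_hat = v / (1 - beta2 ** t)
    params = params - lr * m_hat / (jnp.sqrt(v_hat) + eps)
    return params, m, v

# Usage on a toy quadratic: minimize ||w - target||^2.
target = jnp.array([1.0, -2.0, 3.0])
loss = lambda w: jnp.sum((w - target) ** 2)
w = jnp.zeros(3)
m, v = adam_init(w)
for t in range(1, 501):
    g = jax.grad(loss)(w)
    w, m, v = adam_update(w, g, m, v, t)
```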
What are some alternatives?
DemonRangerOptimizer - Quasi Hyperbolic Rectified DEMON Adam/Amsgrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging and decorrelated Weight Decay
RAdam - On the Variance of the Adaptive Learning Rate and Beyond
imagenette - A smaller subset of 10 easily classified classes from Imagenet, and a little more French
pytorch-optimizer - torch-optimizer -- collection of optimizers for Pytorch
dm-haiku - JAX-based neural network library
tensorflow - An Open Source Machine Learning Framework for Everyone
trax - Trax — Deep Learning with Clear Code and Speed
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
dnn_from_scratch - A high-level deep learning library for Convolutional Neural Networks, GANs and more, made from scratch (NumPy/CuPy implementation).