AdasOptimizer vs DemonRangerOptimizer

Compare AdasOptimizer vs DemonRangerOptimizer and see what their differences are.

AdasOptimizer

ADAS is short for Adaptive Step Size. Unlike other optimizers that just normalize the derivative, it fine-tunes the step size itself, making step-size scheduling obsolete and achieving state-of-the-art training performance (by YanaiEliyahu)
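The description above contrasts normalizing the gradient (as Adam-style methods do) with adapting the step size directly. As a rough illustration of that family of ideas - not the actual Adas algorithm, whose details are in the repository - here is a classic sign-agreement scheme in the delta-bar-delta/Rprop style, where each parameter's step size grows when successive gradients agree in sign and shrinks when they disagree. All names and constants below are illustrative assumptions.

    import numpy as np

    def adaptive_step_size_sgd(grad_fn, w, lr=1e-2, grow=1.2, shrink=0.5, steps=100):
        # Illustrative per-parameter adaptive step sizes (delta-bar-delta / Rprop style),
        # NOT the Adas algorithm: each step size grows where successive gradients agree
        # in sign and shrinks where they disagree, instead of following a schedule.
        step = np.full_like(w, lr)
        prev_g = np.zeros_like(w)
        for _ in range(steps):
            g = grad_fn(w)
            agree = np.sign(g) == np.sign(prev_g)
            step = np.where(agree, step * grow, step * shrink)
            w = w - step * np.sign(g)   # Rprop-style update uses only the gradient's sign
            prev_g = g
        return w

    # Toy usage: minimize f(w) = ||w||^2 / 2, whose gradient is simply w.
    w_final = adaptive_step_size_sgd(lambda w: w, np.array([3.0, -2.0]))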

DemonRangerOptimizer

Quasi Hyperbolic Rectified DEMON Adam/Amsgrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging and decorrelated Weight Decay (by JRC1995)
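That description packs several techniques into one optimizer. Two of the simpler ones to illustrate are the DEMON decaying-momentum schedule and gradient centralization; the sketch below shows them in isolation with NumPy. The function and variable names are mine, and this is a simplified illustration rather than the repository's implementation.

    import numpy as np

    def demon_beta(t, total_steps, beta_init=0.9):
        # DEMON: decay the momentum parameter from beta_init toward 0 over training.
        frac = 1.0 - t / total_steps
        return beta_init * frac / ((1.0 - beta_init) + beta_init * frac)

    def centralize_gradient(grad):
        # Gradient centralization: subtract the mean over all axes except the output
        # axis; applied only to weights with more than one dimension.
        if grad.ndim > 1:
            return grad - grad.mean(axis=tuple(range(1, grad.ndim)), keepdims=True)
        return grad

    # Toy SGD-with-momentum loop using both pieces.
    rng = np.random.default_rng(0)
    w = rng.normal(size=(4, 3))
    m = np.zeros_like(w)
    total_steps = 100
    for t in range(total_steps):
        g = centralize_gradient(w)          # pretend the loss is ||w||^2 / 2, so grad = w
        beta = demon_beta(t, total_steps)
        m = beta * m + g
        w = w - 0.05 * m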
                   AdasOptimizer        DemonRangerOptimizer
Mentions           2                    1
Stars              85                   23
Growth             -                    -
Activity           5.8                  0.0
Latest commit      over 3 years ago     over 3 years ago
Language           C++                  Python
License            MIT License          -
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

AdasOptimizer

Posts with mentions or reviews of AdasOptimizer. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-01-15.

DemonRangerOptimizer

Posts with mentions or reviews of DemonRangerOptimizer. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-01-15.
  • [R] AdasOptimizer Update: Cifar-100+MobileNetV2 Adas generalizes with Adas 15% better and 9x faster than Adam
    4 projects | /r/MachineLearning | 15 Jan 2021
    The results are interesting, but in terms of novelty of the main theory - isn't it almost identical to Baydin et al.? https://arxiv.org/pdf/1703.04782.pdf It seems the difference may be in some implementation details, like using a running average for the past gradient. If it's useful, I implemented a bunch of optimizers with options to synergize different techniques (https://github.com/JRC1995/DemonRangerOptimizer), including hypergradient updates for various things (taking into account decorrelated weight decay and per-parameter lrs for the hypergradient lr), when I was bored, before practically abandoning it altogether. I didn't really run any experiments with it myself; some people tried, though they may not have gotten any particularly striking results.
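For reference, the hypergradient-descent idea from Baydin et al. mentioned in that comment adapts the learning rate online by gradient descent on the learning rate itself; for plain SGD this reduces to alpha_t = alpha_{t-1} + beta * (g_t . g_{t-1}), the dot product of consecutive gradients. A minimal sketch for the SGD case follows; the names are illustrative and this is not the DemonRangerOptimizer code.

    import numpy as np

    def hypergradient_sgd(grad_fn, w, alpha=1e-2, beta=1e-4, steps=200):
        # Hypergradient descent (Baydin et al., 2017) for plain SGD: the learning
        # rate alpha is itself updated by gradient descent, using the dot product
        # of the current and previous gradients as the (negative) hypergradient.
        prev_g = np.zeros_like(w)
        for _ in range(steps):
            g = grad_fn(w)
            alpha = alpha + beta * float(np.dot(g.ravel(), prev_g.ravel()))
            w = w - alpha * g
            prev_g = g
        return w, alpha

    # Toy usage on f(w) = ||w||^2 / 2, whose gradient is w.
    w_final, lr_final = hypergradient_sgd(lambda w: w, np.array([3.0, -2.0]))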

What are some alternatives?

When comparing AdasOptimizer and DemonRangerOptimizer you can also consider the following projects:

ML-Optimizers-JAX - Toy implementations of some popular ML optimizers using Python/JAX

pytorch-optimizer - torch-optimizer -- collection of optimizers for Pytorch

imagenette - A smaller subset of 10 easily classified classes from Imagenet, and a little more French

tensorflow - An Open Source Machine Learning Framework for Everyone

Gradient-Centralization-TensorFlow - Instantly improve your training performance of TensorFlow models with just 2 lines of code!

Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration

RAdam - On the Variance of the Adaptive Learning Rate and Beyond

Caffe - Caffe: a fast open framework for deep learning.