pytorch-optimizer VS DemonRangerOptimizer

Compare pytorch-optimizer vs DemonRangerOptimizer and see how they differ.

                 pytorch-optimizer     DemonRangerOptimizer
Mentions         3                     1
Stars            2,946                 23
Growth           -                     -
Activity         3.1                   0.0
Latest commit    about 1 month ago     over 3 years ago
Language         Python                Python
License          Apache License 2.0    -
Mentions - the total number of mentions of a project that we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub.
Growth - month-over-month growth in stars.
Activity - a relative measure of how actively a project is being developed, with recent commits weighted more heavily than older ones. For example, an activity of 9.0 places a project among the top 10% of the most actively developed projects we track.
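
The exact formula behind the activity score isn't published here, but the description above suggests a recency-weighted sum over commits. A minimal illustrative sketch, assuming an exponential half-life weighting (the function name and the 30-day half-life are assumptions, not the site's actual method):

```python
from datetime import datetime, timezone

def activity_score(commit_dates, half_life_days=30.0):
    """Toy recency-weighted activity score (illustrative only).

    Each commit contributes a weight that halves every
    `half_life_days`, so recent commits dominate the total.
    The raw score would then be ranked against other projects
    to yield a relative number like 3.1 or 9.0.
    """
    now = datetime.now(timezone.utc)
    score = 0.0
    for d in commit_dates:  # timezone-aware datetimes
        age_days = (now - d).total_seconds() / 86400.0
        score += 0.5 ** (age_days / half_life_days)
    return score
```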

pytorch-optimizer

Posts with mentions or reviews of pytorch-optimizer. We have used some of these posts to build our list of alternatives and similar projects. The most recent mention was on 2021-10-28.

DemonRangerOptimizer

Posts with mentions or reviews of DemonRangerOptimizer. We have used some of these posts to build our list of alternatives and similar projects. The most recent mention was on 2021-01-15.
  • [R] AdasOptimizer Update: Cifar-100+MobileNetV2 Adas generalizes with Adas 15% better and 9x faster than Adam
    4 projects | /r/MachineLearning | 15 Jan 2021
    The results are interesting, but in terms of the novelty of the main theory, isn't it almost identical to Baydin et al.? https://arxiv.org/pdf/1703.04782.pdf It seems the difference may lie in implementation details, like using a running average for the past gradient. If it's useful: I implemented a bunch of optimizers with options to combine different techniques (https://github.com/JRC1995/DemonRangerOptimizer), including hypergradient updates (taking decoupled weight decay and per-parameter learning rates into account for the hypergradient learning rate), back when I was bored, before practically abandoning it altogether. I didn't run any experiments with it myself, but some people tried it, though they may not have gotten any particularly striking results.
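
For context on the hypergradient idea the comment refers to (Baydin et al., arXiv:1703.04782): the learning rate itself is updated online using the dot product of the current and previous gradients. Below is a minimal sketch of that core update in PyTorch; the function and parameter names are hypothetical, and this is not DemonRangerOptimizer's actual implementation:

```python
import torch

def sgd_hd_step(params, grads, prev_grads, lr, hyper_lr=1e-8):
    # Hypergradient descent (Baydin et al., arXiv:1703.04782):
    # for plain SGD, d(loss)/d(lr) = -grad_t . grad_{t-1}, so the
    # learning rate is nudged up while successive gradients agree
    # and down when they oppose each other.
    h = sum(torch.sum(g * pg).item() for g, pg in zip(grads, prev_grads))
    lr = lr + hyper_lr * h                 # online learning-rate update
    with torch.no_grad():
        for p, g in zip(params, grads):
            p.add_(g, alpha=-lr)           # standard SGD step at the new lr
    return lr
```

On the first step, prev_grads would typically be initialized to zeros so the learning rate starts at its given value; variants in the linked repo layer this update onto Adam-style optimizers rather than plain SGD.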

What are some alternatives?

When comparing pytorch-optimizer and DemonRangerOptimizer you can also consider the following projects:

sam - SAM: Sharpness-Aware Minimization (PyTorch)

ML-Optimizers-JAX - Toy implementations of some popular ML optimizers using Python/JAX

VQGAN-CLIP - Just playing with getting VQGAN+CLIP running locally, rather than having to use Colab.

AdasOptimizer - ADAS (Adaptive Step Size) is an optimizer that, rather than merely normalizing the derivative like other optimizers, fine-tunes the step size itself, aiming to make step-size scheduling obsolete and achieve state-of-the-art training performance.

imagenette - A smaller subset of 10 easily classified classes from Imagenet, and a little more French

simple-sam - Sharpness-Aware Minimization for Efficiently Improving Generalization

Gradient-Centralization-TensorFlow - Instantly improve the training performance of your TensorFlow models with just 2 lines of code!

RAdam - On the Variance of the Adaptive Learning Rate and Beyond

PythonPID_Tuner - Python PID tuner, based on a FOPDT model obtained from an open-loop process reaction curve.