AdasOptimizer

ADAS is short for Adaptive Step Size. Unlike optimizers that merely normalize the derivative, it fine-tunes the step size itself during training, aiming to make step-size scheduling obsolete and to achieve state-of-the-art training performance.
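
The repository's exact update rule is best read from its source; as a hedged illustration of the general idea (adapting the step size online from the optimization trajectory rather than merely normalizing the gradient), the sketch below uses hypergradient descent (Baydin et al., 2018), a related but distinct technique. The names hypergradient_sgd, grad_fn, and hyper_lr are hypothetical and are not taken from the AdasOptimizer repo.

```python
# Illustrative sketch only: this is NOT the algorithm from AdasOptimizer.
# It shows one way a step size can be tuned online via a hypergradient,
# instead of being fixed by a hand-made schedule.
import numpy as np

def hypergradient_sgd(grad_fn, theta, lr=0.01, hyper_lr=1e-4, steps=100):
    """Plain SGD whose step size `lr` is itself updated each iteration.

    grad_fn: function mapping parameters -> gradient (both np.ndarray).
    """
    prev_grad = np.zeros_like(theta)
    for _ in range(steps):
        grad = grad_fn(theta)
        # Hypergradient of the loss w.r.t. lr is -grad . prev_grad, so
        # descending it grows lr when consecutive gradients agree and
        # shrinks it when they oppose each other.
        lr += hyper_lr * np.dot(grad, prev_grad)
        theta -= lr * grad
        prev_grad = grad
    return theta, lr

# Usage: minimize f(x) = ||x||^2, whose gradient is 2x.
theta, final_lr = hypergradient_sgd(lambda x: 2 * x, np.ones(5))
print(theta, final_lr)
```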
