For those who are interested, I have a Flax implementation of MADGRAD in flaxOptimizers (here). The optimizer is solid and a refreshing departure from Adam-derived optimizers. One big caveat, however: you will need to retune your hyperparameters, as they are likely to be orders of magnitude different from Adam's values.
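To give a sense of why the learning rate doesn't transfer from Adam, here is a rough JAX sketch of the MADGRAD update as described in the paper (dual averaging with a cube-root denominator instead of Adam's square root). This is illustrative code of my own, not the flaxOptimizers API; the function names and the lr=1e-2 default are placeholders.

```python
import jax
import jax.numpy as jnp

def madgrad_init(params):
    # State: initial params x0, gradient sum s, squared-gradient sum v, step count.
    zeros = jax.tree_util.tree_map(jnp.zeros_like, params)
    return dict(x0=params, s=zeros, v=zeros, step=0)

def madgrad_update(params, grads, state, lr=1e-2, momentum=0.9, eps=1e-6):
    # lr=1e-2 is a placeholder default; MADGRAD's good values sit far from Adam's ~1e-3.
    k = state["step"]
    lam = lr * jnp.sqrt(k + 1.0)  # dual-averaging weight grows like sqrt(k)
    ck = 1.0 - momentum
    s = jax.tree_util.tree_map(lambda s, g: s + lam * g, state["s"], grads)
    v = jax.tree_util.tree_map(lambda v, g: v + lam * g * g, state["v"], grads)

    def step(x, x0, s, v):
        # Cube-root denominator is the key difference from Adam's sqrt.
        z = x0 - s / (jnp.cbrt(v) + eps)
        # Momentum as an average between the current point and the dual iterate.
        return (1.0 - ck) * x + ck * z

    new_params = jax.tree_util.tree_map(step, params, state["x0"], s, v)
    return new_params, dict(x0=state["x0"], s=s, v=v, step=k + 1)
```

Because the effective step involves sqrt(k + 1) and a cube root of accumulated squared gradients, a learning rate that works for Adam can be wildly off here, which is why retuning is essential.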
Related posts
- Recognize Digits Using ML in Elixir
- [D] How to contribute to open source ML and DL without having access to high quality setup?
- Weekly updated open sourced model implementations in Flax
- Weekly updated open sourced deep learning model implementations in Flax