curves-intersection-with-gradient-descent vs ML-Optimizers-JAX

| | curves-intersection-with-gradient-descent | ML-Optimizers-JAX |
|---|---|---|
| Mentions | 6 | 1 |
| Stars | 3 | 40 |
| Growth | - | - |
| Activity | 1.4 | 4.5 |
| Last Commit | 12 months ago | almost 3 years ago |
| Language | Python | Python |
| License | - | - |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
curves-intersection-with-gradient-descent
- Graph plotting as an optimisation problem
- I learnt how to plot graphs of equations and their intersection using gradient descent
- Learnt how to plot graphs of equations and their intersection using gradient descent
- Learnt how to plot graphs of equations and their intersections using gradient descent
- Learnt how to plot equations and their intersection using gradient descent
- Learnt that plotting points of an equation can be treated as an optimisation problem (the idea is sketched below)
Webapp link: https://share.streamlit.io/vdivakar/curves-intersection-with-gradient-descent/main/app.py
GitHub link: https://github.com/vdivakar/curves-intersection-with-gradient-descent
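The core idea is simple enough to sketch. Below is a minimal illustration (not code from the repo): each implicit curve f(x, y) = 0 is turned into a loss f(x, y)², so a point minimising the summed squared residuals of two curves lands on their intersection. The circle and line used here are assumed example curves.

```python
# Minimal sketch of curve intersection as an optimisation problem.
# Assumed example curves (not taken from the repo): a circle and a line.
import jax
import jax.numpy as jnp

def f1(p):                       # circle: x^2 + y^2 = 4
    x, y = p
    return x**2 + y**2 - 4.0

def f2(p):                       # line: y = x
    x, y = p
    return y - x

def loss(p):
    # Both residuals vanish exactly at an intersection point,
    # so minimise the sum of squared residuals.
    return f1(p)**2 + f2(p)**2

grad_loss = jax.grad(loss)

def descend(p, lr=0.05, steps=500):
    for _ in range(steps):
        p = p - lr * grad_loss(p)
    return p

# Different starting points converge to different intersection points.
for start in (jnp.array([1.0, 2.0]), jnp.array([-2.0, -1.0])):
    p = descend(start)
    print(p, loss(p))            # ~ (±sqrt(2), ±sqrt(2)), loss ~ 0
```

Plotting a single curve works the same way: minimise f(x, y)² from many random starting points and scatter-plot where they land.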
ML-Optimizers-JAX
- ML Optimizers from scratch using JAX
GitHub link (includes a link to a Kaggle notebook to run it directly): shreyansh26/ML-Optimizers-JAX
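To give a flavour of what "optimizers from scratch" looks like in JAX, here is an illustrative sketch (an assumption for illustration, not code copied from the repo): SGD with momentum written directly against jax.grad, applied to a toy linear-regression loss.

```python
# Illustrative sketch (not code from the repo): a from-scratch momentum
# optimizer in JAX, applied to a toy linear-regression loss.
import jax
import jax.numpy as jnp

# Toy data from y = 3x + 1; the optimizer should recover w ~ 3, b ~ 1.
X = jnp.linspace(-1.0, 1.0, 32)
Y = 3.0 * X + 1.0

def loss(params):
    w, b = params
    pred = w * X + b
    return jnp.mean((pred - Y) ** 2)

grad_loss = jax.grad(loss)

def momentum_step(params, velocity, lr=0.1, beta=0.9):
    g = grad_loss(params)
    velocity = beta * velocity - lr * g   # decaying average of past gradients
    return params + velocity, velocity

params = jnp.array([0.0, 0.0])
velocity = jnp.zeros_like(params)
for _ in range(200):
    params, velocity = momentum_step(params, velocity)

print(params)  # ~ [3.0, 1.0]
```

The repo itself covers several optimizers in this style; the update rule above is just the simplest representative of the pattern.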
What are some alternatives?
DiffMorph - Image morphing without reference points by applying warp maps and optimizing over them.
RAdam - On the Variance of the Adaptive Learning Rate and Beyond
Machine-Learning - A repository of my ML projects for the Machine Learning course (SC349) at Northwestern University
DemonRangerOptimizer - Quasi Hyperbolic Rectified DEMON Adam/Amsgrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging and decorrelated Weight Decay
dm-haiku - JAX-based neural network library
trax - Deep Learning with Clear Code and Speed
AdasOptimizer - ADAS (short for Adaptive Step Size) is an optimizer that, unlike optimizers that merely normalize the derivative, fine-tunes the step size itself, making step-size scheduling obsolete and achieving state-of-the-art training performance
dnn_from_scratch - A high-level deep learning library for Convolutional Neural Networks, GANs, and more, made from scratch (NumPy/CuPy implementation)
flaxOptimizers - A collection of optimizers, some arcane, others well known, for Flax.
yaglm - A Python package for penalized generalized linear models that supports fitting and model selection for structured, adaptive, and non-convex penalties.