pytorch-optimizer vs PythonPID_Tuner

| | pytorch-optimizer | PythonPID_Tuner |
|---|---|---|
| Mentions | 3 | 3 |
| Stars | 2,946 | 6 |
| Growth | - | - |
| Activity | 3.1 | 6.1 |
| Last Commit | about 1 month ago | over 2 years ago |
| Language | Python | Python |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Posts mentioning pytorch-optimizer
- [D] Implementation: Deconvolutional Paragraph Representation Learning
  The specific implementation is from [here](https://github.com/jettify/pytorch-optimizer), since PyTorch doesn't include it directly.
- VQGAN+CLIP: "RAdam" from torch_optimizer could not be imported? (see the sketch after this list)
- [R] AdasOptimizer Update: on CIFAR-100 + MobileNetV2, Adas generalizes 15% better and trains 9x faster than Adam
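
Two of the mentions above come down to pulling optimizers out of the torch_optimizer package, so here is a minimal sketch of the usual usage pattern. It is not taken from the linked posts: the toy model is purely illustrative, and the RAdam fallback assumes a PyTorch version (1.10 or newer) that ships torch.optim.RAdam, since newer torch_optimizer releases deprecated their own RAdam once it landed in PyTorch core, which is a plausible cause of the import error mentioned above.

```python
import torch
import torch_optimizer  # the PyPI package behind jettify/pytorch-optimizer

model = torch.nn.Linear(10, 1)  # toy model, purely for illustration

# Optimizers from torch_optimizer follow the torch.optim constructor convention.
try:
    # Older torch_optimizer releases exposed RAdam directly.
    optimizer = torch_optimizer.RAdam(model.parameters(), lr=1e-3)
except AttributeError:
    # Fallback assumption: PyTorch >= 1.10, where RAdam is part of torch.optim.
    optimizer = torch.optim.RAdam(model.parameters(), lr=1e-3)

# The usual training-step pattern, identical to the built-in optimizers.
loss = model(torch.randn(4, 10)).pow(2).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

Because the constructor and step/zero_grad interface mirror torch.optim, swapping in a different optimizer from the package usually only changes the constructor line.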
Posts mentioning PythonPID_Tuner
What are some alternatives?
sam - SAM: Sharpness-Aware Minimization (PyTorch)
AeroVECTOR - Model rocket simulator oriented to the design and tuning of active control systems, whether in the form of TVC, active fin control, or just parachute-deployment algorithms on passively stable rockets. It can simulate non-linear actuator dynamics and has some limited software-in-the-loop capabilities. The program computes all the subsonic aerodynamic parameters of interest and integrates the 3DOF equations of motion to simulate the complete flight.
DemonRangerOptimizer - Quasi Hyperbolic Rectified DEMON Adam/Amsgrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging and decorrelated Weight Decay
pytuneOPC - PID Logger, Tuner and FOPDT Simulator using OPC-UA
VQGAN-CLIP - Just playing with getting VQGAN+CLIP running locally, rather than having to use colab.
imagenette - A smaller subset of 10 easily classified classes from Imagenet, and a little more French
simple-sam - Sharpness-Aware Minimization for Efficiently Improving Generalization
RAdam - On the Variance of the Adaptive Learning Rate and Beyond
AdasOptimizer - ADAS is short for Adaptive Step Size. Unlike other optimizers, which just normalize the derivative, it fine-tunes the step size itself, aiming to make step-size scheduling obsolete and claiming state-of-the-art training performance.