Gradient-Free-Optimizers vs optimization-tutorial
| | Gradient-Free-Optimizers | optimization-tutorial |
|---|---|---|
| Mentions | 11 | 1 |
| Stars | 1,108 | 17 |
| Growth | - | - |
| Activity | 5.0 | 0.0 |
| Last commit | 5 days ago | about 2 years ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Gradient-Free-Optimizers
- Show HN: Gradient-Free-Optimizers supports constrained optimization in v1.3
- Gradient-Free-Optimizers version 1.2 released
Gradient-Free-Optimizers A collection of modern optimization methods in Python
I would be very disappointed if that were the case. No, it looks like it’s set up to capture variance. The BO algorithm wraps an “Expected Improvement Optimizer”:
https://github.com/SimonBlanke/Gradient-Free-Optimizers/blob...
which selects new points based on both the model’s mean estimate and its variance (see around line 58).
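To make the point concrete, here is a generic sketch of the expected-improvement acquisition function being described, not the library's actual code from the linked file. It scores a candidate point using both the surrogate model's mean prediction and its uncertainty, so a high-variance point can outrank a confident-but-mediocre one:

```python
import math

def expected_improvement(mu, sigma, best, xi=0.0):
    """Expected improvement of a candidate point (maximization).

    mu, sigma: the surrogate model's mean and standard-deviation
               prediction at the candidate point.
    best:      best objective value observed so far.
    xi:        optional exploration bonus.
    """
    if sigma <= 0.0:
        return 0.0  # no uncertainty left: no expected improvement
    z = (mu - best - xi) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # standard normal cdf
    return (mu - best - xi) * cdf + sigma * pdf

# A highly uncertain candidate can beat a confident one whose mean
# is below the current best:
uncertain = expected_improvement(mu=1.0, sigma=2.0, best=1.2)
confident = expected_improvement(mu=1.1, sigma=0.01, best=1.2)
```

With these numbers `uncertain` comes out well above `confident`, which is exactly the variance-capturing behavior the comment describes.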
Hacker News top posts: Feb 28, 2021
Gradient-Free-Optimizers – A collection of modern optimization methods in Python (0 comments)
- SimonBlanke/Gradient-Free-Optimizers A collection of modern optimization methods in Python
- Gradient-Free-Optimizers: A collection of modern optimization methods in Python
- Optimize any Python function with modern algorithms in numerical search spaces
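The headline above ("optimize any Python function in numerical search spaces") can be illustrated with the simplest gradient-free method, random search. This is a plain-Python sketch of the idea, not the library's API; the function and parameter names are hypothetical:

```python
import random

def random_search(objective, search_space, n_iter=100, seed=0):
    """Sample random positions from a discrete numerical search space
    and keep the best-scoring one (maximization), using only objective
    evaluations and no gradients."""
    rng = random.Random(seed)
    best_para, best_score = None, float("-inf")
    for _ in range(n_iter):
        # Draw one candidate value per dimension of the search space.
        para = {name: rng.choice(values) for name, values in search_space.items()}
        score = objective(para)
        if score > best_score:
            best_para, best_score = para, score
    return best_para, best_score

# Maximize a simple parabola over a numerical grid:
search_space = {"x": [i / 10 for i in range(-100, 101)]}
para, score = random_search(lambda p: -p["x"] ** 2, search_space, n_iter=500)
```

The library's actual optimizers (hill climbing, simulated annealing, Bayesian optimization, and others) follow the same evaluate-and-compare loop but choose the next candidate more intelligently than uniform sampling.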
optimization-tutorial
Gradient-Free-Optimizers A collection of modern optimization methods in Python
I will look into this algorithm. Thanks for the suggestion. I have some basic explanations of the optimization techniques and their parameters in a separate repository: https://github.com/SimonBlanke/optimization-tutorial
But there is still a lot of work to be done.
What are some alternatives?
Hyperactive - An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
opytimizer - 🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
surrogate-models - A collection of surrogate models for sequence model based optimization techniques
BayesianOptimization - A Python implementation of global optimization with gaussian processes.
pybobyqa - Python-based Derivative-Free Optimization with Bound Constraints
urh - Universal Radio Hacker: Investigate Wireless Protocols Like A Boss
sigopt-server - Open Source version of SigOpt API, performing hyperparameter optimization and visualization
prima - PRIMA is a package for solving general nonlinear optimization problems without using derivatives. It provides the reference implementation for Powell's derivative-free optimization methods, i.e., COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. PRIMA means Reference Implementation for Powell's methods with Modernization and Amelioration, P for Powell.