ExpensiveOptimBenchmark vs vizier

| | ExpensiveOptimBenchmark | vizier |
|---|---|---|
| Mentions | 1 | 5 |
| Stars | 19 | 1,173 |
| Growth | - | 0.7% |
| Activity | 3.9 | 9.3 |
| Latest commit | 7 months ago | 6 days ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
ExpensiveOptimBenchmark
- 29 Python real world optimization tutorials
For problems with continuous decision variables it is not trivial to come up with faster approaches on a modern many-core CPU. But even for discrete inputs (scheduling and planning), new continuous optimizers can compete. The trick is to utilize parallel optimization runs and numba to perform around 1e6 fitness evaluations per second. An advantage is that writing a fitness function is much easier than, for instance, implementing incremental score calculation in OptaPlanner, and it is more flexible when you have to handle non-standard problems. For very expensive optimizations (like https://github.com/AlgTUDelft/ExpensiveOptimBenchmark), parallelizing the fitness evaluation is more important than using surrogate models.
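The "parallel optimization runs" idea above can be sketched with nothing but the standard library. This is only an illustration under stated assumptions: the numba-jitted fitness function is stood in for by a plain Python sphere function, random search stands in for a real optimizer, and the names `random_search` and `parallel_runs` are hypothetical, not part of any of the libraries mentioned here.

```python
# Minimal sketch: several independent optimization runs executed on
# separate cores, keeping the overall best result. A real setup would
# replace random_search with a proper optimizer and jit the fitness
# function (e.g. with numba) to reach high evaluation rates.
from concurrent.futures import ProcessPoolExecutor
import random


def fitness(x):
    """Cheap stand-in fitness: the sphere function (minimum 0 at the origin)."""
    return sum(xi * xi for xi in x)


def random_search(seed, dim=5, evals=10_000):
    """One independent optimization run with its own RNG seed."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(evals):
        x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
        f = fitness(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_f, best_x


def parallel_runs(n_runs=4):
    """Launch independent runs in parallel processes; return the best."""
    with ProcessPoolExecutor(max_workers=n_runs) as ex:
        return min(ex.map(random_search, range(n_runs)))


if __name__ == "__main__":
    best_f, best_x = parallel_runs()
    print(f"best fitness found: {best_f:.4f}")
```

Because the runs share nothing, they scale across cores without coordination overhead; the cost is only the one-off process startup and the final reduction.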
vizier
- [N] Google Open Sources Vizier, Hyperparameter + Blackbox Optimization Service at Scale
- Is there any premade evolutionary algorithm for selecting optimal NN architectures in TensorFlow?
- Google just open sourced its Vizier optimisation suite
- Python-based research interface for blackbox and hyperparameter optimization
What are some alternatives?
fast-cma-es - A Python 3 gradient-free optimization library
mango - Parallel Hyperparameter Tuning in Python
parmoo - Python library for parallel multiobjective simulation optimization
SpaceDrones - A simple learning environment with space drones for evolution-inspired optimization.
Gradient-Free-Optimizers - Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
keras-tuner - A Hyperparameter Tuning Library for Keras
tune - An abstraction layer for parameter tuning
mlr3hyperband - Successive Halving and Hyperband in the mlr3 ecosystem
baybe - Bayesian Optimization and Design of Experiments
DIgging - Decision Intelligence for digging best parameters in target environment.
pyswarms - A research toolkit for particle swarm optimization in Python