ExpensiveOptimBenchmark VS BayesianOptimization

Compare ExpensiveOptimBenchmark vs BayesianOptimization and see how they differ.

             ExpensiveOptimBenchmark   BayesianOptimization
Mentions     1                         5
Stars        19                        7,499
Growth       -                         1.2%
Activity     3.9                       5.5
Last commit  7 months ago              9 days ago
Language     Python                    Python
License      MIT License               MIT License
The number of mentions indicates the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

ExpensiveOptimBenchmark

Posts with mentions or reviews of ExpensiveOptimBenchmark. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-07-14.
  • 29 Python real world optimization tutorials
    2 projects | /r/optimization | 14 Jul 2022
    For problems with continuous decision variables it is not trivial to come up with faster approaches on a modern many-core CPU. But even with discrete input (scheduling and planning), new continuous optimizers can compete. The trick is to utilize parallel optimization runs and numba to perform around 1E6 fitness evaluations each second. The advantage is that it is much easier to create a fitness function than, for instance, to implement incremental score calculation in OptaPlanner, and it is more flexible if you have to handle non-standard problems. For very expensive optimizations (like https://github.com/AlgTUDelft/ExpensiveOptimBenchmark) parallelization of fitness evaluation is more important than using surrogate models.
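
A minimal numba sketch of the parallel fitness evaluation described above; the Rastrigin function is a hypothetical stand-in for a real fitness function (not from the post), and the roughly 1E6 evaluations/second figure is hardware-dependent:

```python
import numpy as np
from numba import njit, prange

@njit(parallel=True, fastmath=True)
def evaluate_population(pop):
    # Evaluate every candidate in parallel across CPU cores.
    # Rastrigin is a cheap, numba-compatible placeholder objective.
    n, d = pop.shape
    fitness = np.empty(n)
    for i in prange(n):
        s = 10.0 * d
        for j in range(d):
            x = pop[i, j]
            s += x * x - 10.0 * np.cos(2.0 * np.pi * x)
        fitness[i] = s
    return fitness

# 10,000 candidates in 20 dimensions; a jitted, parallelised loop like
# this is how throughput on the order of 1E6 evaluations per second
# becomes plausible on a many-core CPU.
pop = np.random.uniform(-5.12, 5.12, size=(10_000, 20))
print(evaluate_population(pop).min())
```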

BayesianOptimization

Posts with mentions or reviews of BayesianOptimization. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-01-26.
  • How best to compress a list of objective function evaluations in numerical optimization?
    1 project | /r/askmath | 14 Jul 2022
    Yes, but that's a pretty broad label; is there a specific implementation you're working with (for example ) that pinpoints the memory overhead you want to shrink?
  • It's so fun and useful to me
    2 projects | /r/ProgrammerHumor | 26 Jan 2022
  • [P] Bonsai: Bayesian Optimization for Gradient Boosted Trees
    2 projects | /r/MachineLearning | 18 Jul 2021
    Sure, I’m only aware of the Bayesian Optimization package (https://github.com/fmfn/BayesianOptimization), but if you can recommend some other GP-based methods that integrate well with Gradient boosted machines, that would be nice.
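
As an illustration of the pairing the commenter mentions, here is a minimal sketch of tuning gradient-boosted-tree hyperparameters with the bayes_opt package; the synthetic dataset, search bounds, and cross-validation settings are illustrative assumptions:

```python
from bayes_opt import BayesianOptimization
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=10, random_state=0)

def cv_score(learning_rate, max_depth, n_estimators):
    # bayes_opt passes floats, so integer hyperparameters must be rounded.
    model = GradientBoostingRegressor(
        learning_rate=learning_rate,
        max_depth=int(round(max_depth)),
        n_estimators=int(round(n_estimators)),
        random_state=0,
    )
    return cross_val_score(model, X, y, cv=3, scoring="r2").mean()

optimizer = BayesianOptimization(
    f=cv_score,
    pbounds={"learning_rate": (0.01, 0.3),
             "max_depth": (2, 8),
             "n_estimators": (50, 300)},
    random_state=1,
)
optimizer.maximize(init_points=5, n_iter=20)
print(optimizer.max)  # best cross-validated score and its parameters
```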
  • How to optimize multiple variables to minimize the output?
    1 project | /r/bioinformatics | 30 Jun 2021
    I've previously used Bayesian optimisation for this kind of problem; if you're working in Python, this is a pretty great starting point (https://github.com/fmfn/BayesianOptimization). Black box optimisation is, to the best of my knowledge, a pretty large field and certainly a very difficult problem. You could certainly do a lot worse than BayesOpt.
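
Since bayes_opt maximizes by convention, the multi-variable minimization asked about here is handled by negating the objective. A minimal sketch with a hypothetical quadratic objective in place of the real black box:

```python
from bayes_opt import BayesianOptimization

def objective(x, y):
    # Negate so that maximizing this function minimizes the true output.
    return -((x - 1.0) ** 2 + (y + 2.0) ** 2)

optimizer = BayesianOptimization(
    f=objective,
    pbounds={"x": (-5, 5), "y": (-5, 5)},
    random_state=1,
)
optimizer.maximize(init_points=5, n_iter=25)
best = optimizer.max
print("minimum ~", -best["target"], "at", best["params"])
```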
  • Gradient-Free-Optimizers A collection of modern optimization methods in Python
    9 projects | news.ycombinator.com | 28 Feb 2021
    This looks super interesting, I have previously considered using the Bayesian Optimization[0] package for some work, but the ability to switch out the underlying algorithms is appealing.

    Perhaps a bit of a far-out question: I would be interested in using this for optimizing real-world (i.e. slow, expensive, noisy) processes. A caveat with this is that the work is done in batches (e.g. N experiments at a time). Is there a mechanism by which I could feed in my results from previous rounds and have the algorithm suggest the next N configurations that are sufficiently uncorrelated to explore promising space without bunching on top of each other? My immediate read is that I could use the package to pick the next optimal point, but would then have to lean on a random search for the remainder of the batch?

    0: https://github.com/fmfn/BayesianOptimization
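
One common answer to the batching question above is the "constant liar" heuristic: temporarily register a pessimistic fake target at each suggested point so the next suggestion is pushed away from it, then replace the lies with real measurements once the batch of experiments completes. A hedged sketch, assuming the pre-2.0 bayes_opt ask/register API (suggest() with an explicit UtilityFunction) and a hypothetical run_experiment stand-in for the real process:

```python
from bayes_opt import BayesianOptimization, UtilityFunction

def run_experiment(params):
    # Hypothetical stand-in for the real, slow, expensive, noisy process.
    return -(params["x"] - 0.3) ** 2

pbounds = {"x": (0.0, 1.0)}
optimizer = BayesianOptimization(f=None, pbounds=pbounds,
                                 random_state=1, verbose=0)
utility = UtilityFunction(kind="ucb", kappa=2.5, xi=0.0)

batch, lie = [], -10.0              # pessimistic constant-liar target
for _ in range(4):                  # suggest N = 4 decorrelated points
    point = optimizer.suggest(utility)
    batch.append(point)
    optimizer.register(params=point, target=lie)   # temporary lie

# Run the batch for real, then rebuild the model without the lies.
results = [(p, run_experiment(p)) for p in batch]
optimizer = BayesianOptimization(f=None, pbounds=pbounds,
                                 random_state=1, verbose=0)
for p, target in results:
    optimizer.register(params=p, target=target)
```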

What are some alternatives?

When comparing ExpensiveOptimBenchmark and BayesianOptimization you can also consider the following projects:

fast-cma-es - A Python 3 gradient-free optimization library

opytimizer - 🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.

parmoo - Python library for parallel multiobjective simulation optimization

nhentai-favorites-auto-pagination - An infinite random doujinshi picker for your favorites list, with auto-scroll and pagination.

ix - Simple dotfile pre-processor with a per-file configuration and no dependencies.

optimization-tutorial - Tutorials for the optimization techniques used in Gradient-Free-Optimizers and Hyperactive.

Hyperactive - An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.

surrogate-models - A collection of surrogate models for sequence model based optimization techniques

Bayesian-Optimization-in-FSharp - Bayesian Optimization via Gaussian Processes in F#

pybobyqa - Python-based Derivative-Free Optimization with Bound Constraints

bonsai - Gradient Boosted Trees + Bayesian Optimization

DIgging - Decision Intelligence for digging best parameters in target environment.