| | BayesianOptimization | opytimizer |
| --- | --- | --- |
| Mentions | 5 | 7 |
| Stars | 7,499 | 594 |
| Growth | 1.2% | - |
| Activity | 5.5 | 5.5 |
| Last commit | 12 days ago | 5 months ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
BayesianOptimization
How best to compress a list of objective function evaluations in numerical optimization?
Yes, but that's a pretty broad label. Is there a specific implementation you're working with (for example ) that pinpoints the memory overhead you want to shrink?
It's so fun and useful to me
[P] Bonsai: Bayesian Optimization for Gradient Boosted Trees
Sure, I'm only aware of the Bayesian Optimization package (https://github.com/fmfn/BayesianOptimization), but if you can recommend some other GP-based methods that integrate well with gradient boosted machines, that would be nice.
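As a concrete illustration of pairing the two, here is a minimal sketch of tuning a gradient boosted model with the BayesianOptimization package. The toy dataset, parameter bounds, and iteration budgets are illustrative assumptions, not recommendations:

```python
from bayes_opt import BayesianOptimization
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, random_state=0)  # toy data

def gbt_cv(learning_rate, max_depth, n_estimators):
    # bayes_opt passes floats, so discrete parameters must be rounded
    model = GradientBoostingClassifier(
        learning_rate=learning_rate,
        max_depth=int(round(max_depth)),
        n_estimators=int(round(n_estimators)),
        random_state=0,
    )
    # Mean cross-validated accuracy is the objective to maximize
    return cross_val_score(model, X, y, cv=3).mean()

optimizer = BayesianOptimization(
    f=gbt_cv,
    pbounds={"learning_rate": (0.01, 0.3),
             "max_depth": (2, 8),
             "n_estimators": (50, 300)},
    random_state=1,
)
optimizer.maximize(init_points=5, n_iter=20)
print(optimizer.max)  # best score and parameters found
```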
How to optimize multiple variables to minimize the output?
I've previously used Bayesian optimisation for this kind of problem; if you're working in Python, this is a pretty great starting point (https://github.com/fmfn/BayesianOptimization). Black-box optimisation is, to the best of my knowledge, a pretty large field and certainly a very difficult problem. You could do a lot worse than BayesOpt.
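Worth noting for the minimization framing above: the package maximizes by default, so minimizing an output just means negating it. A minimal sketch with a made-up objective and bounds:

```python
from bayes_opt import BayesianOptimization

def objective(x, y):
    # Hypothetical quantity we want to minimize
    return (x - 1) ** 2 + (y + 2) ** 2

optimizer = BayesianOptimization(
    f=lambda x, y: -objective(x, y),  # maximizing -f minimizes f
    pbounds={"x": (-5, 5), "y": (-5, 5)},
    random_state=1,
)
optimizer.maximize(init_points=5, n_iter=25)

best = optimizer.max
print("minimum value:", -best["target"], "at", best["params"])
```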
Gradient-Free-Optimizers A collection of modern optimization methods in Python
This looks super interesting. I have previously considered using the Bayesian Optimization [0] package for some work, but the ability to switch out the underlying algorithms is appealing.
Perhaps a bit of a far-out question: I would be interested in using this for optimizing real-world (i.e. slow, expensive, noisy) processes. A caveat with this is that the work is done in batches (e.g. N experiments at a time). Is there a mechanism by which I could feed in my results from previous rounds and have the algorithm suggest the next N configurations, sufficiently uncorrelated to explore promising space without bunching on top of each other? My immediate read is that I could use the package to pick the next optimal point, but would then have to lean on a random search for the remainder of the batch?
0: https://github.com/fmfn/BayesianOptimization
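One common workaround for the batch question above is the "constant liar" heuristic: suggest a point, register a pessimistic fake outcome for it so the surrogate steers away from that region, and repeat until the batch is full, then rebuild the optimizer from only the real observations before the next round. A rough sketch against the package's 1.x suggest/register API; the parameter bounds, batch size, and lie value are assumptions for illustration:

```python
from bayes_opt import BayesianOptimization, UtilityFunction

pbounds = {"temp": (20, 80), "conc": (0.1, 1.0)}  # hypothetical process knobs
real_observations = []  # (params, target) pairs from finished experiments

def suggest_batch(n):
    # Fresh optimizer seeded only with real data, so the lies never persist
    opt = BayesianOptimization(f=None, pbounds=pbounds, random_state=1)
    for params, target in real_observations:
        opt.register(params=params, target=target)
    utility = UtilityFunction(kind="ucb", kappa=2.5, xi=0.0)
    lie = min((t for _, t in real_observations), default=0.0)  # pessimistic
    batch = []
    for _ in range(n):
        params = opt.suggest(utility)
        batch.append(params)
        # "Constant liar": pretend this point scored poorly so the next
        # suggestion is pushed toward a different region of the space
        opt.register(params=params, target=lie)
    return batch

next_experiments = suggest_batch(4)  # run these, then append the real results
```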
opytimizer
[P] Opytimizer: A Nature-Inspired Python Optimizer
We do have a Simulated Annealing version designed to work with the library's structure: https://github.com/gugarosa/opytimizer/blob/master/opytimizer/optimizers/science/sa.py
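For reference, here is a minimal sketch of running that SA optimizer through the library's usual workflow. This assumes the v3-style API (SearchSpace, Function, Opytimizer); the sphere objective, agent count, and bounds are arbitrary choices for illustration:

```python
import numpy as np

from opytimizer import Opytimizer
from opytimizer.core import Function
from opytimizer.optimizers.science import SA
from opytimizer.spaces import SearchSpace

def sphere(x):
    # Toy objective: minimize the sum of squares
    return np.sum(x ** 2)

# 20 agents exploring a 2-variable space over [-10, 10]^2
space = SearchSpace(n_agents=20, n_variables=2,
                    lower_bound=[-10, -10], upper_bound=[10, 10])
optimizer = SA()            # simulated annealing with default hyperparameters
function = Function(sphere)

task = Opytimizer(space, optimizer, function)
task.start(n_iterations=100)
```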
Opytimizer: A Nature-Inspired Python Optimizer
Thanks mate! That would be a great addition indeed. We do have some examples at https://github.com/gugarosa/opytimizer/tree/master/examples/..., but they are not as clearly presented as they would be in the README. Thanks a lot!!
Gradient-Free-Optimizers A collection of modern optimization methods in Python
There's also Opytimizer [0] for almost every metaheuristic optimization algorithm under the Sun.
[0] https://github.com/gugarosa/opytimizer
What are some alternatives?
nhentai-favorites-auto-pagination - An infinite random picker for doujinshi from your favorites list, with auto scroll and pagination.
Hyperactive - An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
ix - Simple dotfile pre-processor with a per-file configuration and no dependencies.
optimization-tutorial - Tutorials for the optimization techniques used in Gradient-Free-Optimizers and Hyperactive.
Gradient-Free-Optimizers - Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
sqpdfo - Sequential-Quadratic-Programming Derivative-Free Optimization
surrogate-models - A collection of surrogate models for sequence model based optimization techniques
flaxOptimizers - A collection of optimizers, some arcane others well known, for Flax.
Bayesian-Optimization-in-FSharp - Bayesian Optimization via Gaussian Processes in F#
lion-pytorch - 🦁 Lion, a new optimizer discovered by Google Brain using genetic algorithms that is purportedly better than Adam(w), in PyTorch