interactive-gp-visualization vs BayesianOptimization

| | interactive-gp-visualization | BayesianOptimization |
|---|---|---|
| Mentions | 1 | 5 |
| Stars | 158 | 7,499 |
| Growth | - | 1.2% |
| Activity | 3.0 | 5.5 |
| Last commit | about 1 year ago | 12 days ago |
| Language | Svelte | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
interactive-gp-visualization
What does the wiggly animation in this visualization of samples from GPs represent?
The source code for the visualization is available, but I'm not at all familiar with Svelte or D3.js, so I figured it would be much easier to just ask around first instead.
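I haven't read the Svelte source either, but one common way to produce a continuously wiggling GP sample (whether or not this visualization does exactly this) is to hold the covariance factor fixed and smoothly rotate the white-noise vector behind the sample. Here is a minimal NumPy sketch of that idea; the kernel, grid, and frame count are made up for illustration:

```python
# Minimal sketch (not the actual Svelte/D3 code): keep the covariance
# factor fixed and smoothly rotate the white-noise vector that
# generates the sample, so the curve morphs instead of jumping.
import numpy as np

def rbf_kernel(x, lengthscale=0.5, variance=1.0):
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

x = np.linspace(0.0, 1.0, 200)
K = rbf_kernel(x) + 1e-8 * np.eye(len(x))  # jitter for numerical stability
L = np.linalg.cholesky(K)                  # fixed factor: sample = L @ z

rng = np.random.default_rng(0)
z0, z1 = rng.standard_normal(len(x)), rng.standard_normal(len(x))

for t in np.linspace(0.0, 1.0, 10):
    # cos/sin interpolation keeps the noise vector standard normal
    z_t = np.cos(t * np.pi / 2) * z0 + np.sin(t * np.pi / 2) * z1
    frame = L @ z_t   # one animation frame of the wiggly curve
```

Because cos²(θ) + sin²(θ) = 1, every intermediate z_t is itself a standard normal vector, so each frame is a genuine draw from the same GP and the animation slides smoothly between independent samples.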
BayesianOptimization
How best to compress a list of objective function evaluations in numerical optimization?
Yes, but that's a pretty broad label. Is there a specific implementation you're working with (for example ) that pinpoints the memory overhead you want to shrink?
It's so fun and useful to me.
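One generic answer, assuming the history is just plain (x, y) arrays: thin the archive with a farthest-point heuristic that always keeps the incumbent best, so a surrogate refit still sees a space-filling subset. A hypothetical sketch (`thin_evaluations` is made up, not from any package):

```python
# Hypothetical sketch: shrink an archive of (x, y) evaluations to k
# points by keeping the best point and then greedily adding whichever
# evaluation is farthest (in input space) from everything kept so far.
import numpy as np

def thin_evaluations(X, y, k):
    """X: (n, d) inputs, y: (n,) objective values; returns kept indices."""
    keep = [int(np.argmin(y))]                     # never drop the incumbent
    dist = np.linalg.norm(X - X[keep[0]], axis=1)  # distance to kept set
    while len(keep) < min(k, len(y)):
        i = int(np.argmax(dist))                   # farthest-point heuristic
        keep.append(i)
        dist = np.minimum(dist, np.linalg.norm(X - X[i], axis=1))
    return keep

X = np.random.rand(500, 3)
y = np.random.rand(500)
idx = thin_evaluations(X, y, k=50)  # compressed history: X[idx], y[idx]
```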
[P] Bonsai: Bayesian Optimization for Gradient Boosted Trees
Sure, I'm only aware of the Bayesian Optimization package (https://github.com/fmfn/BayesianOptimization), but if you can recommend some other GP-based methods that integrate well with gradient boosted machines, that would be nice.
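For reference, a minimal sketch of what that wiring might look like with the BayesianOptimization package around scikit-learn's gradient boosted trees; the dataset, bounds, and objective here are placeholder assumptions, not anything from the Bonsai project:

```python
# Minimal sketch: tuning gradient boosted trees with
# fmfn/BayesianOptimization. Data, bounds, and CV setup are made up.
from bayes_opt import BayesianOptimization
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=10, random_state=0)

def objective(learning_rate, max_depth, n_estimators):
    model = GradientBoostingRegressor(
        learning_rate=learning_rate,
        max_depth=int(max_depth),          # bayes_opt only hands us floats
        n_estimators=int(n_estimators),
        random_state=0,
    )
    # Mean CV R^2; bayes_opt always maximizes the objective.
    return cross_val_score(model, X, y, cv=3).mean()

optimizer = BayesianOptimization(
    f=objective,
    pbounds={"learning_rate": (0.01, 0.3),
             "max_depth": (2, 8),
             "n_estimators": (50, 500)},
    random_state=0,
)
optimizer.maximize(init_points=5, n_iter=20)
print(optimizer.max)  # best parameters and score found
```

The int() casts are needed because the package optimizes over continuous boxes and passes floats for every parameter, including the integer-valued ones.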
How to optimize multiple variables to minimize the output?
I've previously used Bayesian optimisation for this kind of problem; if you're working in Python, this is a pretty great starting point (https://github.com/fmfn/BayesianOptimization). Black-box optimisation is, to the best of my knowledge, a pretty large field and certainly a very difficult problem. You could certainly do a lot worse than BayesOpt.
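A small sketch of that starting point, using a made-up two-variable function: the package maximizes, so minimizing an output just means negating it.

```python
# Tiny sketch (made-up objective): bayes_opt maximizes, so to *minimize*
# an output you return its negation and flip the sign of the result.
from bayes_opt import BayesianOptimization

def black_box(x, y):
    return (x - 1.0) ** 2 + (y + 2.0) ** 2  # pretend this is expensive

optimizer = BayesianOptimization(
    f=lambda x, y: -black_box(x, y),         # negate to minimize
    pbounds={"x": (-5.0, 5.0), "y": (-5.0, 5.0)},
    random_state=1,
)
optimizer.maximize(init_points=5, n_iter=25)
best = optimizer.max
print(best["params"], -best["target"])       # minimum value is -target
```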
Gradient-Free-Optimizers: A collection of modern optimization methods in Python
This looks super interesting. I have previously considered using the Bayesian Optimization [0] package for some work, but the ability to switch out the underlying algorithms is appealing.
Perhaps a bit of a far-out question: I would be interested in using this for optimizing real-world (i.e. slow, expensive, noisy) processes. A caveat is that the work is done in batches (e.g. N experiments at a time). Is there a mechanism by which I could feed in my results from previous rounds and have the algorithm suggest the next N configurations, sufficiently uncorrelated to explore promising space without bunching on top of each other (see the constant-liar sketch after this post)? My immediate read is that I could use the package to pick the next optimal point, but would then have to lean on a random search for the remainder of the batch?
0: https://github.com/fmfn/BayesianOptimization
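As far as I know the package has no built-in batch mode, but the "constant liar" heuristic is a common workaround: ask for a point, register a fake pessimistic observation for it, and ask again, so successive suggestions are pushed apart. A sketch under two assumptions: the pre-2.0 bayes_opt API (UtilityFunction plus suggest/register) and at least one real observation already recorded; the parameter bounds are invented for illustration.

```python
# Hedged sketch of the "constant liar" batch heuristic on top of
# fmfn/BayesianOptimization (pre-2.0 API assumed; the package itself
# has no built-in batch mode as far as I know).
from bayes_opt import BayesianOptimization, UtilityFunction

pbounds = {"temperature": (20.0, 90.0), "duration": (1.0, 24.0)}  # made up

def suggest_batch(real_observations, n_batch):
    # Rebuild the optimizer from the *real* data each round, since
    # registered "lies" cannot be removed afterwards.
    opt = BayesianOptimization(f=None, pbounds=pbounds,
                               random_state=0, verbose=0)
    for params, target in real_observations:
        opt.register(params=params, target=target)
    util = UtilityFunction(kind="ucb", kappa=2.5)
    lie = min(t for _, t in real_observations)  # pessimistic constant lie
    batch = []
    for _ in range(n_batch):
        point = opt.suggest(util)
        opt.register(params=point, target=lie)  # pushes the next point away
        batch.append(point)
    return batch

# real_observations: list of ({"temperature": ..., "duration": ...}, result).
# Run the N suggested experiments, append the real results, and repeat.
```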
What are some alternatives?
opytimizer - 🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
nhentai-favorites-auto-pagination - An infinite, randomized doujinshi picker for your favorites list, with auto scroll and pagination
ix - Simple dotfile pre-processor with a per-file configuration and no dependencies.
optimization-tutorial - Tutorials for the optimization techniques used in Gradient-Free-Optimizers and Hyperactive.
Hyperactive - An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
surrogate-models - A collection of surrogate models for sequential model-based optimization techniques
Bayesian-Optimization-in-FSharp - Bayesian Optimization via Gaussian Processes in F#
pybobyqa - Python-based Derivative-Free Optimization with Bound Constraints
bonsai - Gradient Boosted Trees + Bayesian Optimization
DIgging - Decision Intelligence for digging out the best parameters in a target environment.
draf - Demand Response Analysis Framework (DRAF)
jMetalPy - A framework for single/multi-objective optimization with metaheuristics