Gradient-Free-Optimizers vs opytimizer

| | Gradient-Free-Optimizers | opytimizer |
|---|---|---|
| Mentions | 11 | 7 |
| Stars | 1,108 | 594 |
| Growth | - | - |
| Activity | 5.0 | 5.5 |
| Last commit | 5 days ago | 5 months ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
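The "recent commits have higher weight" idea can be illustrated with a toy score in which each commit's contribution decays exponentially with age. Note that the formula, the half-life parameter, and the function name are illustrative assumptions here, not the site's actual metric:

```python
import math

def activity_score(commit_ages_days, half_life_days=30.0):
    """Toy activity metric: each commit contributes a weight that
    halves every `half_life_days`, so recent commits count more.
    Illustrative only; not the comparison site's actual formula."""
    return sum(0.5 ** (age / half_life_days) for age in commit_ages_days)

# Same number of commits, very different recency:
recent = activity_score([1, 2, 3, 5])          # four fresh commits
stale = activity_score([100, 120, 200, 365])   # four old commits
```

Under this scheme `recent` scores much higher than `stale` despite the identical commit count, which matches the intent described above: activity reflects how recently work happened, not just how much.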
Gradient-Free-Optimizers
- Show HN: Gradient-Free-Optimizers supports constrained optimization in v1.3
- Gradient-Free-Optimizers version 1.2 released
- Gradient-Free-Optimizers: A collection of modern optimization methods in Python

  "I would be very disappointed if that were the case... no, it looks like it's set up to capture variance. The BO algorithm wraps an 'Expected Improvement Optimizer' (https://github.com/SimonBlanke/Gradient-Free-Optimizers/blob...), which selects new points based on both the model's mean estimate and its variance. See around line 58."
- Hacker News top posts: Feb 28, 2021, featuring "Gradient-Free-Optimizers: A collection of modern optimization methods in Python" (0 comments)
- SimonBlanke/Gradient-Free-Optimizers: A collection of modern optimization methods in Python
- Gradient-Free-Optimizers: A collection of modern optimization methods in Python
- Optimize any Python function with modern algorithms in numerical search spaces
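One of the comments above notes that GFO's Bayesian optimizer selects new points via Expected Improvement, which trades the surrogate model's mean estimate off against its variance. The standard EI formula can be sketched in pure Python; the function names here are mine, and this is a textbook sketch rather than GFO's actual code:

```python
import math

def normal_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, sigma, best_so_far, xi=0.01):
    """Expected Improvement acquisition (maximization form).
    Rewards points whose predicted mean is high AND whose
    predictive uncertainty (sigma) is large."""
    if sigma <= 0.0:
        return 0.0  # no uncertainty: no expected improvement beyond the mean
    z = (mu - best_so_far - xi) / sigma
    return (mu - best_so_far - xi) * normal_cdf(z) + sigma * normal_pdf(z)

# Same mean estimate, different variance:
low_var = expected_improvement(mu=1.0, sigma=0.1, best_so_far=1.2)
high_var = expected_improvement(mu=1.0, sigma=1.0, best_so_far=1.2)
```

With the same mean, the higher-variance candidate gets the higher acquisition value, which is the variance-capturing behaviour the comment describes.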
opytimizer
- [P] Opytimizer: A Nature-Inspired Python Optimizer

  "We do have a Simulated Annealing version designed to work with the library's structure: https://github.com/gugarosa/opytimizer/blob/master/opytimizer/optimizers/science/sa.py"
- Opytimizer: A Nature-Inspired Python Optimizer

  "Thanks mate! That would be a great addition indeed. We do have some examples at https://github.com/gugarosa/opytimizer/tree/master/examples/..., but they are not presented as clearly as they would be in the README. Thanks a lot!"
- Gradient-Free-Optimizers: A collection of modern optimization methods in Python

  "There's also Opytimizer [0] for almost every metaheuristic optimization algorithm under the Sun."

  [0] https://github.com/gugarosa/opytimizer
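One of the comments above links to opytimizer's Simulated Annealing optimizer. The core algorithm fits in a few lines of plain Python; this is an illustrative 1-D minimizer with parameter names of my choosing, not opytimizer's implementation:

```python
import math
import random

def simulated_annealing(objective, x0, n_iter=2000, temp0=10.0,
                        step=0.5, seed=0):
    """Minimal simulated annealing for a 1-D objective (minimization).
    Illustrative sketch only, not opytimizer's implementation."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best_x, best_fx = x, fx
    for i in range(1, n_iter + 1):
        temp = temp0 / i                       # simple cooling schedule
        cand = x + rng.uniform(-step, step)    # random neighbour
        fc = objective(cand)
        # Always accept improvements; accept worse moves with
        # probability exp(-delta / temp), so the search can escape
        # local minima early on and turns greedy as temp drops.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / temp):
            x, fx = cand, fc
            if fx < best_fx:
                best_x, best_fx = x, fx
    return best_x, best_fx

# Minimize (x - 3)^2 starting far from the optimum:
x_min, f_min = simulated_annealing(lambda x: (x - 3.0) ** 2, x0=-5.0)
```

The temperature-dependent acceptance of worse moves is what distinguishes SA from plain hill climbing, and it is the reason nature-inspired libraries like opytimizer file it under "science" optimizers rather than gradient-based ones.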
What are some alternatives?
- Hyperactive - An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
- BayesianOptimization - A Python implementation of global optimization with Gaussian processes.
- surrogate-models - A collection of surrogate models for sequential model-based optimization techniques.
- pybobyqa - Python-based derivative-free optimization with bound constraints.
- optimization-tutorial - Tutorials for the optimization techniques used in Gradient-Free-Optimizers and Hyperactive.
- sqpdfo - Sequential quadratic programming for derivative-free optimization.
- urh - Universal Radio Hacker: investigate wireless protocols like a boss.
- flaxOptimizers - A collection of optimizers, some arcane, others well known, for Flax.
- prima - A package for solving general nonlinear optimization problems without using derivatives. It provides the reference implementation of Powell's derivative-free optimization methods: COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. PRIMA stands for Reference Implementation for Powell's Methods with Modernization and Amelioration (P for Powell).
- lion-pytorch - 🦁 Lion, a new optimizer discovered by Google Brain using genetic algorithms that is purportedly better than AdamW, in PyTorch.