| | sqpdfo | Gradient-Free-Optimizers |
|---|---|---|
| Mentions | 1 | 11 |
| Stars | 13 | 1,114 |
| Growth | - | - |
| Activity | 0.0 | 5.0 |
| Last commit | over 1 year ago | 5 days ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 or later | MIT License |
- Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
- Activity - a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
sqpdfo

- Gradient-Free-Optimizers: A collection of modern optimization methods in Python
  "If you have an expensive but not high-dimensional problem you might want to try https://github.com/DLR-SC/sqpdfo."
Gradient-Free-Optimizers
- Show HN: Gradient-Free-Optimizers supports constrained optimization in v1.3
- Gradient-Free-Optimizers version 1.2 released
- Gradient-Free-Optimizers: A collection of modern optimization methods in Python
  "I would be very disappointed if that were the case... no, it looks like it's set up to capture variance. The BO algorithm wraps an 'Expected Improvement Optimizer': https://github.com/SimonBlanke/Gradient-Free-Optimizers/blob... which selects new points based on both the model's mean estimate and its variance. See around line 58."
- Hacker News top posts: Feb 28, 2021
  Gradient-Free-Optimizers: A collection of modern optimization methods in Python (0 comments)
- SimonBlanke/Gradient-Free-Optimizers A collection of modern optimization methods in Python
- Gradient-Free-Optimizers: A collection of modern optimization methods in Python
- Optimize any Python function with modern algorithms in numerical search spaces
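The comment quoted above describes the Expected Improvement acquisition function: candidate points are scored using both the surrogate model's mean estimate and its variance, so an uncertain region can beat a merely decent one. A minimal sketch of that idea for maximization (the function name and the `xi` exploration parameter here are illustrative, not Gradient-Free-Optimizers' actual API):

```python
import math

def expected_improvement(mu, sigma, best_y, xi=0.01):
    # Score a candidate point from the surrogate's mean estimate (mu)
    # and its uncertainty (sigma), relative to the best value seen so far.
    if sigma == 0.0:
        return 0.0  # no uncertainty: nothing to gain from re-evaluating
    z = (mu - best_y - xi) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))      # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return (mu - best_y - xi) * cdf + sigma * pdf

# A point with a lower mean but higher variance can still score higher:
cautious = expected_improvement(mu=1.0, sigma=0.1, best_y=1.0)
uncertain = expected_improvement(mu=0.9, sigma=1.0, best_y=1.0)
```

Here `uncertain > cautious`, which is exactly the "set up to capture variance" behavior the comment points at.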
What are some alternatives?
Hyperactive - An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
opytimizer - 🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
optimization-tutorial - Tutorials for the optimization techniques used in Gradient-Free-Optimizers and Hyperactive.
surrogate-models - A collection of surrogate models for sequential model-based optimization techniques
pybobyqa - Python-based Derivative-Free Optimization with Bound Constraints
urh - Universal Radio Hacker: Investigate Wireless Protocols Like A Boss
prima - PRIMA is a package for solving general nonlinear optimization problems without using derivatives. It provides the reference implementation for Powell's derivative-free optimization methods, i.e., COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. PRIMA means Reference Implementation for Powell's methods with Modernization and Amelioration, P for Powell.
PSO-cont-sched - Made for a college project, this Java program attempts to demonstrate how PSO might be used to solve container scheduling problems.
RocketLander - A simple framework equipped with optimization algorithms, such as reinforcement learning, evolution strategies, genetic optimization, and simulated annealing, to enable an orbital rocket booster to land autonomously.
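All of the libraries compared above share the same basic contract: a black-box Python objective evaluated over a numerical search space, with no gradient information. A toy random-search loop illustrates that contract (the function and dictionary names are illustrative sketches, not any library's actual API):

```python
import random

def random_search(objective, search_space, n_iter=200, seed=0):
    # Gradient-free optimization in its simplest form: sample candidate
    # points from a discrete numerical search space, evaluate the
    # black-box objective, and keep the best-scoring point seen.
    rng = random.Random(seed)
    best_para, best_score = None, float("-inf")
    for _ in range(n_iter):
        para = {name: rng.choice(values) for name, values in search_space.items()}
        score = objective(para)
        if score > best_score:
            best_para, best_score = para, score
    return best_para, best_score

# Maximize -(x^2 + y^2) over a numerical grid; the optimum is x = y = 0.
space = {
    "x": [i / 10 for i in range(-100, 101)],
    "y": [i / 10 for i in range(-100, 101)],
}
para, score = random_search(lambda p: -(p["x"] ** 2 + p["y"] ** 2), space)
```

Real libraries replace the uniform sampling with smarter proposal rules (hill climbing, evolution strategies, Bayesian surrogates), but the objective-plus-search-space interface stays the same.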