Hyperactive vs Gradient-Free-Optimizers

| | Hyperactive | Gradient-Free-Optimizers |
|---|---|---|
| Mentions | 8 | 11 |
| Stars | 490 | 1,103 |
| Growth | - | - |
| Activity | 7.7 | 5.0 |
| Last commit | 5 months ago | about 1 month ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Hyperactive
- Hyperactive Version 4.5 Released
- Hyperactive: An optimization and data collection toolbox for AutoML
- Hyperactive: Optimize computationally expensive models with powerful algorithms
- Show HN: Hyperactive – A highly versatile AutoML Toolbox
- Hyperactive – Easy Neural Architecture Search for Deep Learning in Python
Check out the Neural Architecture Search Tutorial here: https://nbviewer.jupyter.org/github/SimonBlanke/hyperactive-...
Neural Architecture Search is just one of many optimization applications you can work on with Hyperactive. Check out the examples in the official GitHub repository: https://github.com/SimonBlanke/Hyperactive/tree/master/examp...
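For a flavor of what such a search looks like, here is a minimal architecture-search sketch using Hyperactive's v4-style API, with scikit-learn's MLPClassifier standing in for a deep-learning framework (the search-space values and the toy dataset are illustrative assumptions, not taken from the tutorial):

```python
# Minimal architecture-search sketch with Hyperactive (v4-style API).
# The layer counts / widths below are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from hyperactive import Hyperactive

X, y = load_iris(return_X_y=True)

def architecture_objective(opt):
    # Build the architecture from the sampled parameters.
    layers = (opt["neurons"],) * opt["n_hidden_layers"]
    model = MLPClassifier(hidden_layer_sizes=layers, max_iter=300)
    # Hyperactive maximizes the returned score.
    return cross_val_score(model, X, y, cv=3).mean()

search_space = {
    "n_hidden_layers": [1, 2, 3],
    "neurons": [10, 25, 50, 100],
}

hyper = Hyperactive()
hyper.add_search(architecture_objective, search_space, n_iter=20)
hyper.run()
```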
- Gradient-Free-Optimizers: A collection of modern optimization methods in Python
Gradient-Free-Optimizers is a lightweight optimization package that serves as a backend for Hyperactive: https://github.com/SimonBlanke/Hyperactive
Hyperactive can do parallel computing with multiprocessing, joblib, or a custom wrapper function.
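As a concrete illustration, a minimal sketch of that parallel setup, assuming Hyperactive's v4-style `distribution` and `n_jobs` parameters:

```python
# Parallel-search sketch: `distribution` selects the backend,
# `n_jobs` the number of parallel runs of the search.
import numpy as np
from hyperactive import Hyperactive

def parabola(opt):
    # toy objective with its maximum at x = 0
    return -(opt["x"] ** 2)

search_space = {"x": list(np.arange(-10, 10, 0.1))}

hyper = Hyperactive(distribution="joblib")  # or "multiprocessing", or a custom wrapper function
hyper.add_search(parabola, search_space, n_iter=100, n_jobs=2)
hyper.run()
```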
Gradient-Free-Optimizers
- Show HN: Gradient-Free-Optimizers supports constrained optimization in v1.3 (a usage sketch follows after this list)
- Gradient-Free-Optimizers version 1.2 released
- Gradient-Free-Optimizers: A collection of modern optimization methods in Python
I would be very disappointed if that were the case... no, it looks like it's set up to capture variance. The BO algorithm wraps an "Expected Improvement Optimizer":
https://github.com/SimonBlanke/Gradient-Free-Optimizers/blob...
It selects new points based on both the model's mean estimate and its variance; see around line 58.
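For context, the textbook expected-improvement acquisition captures exactly that trade-off between the surrogate's mean and its variance. A generic sketch (the standard formula, not a verbatim copy of the linked code):

```python
# Generic expected-improvement acquisition (textbook form, for
# maximization): large where the surrogate's mean is promising OR
# where its variance is high.
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best, xi=0.01):
    sigma = np.maximum(sigma, 1e-12)  # guard against zero variance
    imp = mu - f_best - xi            # predicted improvement over the incumbent
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)
```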
- Hacker News top posts: Feb 28, 2021
Gradient-Free-Optimizers: A collection of modern optimization methods in Python (0 comments)
- SimonBlanke/Gradient-Free-Optimizers A collection of modern optimization methods in Python
- Gradient-Free-Optimizers: A collection of modern optimization methods in Python
- Optimize any Python function with modern algorithms in numerical search spaces
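Regarding the constrained-optimization mention above, here is a minimal sketch of the v1.3-style `constraints` argument, following the pattern in the Gradient-Free-Optimizers README (the concrete objective and constraint are illustrative assumptions):

```python
# Constrained-optimization sketch for Gradient-Free-Optimizers v1.3+:
# each constraint is a function over the parameter dict returning
# True for valid candidates.
import numpy as np
from gradient_free_optimizers import RandomSearchOptimizer

def convex_function(para):
    # maximize the negative paraboloid -> optimum at (0, 0)
    return -(para["x1"] ** 2 + para["x2"] ** 2)

search_space = {
    "x1": np.arange(-10, 10, 0.1),
    "x2": np.arange(-10, 10, 0.1),
}

def constraint_1(para):
    # only candidates with x1 > -5 are considered valid
    return para["x1"] > -5

opt = RandomSearchOptimizer(search_space, constraints=[constraint_1])
opt.search(convex_function, n_iter=50)
```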
What are some alternatives?
mango - Parallel Hyperparameter Tuning in Python
opytimizer - 🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
pybobyqa - Python-based Derivative-Free Optimization with Bound Constraints
surrogate-models - A collection of surrogate models for sequence model based optimization techniques
OpenMetadata - Open Standard for Metadata. A Single place to Discover, Collaborate and Get your data right.
optimization-tutorial - Tutorials for the optimization techniques used in Gradient-Free-Optimizers and Hyperactive.
optuna-examples - Examples for https://github.com/optuna/optuna
urh - Universal Radio Hacker: Investigate Wireless Protocols Like A Boss
prima - PRIMA is a package for solving general nonlinear optimization problems without using derivatives. It provides the reference implementation for Powell's derivative-free optimization methods, i.e., COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. PRIMA means Reference Implementation for Powell's methods with Modernization and Amelioration, P for Powell.