Hyperactive vs optuna-examples

| | Hyperactive | optuna-examples |
|---|---|---|
| Mentions | 8 | 2 |
| Stars | 490 | 596 |
| Growth | - | 3.5% |
| Activity | 7.7 | 8.7 |
| Latest commit | 5 months ago | 6 days ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
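The page doesn't publish the exact activity formula, only that recent commits carry more weight. As a minimal sketch of how such a recency-weighted score could work, here is an assumed exponential-decay weighting; the `half_life_days` parameter and the commit ages are hypothetical:

```python
def activity_score(commit_ages_days, half_life_days=30.0):
    """Weight each commit by exponential decay of its age in days,
    so recent commits contribute more than older ones."""
    return sum(0.5 ** (age / half_life_days) for age in commit_ages_days)

# A project with three recent commits scores higher than one with
# three old commits, even though the commit counts are equal.
recent = activity_score([1, 3, 7])
old = activity_score([100, 120, 150])
```

Any monotonically decaying weight would give the same qualitative ranking; the half-life just controls how quickly old commits stop counting.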
Hyperactive
- Hyperactive Version 4.5 Released
- Hyperactive: An optimization and data collection toolbox for AutoML
- Hyperactive: Optimize computationally expensive models with powerful algorithms
- Show HN: Hyperactive – A highly versatile AutoML Toolbox
- Hyperactive – Easy Neural Architecture Search for Deep Learning in Python
Check out the Neural Architecture Search Tutorial here: https://nbviewer.jupyter.org/github/SimonBlanke/hyperactive-...
Neural Architecture Search is just one of many optimization applications you can work on with Hyperactive. Check out the examples in the official GitHub repository: https://github.com/SimonBlanke/Hyperactive/tree/master/examp...
- Gradient-Free-Optimizers – A collection of modern optimization methods in Python
Gradient-Free-Optimizers is a lightweight optimization package that serves as a backend for Hyperactive: https://github.com/SimonBlanke/Hyperactive
Hyperactive supports parallel computing via multiprocessing, joblib, or a custom wrapper function.
optuna-examples
- [D] How to optimize an ANN?
Check out the examples for Optuna, a popular hyperparameter tuning package. It has examples for most popular ML frameworks, including XGBoost, so you can see how it compares to an ANN framework like Keras or PyTorch.
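Optuna's samplers are far more sophisticated, but as a self-contained illustration of what a hyperparameter search loop does, here is a plain random search over a made-up learning-rate parameter; the objective, the `lr` parameter, and its range are all assumptions for the example:

```python
import random

def objective(params):
    # Toy stand-in for a model's validation score,
    # peaking at a learning rate of 0.1.
    return -(params["lr"] - 0.1) ** 2

def random_search(n_trials, seed=0):
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        # Sample a candidate configuration at random.
        params = {"lr": rng.uniform(0.001, 1.0)}
        score = objective(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best_params, best_score = random_search(100)
```

A library like Optuna replaces the random sampling with smarter strategies (e.g. TPE) and adds pruning, storage, and parallelism, but the trial loop has this same shape.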
- Data Scientists are dying out
That's still regular ML because you are in charge of the features. Optuna might make your life easier though: https://github.com/optuna/optuna-examples/blob/main/xgboost/xgboost_simple.py
What are some alternatives?
mango - Parallel Hyperparameter Tuning in Python
tqdm - :zap: A Fast, Extensible Progress Bar for Python and CLI
pybobyqa - Python-based Derivative-Free Optimization with Bound Constraints
optuna - A hyperparameter optimization framework
opytimizer - 🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
hyperopt - Distributed Asynchronous Hyperparameter Optimization in Python
OpenMetadata - Open Standard for Metadata. A Single place to Discover, Collaborate and Get your data right.
SMAC3 - SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization
optimization-tutorial - Tutorials for the optimization techniques used in Gradient-Free-Optimizers and Hyperactive.
nni - An open source AutoML toolkit for automate machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
anovos - An open-source library for scalable feature engineering using Apache Spark
Ray - Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.