| | spaceopt | optuna-examples |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 42 | 601 |
| Growth | - | 4.3% |
| Activity | 3.6 | 8.7 |
| Last Commit | almost 2 years ago | 6 days ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
spaceopt
[D] What kind of Hyperparameter Optimisation do you use?
I had exactly the same problem: how do I naturally integrate human expertise or manual interventions into an automatic hyperparameter optimization process? I ended up writing my own algorithm to achieve that: https://github.com/ar-nowaczynski/spaceopt
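spaceopt's own API isn't shown in this thread, but the human-in-the-loop idea it describes can be sketched with the standard library alone: queue up a few expert-chosen configurations, then let random sampling take over once they are exhausted. All names and values below are hypothetical, not spaceopt's interface.

```python
import random

random.seed(0)  # make the sketch reproducible

# Toy objective to minimize (stands in for a real model-training run).
def evaluate(params):
    return (params["lr"] - 0.01) ** 2 + (params["depth"] - 6) ** 2

# Points a human expert wants tried first (hypothetical values).
manual_points = [{"lr": 0.01, "depth": 5}, {"lr": 0.1, "depth": 8}]

def random_point():
    return {"lr": random.choice([0.001, 0.01, 0.1]),
            "depth": random.choice([4, 5, 6, 7, 8])}

history = []
for _ in range(20):
    # Human-supplied points take priority; the rest are sampled automatically.
    point = manual_points.pop(0) if manual_points else random_point()
    history.append((evaluate(point), point))

best_score, best_point = min(history, key=lambda t: t[0])
print(best_score, best_point)
```

The same loop structure would let a practitioner inject new manual points between batches of automatic trials, which is the kind of intervention the comment above is after.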
optuna-examples
[D] How to optimize an ANN?
Check out the examples for Optuna, a popular hyperparameter tuning package. It has examples for most popular ML frameworks, including XGBoost, so you can see how it compares to an ANN framework like Keras or PyTorch.
Data Scientists are dying out
That's still regular ML because you are in charge of the features. Optuna might make your life easier though: https://github.com/optuna/optuna-examples/blob/main/xgboost/xgboost_simple.py
What are some alternatives?
optuna - A hyperparameter optimization framework
tqdm - A Fast, Extensible Progress Bar for Python and CLI
Spearmint - Spearmint Bayesian optimization codebase
Hyperactive - An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
Ray - Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
hyperopt - Distributed Asynchronous Hyperparameter Optimization in Python
nni - An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyperparameter tuning.
SMAC3 - SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization