nni vs optuna
| | nni | optuna |
|---|---|---|
| Mentions | 5 | 34 |
| Stars | 13,726 | 9,640 |
| Growth | 0.9% | 3.4% |
| Activity | 6.7 | 9.9 |
| Latest commit | about 2 months ago | 2 days ago |
| Language | Python | Python |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
nni
- Filter Pruning for PyTorch
- Automated Machine Learning (AutoML) - 9 Different Ways with Microsoft AI
For a complete tutorial, navigate to this Jupyter Notebook: https://github.com/microsoft/nni/blob/master/examples/notebooks/tabular_data_classification_in_AML.ipynb
- [D] Efficient ways of choosing number of layers/neurons in a neural network
optuna, hyperopt, nni, and plenty of lesser-known tools too.
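To make the Optuna route concrete, here is a minimal sketch of a search space with a variable number of layers and per-layer neuron counts. The bounds and the parameter-count stand-in objective are illustrative assumptions; a real objective would train and validate the network.

```python
# Minimal sketch: Optuna searching over network depth and width.
# Bounds and the synthetic objective are assumptions for illustration.
import optuna

def objective(trial):
    n_layers = trial.suggest_int("n_layers", 1, 4)
    # Conditional search space: the number of width parameters
    # depends on the sampled depth.
    units = [
        trial.suggest_int(f"n_units_l{i}", 16, 256, log=True)
        for i in range(n_layers)
    ]
    # Placeholder score: total unit count as a stand-in for a real
    # validation metric computed after training.
    return sum(units)

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```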
- Top 10 Developer Trends, Sun Oct 18 2020
microsoft / nni
optuna
- Optuna – A Hyperparameter Optimization Framework
I didn’t even know WandB did hyperparameter optimization; I figured it was a neural network visualizer based on Two Minute Papers. There didn’t seem to be many alternatives out there to Optuna with TPE + persistence in conditional continuous & discrete spaces.
Anyway, it’s doable to make a multi-objective decide_to_prune function with Optuna; here’s an example: https://github.com/optuna/optuna/issues/3450#issuecomment-19...
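For reference, a sketch of the two features that comment leans on, TPE sampling and persistence (an on-disk SQLite study), combined here with two objectives. The toy objective, study name, and database path are assumptions for illustration; the actual decide_to_prune workaround lives in the linked issue.

```python
# Sketch: multi-objective Optuna study with TPE and SQLite persistence.
# Objective values, study name, and file path are illustrative assumptions.
import optuna

def objective(trial):
    x = trial.suggest_float("x", -5.0, 5.0)
    y = trial.suggest_float("y", -5.0, 5.0)
    accuracy_proxy = -(x - 1.0) ** 2 - (y + 2.0) ** 2  # to maximize
    cost_proxy = abs(x) + abs(y)                       # to minimize
    return accuracy_proxy, cost_proxy

study = optuna.create_study(
    study_name="multi_objective_demo",
    storage="sqlite:///optuna_demo.db",  # persists trials across runs
    load_if_exists=True,
    directions=["maximize", "minimize"],
    # TPESampler handles multiple objectives in recent Optuna releases;
    # older versions used a separate MOTPESampler.
    sampler=optuna.samplers.TPESampler(seed=42),
)
study.optimize(objective, n_trials=30)
for t in study.best_trials:  # Pareto-optimal trials
    print(t.values, t.params)
```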
- How to test optimal parameters
- FOSS hyperparameter optimization framework to automate hyperparameter search
- How did you make that?!
The network configuration process is usually not particularly scientific and mostly relies on empirical observation. In some cases, tools like Optuna can be used to automatically find optimal parameters. In others, you can look for modern studies that explore the effect of a given parameter on performance, such as this study (2022), but these are typically very specific to one particular architecture.
- [P] We are building a curated list of open source tooling for data-centric AI workflows, looking for contributions.
Keras Tuner, Optuna: https://github.com/optuna/optuna ?
- How to tune more than 2 hyperparameters in Grid Search in Python?
- Suggestion to optimize algo
I have used OpenTuner, but I don't think it is maintained anymore. I hear that Optuna is what to use now, but I have not used it myself: https://optuna.org (Optuna - A hyperparameter optimization framework)
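For anyone in the same position, the Optuna quickstart pattern is short. Below is a sketch with a toy quadratic standing in for whatever metric the algorithm being tuned actually exposes.

```python
# Minimal Optuna quickstart sketch; the quadratic is a toy stand-in
# for the algorithm's real runtime or error metric.
import optuna

def objective(trial):
    x = trial.suggest_float("x", -10.0, 10.0)
    return (x - 2.0) ** 2  # pretend this measures the algorithm's quality

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=100)
print(study.best_params)  # should land close to {"x": 2.0}
```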
- Best practices for training PyTorch model
Research the type of model to get an idea of what hyperparameters to use. I recommend using a hyperparameter optimization library like Optuna to find the best configuration.
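A sketch of what that looks like in practice: suggest hyperparameters, report an intermediate score each epoch, and let a pruner stop weak trials early. The synthetic score below is a placeholder for a real PyTorch train/evaluate loop, and the specific bounds and trial counts are assumptions.

```python
# Sketch: Optuna with early pruning of unpromising training runs.
# The score formula is a synthetic placeholder for real validation accuracy.
import math
import optuna

def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    batch_size = trial.suggest_categorical("batch_size", [32, 64, 128])  # illustrative
    score = 0.0
    for epoch in range(10):
        # ... train one epoch with lr / batch_size, then evaluate ...
        # Synthetic stand-in: improves each epoch, peaks near lr = 1e-3.
        score = (1.0 - abs(math.log10(lr) + 3.0) / 4.0) * (epoch + 1) / 10.0
        trial.report(score, step=epoch)
        if trial.should_prune():  # pruner judged this trial unpromising
            raise optuna.TrialPruned()
    return score

study = optuna.create_study(direction="maximize",
                            pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=25)
print(study.best_params)
```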
- [D] How to optimize an ANN?
You can use Optuna, SMAC, or hyperopt.
What are some alternatives?
FLAML - A fast library for AutoML and tuning. Join our Discord: https://discord.gg/Cppx2vSPVP.
Ray - Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
autogluon - AutoGluon: Fast and Accurate ML in 3 Lines of Code
hyperopt - Distributed Asynchronous Hyperparameter Optimization in Python
AutoML - This is a collection of our NAS and Vision Transformer work. [Moved to: https://github.com/microsoft/Cream]
rl-baselines3-zoo - A training framework for Stable Baselines3 reinforcement learning agents, with hyperparameter optimization and pre-trained agents included.
mljar-supervised - Python package for AutoML on Tabular Data with Feature Engineering, Hyper-Parameters Tuning, Explanations and Automatic Documentation
archai - Accelerate your Neural Architecture Search (NAS) through fast, reproducible and modular research.
pyGAM - [HELP REQUESTED] Generalized Additive Models in Python
automlbenchmark - OpenML AutoML Benchmarking Framework
pg_plan_advsr - PostgreSQL extension for automated execution plan tuning