| | neptune-contrib | lightning-hydra-template |
| --- | --- | --- |
| Last commit | 12 months ago | 3 days ago |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
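The activity metric described above combines two ideas: a recency-weighted commit count, and a percentile rank across all tracked projects (so 9.0 means top 10%). The exact weighting scheme is not specified on this page; the sketch below assumes an exponential decay with a hypothetical 30-day half-life, purely for illustration.

```python
from datetime import datetime, timedelta

def activity_score(commit_dates, now, half_life_days=30.0):
    """Recency-weighted commit count: each commit contributes
    0.5 ** (age_in_days / half_life_days), so a commit made today
    counts 1.0 and one made a half-life ago counts 0.5.
    (The half-life value is an assumption, not the site's formula.)"""
    score = 0.0
    for d in commit_dates:
        age_days = (now - d).days
        score += 0.5 ** (age_days / half_life_days)
    return score

def activity_rating(score, all_scores):
    """Map a project's score to a 0-10 rating by percentile rank,
    so a rating of 9.0 corresponds to beating ~90% of tracked projects."""
    below = sum(1 for s in all_scores if s < score)
    return 10.0 * below / len(all_scores)
```

For example, a project with two commits (one today, one 30 days ago) scores 1.0 + 0.5 = 1.5 under these assumptions, and a score higher than 90% of tracked projects maps to a rating of 9.0.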
We haven't tracked posts mentioning neptune-contrib yet.
Tracking mentions began in Dec 2020.
[D] Any research specific PyTorch based boilerplate code?
2 projects | reddit.com/r/MachineLearning | 20 Jun 2022
This lightning + hydra template is quite complete. Great for learning best practices.
What are some alternatives?
lightning-hydra-template - Deep learning project template with best practices, built on PyTorch Lightning, Hydra, and TensorBoard
pytorch_tempest - My repo for training neural nets using pytorch-lightning and hydra
lightning-transformers - Flexible components pairing 🤗 Transformers with PyTorch Lightning
traingenerator - 🧙 A web app to generate template code for machine learning
tensorflow - An Open Source Machine Learning Framework for Everyone
scikit-learn - scikit-learn: machine learning in Python
neptune-client - :ledger: Experiment tracking tool and model registry
xgboost - Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
Keras - Deep Learning for humans
gym - A toolkit for developing and comparing reinforcement learning algorithms.
MLflow - Open source platform for the machine learning lifecycle
H2O - H2O is an Open Source, Distributed, Fast & Scalable Machine Learning Platform: Deep Learning, Gradient Boosting (GBM) & XGBoost, Random Forest, Generalized Linear Modeling (GLM with Elastic Net), K-Means, PCA, Generalized Additive Models (GAM), RuleFit, Support Vector Machine (SVM), Stacked Ensembles, Automatic Machine Learning (AutoML), etc.