mlpack vs autodiff

| | mlpack | autodiff |
|---|---|---|
| Mentions | 4 | 7 |
| Stars | 4,742 | 1,517 |
| Star growth | 1.6% | 3.4% |
| Activity | 9.9 | 7.5 |
| Latest commit | 4 days ago | 8 days ago |
| Language | C++ | C++ |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
mlpack
- What is the most used library for AI in C++?
mlpack is a great library for machine learning in C++. It's very fast and doesn't have too steep a learning curve.
- Ensmallen: A C++ Library for Efficient Numerical Optimization
This toolkit was originally part of the mlpack machine learning library (https://github.com/mlpack/mlpack) before it was split out into a separate, standalone effort.
- Top 10 Python Libraries for Machine Learning
GitHub repository: https://github.com/mlpack/mlpack. Developed by: the community, supported by the Georgia Institute of Technology. Primary purpose: multiple ML models and algorithms.
autodiff
- The Elements of Differentiable Programming
- Astray: A performance-portable geodesic ray tracing library
I completely agree. Specifying the metric rather than the Christoffel symbols would make it much easier for the users. Something like https://github.com/autodiff/autodiff might just work as the metric tensor is made up of primitives.
- Gradients Without Backpropagation
Forward-mode differentiation is easy to implement in C++ with templates, operator overloading, and dual numbers (https://en.wikipedia.org/wiki/Automatic_differentiation#Auto...). Some libraries such as autodiff (https://github.com/autodiff/autodiff) and CppAD (https://github.com/coin-or/CppAD) use this method.
- Ensmallen: A C++ Library for Efficient Numerical Optimization
- I am creating a fast, header-only, C++ library for control algorithms
I was thinking of adding [autodiff](https://github.com/autodiff/autodiff) in the future, mainly because it works seamlessly with *Eigen*. One big advantage would be that I could use it for AD for NonlinearSystems as well.
What are some alternatives?
tensorflow - An Open Source Machine Learning Framework for Everyone
Dlib - A toolkit for making real world machine learning and data analysis applications in C++
SHOGUN - Shōgun
xgboost - Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
Caffe - Caffe: a fast open framework for deep learning.
mxnet - Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Scala, Go, Javascript and more
examples - TensorFlow examples
CppAD - A C++ Algorithmic Differentiation Package: Home Page
NN++ - A small and easy to use neural net implementation for C++. Just download and #include!
RNNLIB - RNNLIB is a recurrent neural network library for sequence learning problems. Forked from Alex Graves work http://sourceforge.net/projects/rnnl/
FastAD - FastAD is a C++ implementation of automatic differentiation both forward and reverse mode.
Fido - A lightweight C++ machine learning library for embedded electronics and robotics.