| | corgi | DiffSharp |
|---|---|---|
| Mentions | 2 | 2 |
| Stars | 23 | 573 |
| Growth | - | 0.9% |
| Activity | 0.0 | 4.6 |
| Last commit | over 2 years ago | 16 days ago |
| Language | Rust | F# |
| License | MIT License | BSD 2-clause "Simplified" License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
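The legend above says only that recent commits are weighted more heavily than older ones; the tracker's actual formula is not published. As a hedged illustration of what such a recency-weighted score could look like, the sketch below sums per-commit weights that decay exponentially with age (the half-life parameter is an assumption, not the site's):

```rust
// Illustrative recency-weighted activity score: each commit contributes a
// weight that halves every `half_life_weeks` weeks. This is only one
// plausible reading of "recent commits have higher weight", not the
// tracker's actual formula.
fn activity_score(commit_ages_weeks: &[f64], half_life_weeks: f64) -> f64 {
    commit_ages_weeks
        .iter()
        .map(|age| 0.5f64.powf(age / half_life_weeks))
        .sum()
}

fn main() {
    // Ten commits from the last ten weeks vs. ten commits from a year ago:
    // under any decay, the recent batch scores higher.
    let recent: Vec<f64> = (0..10).map(|w| w as f64).collect();
    let old: Vec<f64> = (0..10).map(|w| 52.0 + w as f64).collect();
    println!("recent = {:.2}", activity_score(&recent, 12.0));
    println!("old    = {:.2}", activity_score(&old, 12.0));
}
```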
corgi

Mentions:
- Corgi: Rust neural network/dynamic automatic differentiation library I have been working on
  - Fully-connected neural network: https://github.com/patricksongzy/corgi/blob/main/src/dense.rs
- [P] Corgi: Rust neural network/dynamic automatic differentiation library I have been working on
  - GitHub: https://github.com/patricksongzy/corgi
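Corgi's headline feature is dynamic (define-by-run) reverse-mode automatic differentiation in Rust. The sketch below shows the core technique with a minimal tape: forward operations are recorded, then a backward pass propagates gradients in reverse. The `Tape`/`Op` names and API here are illustrative only, not corgi's actual interface.

```rust
// Minimal tape-based reverse-mode autodiff sketch (illustrative, not
// corgi's API). Each operation records its inputs; the backward pass
// walks the tape in reverse, accumulating gradients via the chain rule.
#[derive(Clone, Copy)]
enum Op {
    Leaf,
    Add(usize, usize),
    Mul(usize, usize),
}

struct Tape {
    values: Vec<f64>,
    ops: Vec<Op>,
}

impl Tape {
    fn new() -> Self {
        Tape { values: Vec::new(), ops: Vec::new() }
    }

    fn leaf(&mut self, v: f64) -> usize {
        self.values.push(v);
        self.ops.push(Op::Leaf);
        self.values.len() - 1
    }

    fn add(&mut self, a: usize, b: usize) -> usize {
        self.values.push(self.values[a] + self.values[b]);
        self.ops.push(Op::Add(a, b));
        self.values.len() - 1
    }

    fn mul(&mut self, a: usize, b: usize) -> usize {
        self.values.push(self.values[a] * self.values[b]);
        self.ops.push(Op::Mul(a, b));
        self.values.len() - 1
    }

    // Reverse pass: seed the output gradient with 1.0, then walk the
    // tape backwards, pushing each node's gradient onto its inputs.
    fn grad(&self, output: usize) -> Vec<f64> {
        let mut grads = vec![0.0; self.values.len()];
        grads[output] = 1.0;
        for i in (0..self.ops.len()).rev() {
            match self.ops[i] {
                Op::Leaf => {}
                Op::Add(a, b) => {
                    grads[a] += grads[i];
                    grads[b] += grads[i];
                }
                Op::Mul(a, b) => {
                    grads[a] += grads[i] * self.values[b];
                    grads[b] += grads[i] * self.values[a];
                }
            }
        }
        grads
    }
}

fn main() {
    let mut t = Tape::new();
    let x = t.leaf(3.0);
    let y = t.leaf(4.0);
    // z = x * y + x, so dz/dx = y + 1 and dz/dy = x.
    let xy = t.mul(x, y);
    let z = t.add(xy, x);
    let g = t.grad(z);
    println!("dz/dx = {}, dz/dy = {}", g[x], g[y]); // 5 and 3
}
```

Because the graph is built as operations execute, control flow (loops, branches) in ordinary Rust code is differentiated naturally; this is the "dynamic" half of the project's description.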
DiffSharp

Mentions:
- Automatic differentiation in a lot more than 38 lines of F#
- Neural Networks Fsharp
  - Yes. You can use TensorFlow.NET or DiffSharp.
What are some alternatives?
- Java-Machine-Learning - Deep learning library for Java, with fully connected, convolutional, and recurrent layers. Also features many gradient descent optimization algorithms.
- TensorFlow.NET - .NET Standard bindings for Google's TensorFlow for developing, training and deploying Machine Learning models in C# and F#.
- autograd-rs - An autograd implementation in Rust
- dfdx - Deep learning in Rust, with shape-checked tensors and neural networks
- burn - Burn is a new comprehensive dynamic Deep Learning Framework built using Rust with extreme flexibility, compute efficiency and portability as its primary goals.
- PyTorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
- ocaml-torch - OCaml bindings for PyTorch
- hyperlearn - 2-2000x faster ML algos, 50% less memory usage, works on all hardware - new and old.