Top 14 Autodiff Open-Source Projects
-
burn
Burn is a new comprehensive dynamic Deep Learning Framework built using Rust with extreme flexibility, compute efficiency and portability as its primary goals.
-
uncertainties
Transparent calculations with uncertainties on the quantities involved (aka "error propagation"); calculation of derivatives.
-
exprgrad
An experimental deep learning framework for Nim based on a differentiable array programming language
-
AutoDiff
A .NET library that provides fast, accurate and automatic differentiation (computes derivative / gradient) of mathematical functions. (by alexshtf)
-
Mission : Impossible (AutoDiff)
A concise C++17 implementation of automatic differentiation (operator overloading)
-
memoized_coduals
Shows that it is possible to implement reverse mode autodiff using a variation on the dual numbers called the codual numbers
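The memoized_coduals entry above describes a variation on dual numbers. For orientation, here is a minimal sketch of the classic dual-number idea (which that project varies): a dual number carries a value and a derivative, and operator overloading propagates derivatives through ordinary arithmetic. This is an illustrative example, not code from any listed project.

```python
# Minimal forward-mode autodiff with dual numbers (illustrative sketch).
# A dual number a + b*eps carries a value `a` and a derivative `b`,
# with eps^2 = 0; overloaded arithmetic propagates derivatives.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value   # function value
        self.deriv = deriv   # derivative w.r.t. the seeded input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate df/dx at x by seeding the dual part with 1."""
    return f(Dual(x, 1.0)).deriv


# f(x) = 3x^2 + 2x, so f'(x) = 6x + 2 and f'(4) = 26
print(derivative(lambda x: 3 * x * x + 2 * x, 4.0))  # 26.0
```

Reverse mode (what "autodiff" usually means in deep learning frameworks like Burn) instead records a computation graph and propagates gradients backwards; the codual-number trick is one way to get there from this dual-number starting point.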
Project mention: 3 years of fulltime Rust game development, and why we're leaving Rust behind | news.ycombinator.com | 2024-04-26

> You can use libtorch directly via `tch-rs`, and at present I'm porting over to Burn (see https://burn.dev), which appears incredibly promising. My impression is it's in a good place, if of course not close to the ecosystem of Python/C++. At the very least I've gotten my nn models training and running without too much difficulty. (I'm moving to Burn for the thread safety - their `Tensor` impl is `Sync` - libtorch doesn't have such a guarantee.)
Burn has Candle as one of its backends, which I understand is also quite popular.
You can implement autograd as a library. Just take a look at this: https://github.com/sradc/SmallPebble
The first line of the description is:
> SmallPebble is a minimal automatic differentiation and deep learning library written from scratch in Python, using NumPy/CuPy.
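To illustrate the point that autograd can be implemented as an ordinary library, here is a minimal scalar reverse-mode sketch in plain Python. This is not SmallPebble's API; the `Var` class and its methods are hypothetical names for illustration.

```python
# Minimal scalar reverse-mode autodiff (illustrative sketch, not SmallPebble).
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent Var, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(ab)/da = b, d(ab)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, grad=1.0):
        # Accumulate gradients along every path via the chain rule.
        self.grad += grad
        for parent, local in self.parents:
            parent.backward(grad * local)


x = Var(3.0)
y = Var(4.0)
z = x * y + x  # z = xy + x, so dz/dx = y + 1 = 5 and dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

The recursive `backward` here revisits shared subgraphs once per path, which is correct but inefficient; real libraries (SmallPebble included) topologically sort the graph and visit each node once, which is also where tricks like memoization in memoized_coduals come in.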
Project mention: Tiny-autodiff: A tiny autograd library made for educational purposes in D | news.ycombinator.com | 2024-04-12
Autodiff related posts
-
Burn: Deep Learning Framework built using Rust
-
Transitioning From PyTorch to Burn
-
Burn Deep Learning Framework Release 0.12.0 Improved API and PyTorch Integration
-
Supercharge Web AI Model Testing: WebGPU, WebGL, and Headless Chrome
-
Fastest Autograd in the West
-
Burn Deep Learning Framework 0.11.0 Released: Just-in-Time Automatic Kernel Fusion & Founding Announcement
-
Burn Deep Learning Framework v0.11.0 Released: Just-in-Time Kernel Fusion
-
Index
What are some of the best open-source Autodiff projects? This list will help you:
| # | Project | Stars |
|---|---------|---|
| 1 | burn | 7,074 |
| 2 | dfdx | 1,611 |
| 3 | autodiff | 1,534 |
| 4 | DiffSharp | 574 |
| 5 | uncertainties | 538 |
| 6 | MyGrad | 186 |
| 7 | cl-waffe2 | 116 |
| 8 | exprgrad | 113 |
| 9 | SmallPebble | 112 |
| 10 | FastAD | 91 |
| 11 | AutoDiff | 85 |
| 12 | Mission : Impossible (AutoDiff) | 20 |
| 13 | tiny-autodiff | 6 |
| 14 | memoized_coduals | 3 |