AD-Rosetta-Stone vs SmallPebble

| | AD-Rosetta-Stone | SmallPebble |
|---|---|---|
| Mentions | 2 | 6 |
| Stars | 26 | 112 |
| Growth | - | - |
| Activity | 10.0 | 0.0 |
| Last Commit | almost 6 years ago | over 1 year ago |
| Language | Scala | Python |
| License | GNU General Public License v3.0 only | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
AD-Rosetta-Stone
- Understanding Automatic Differentiation in 30 lines of Python
  [1] https://github.com/qobi/AD-Rosetta-Stone/
- Autodidax: Jax Core from Scratch (In Python)
  I find the solutions from https://github.com/qobi/AD-Rosetta-Stone/ to be very helpful, particularly for representing forward and backward mode automatic differentiation using a functional approach. I used this code as inspiration for a purely functional implementation (without references/pointers) in Mercury: https://github.com/mclements/mercury-ad
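The functional approach mentioned in the comment above is easy to sketch in plain Python: forward-mode derivatives can be carried as (value, tangent) pairs through ordinary functions, with no mutable tape, references, or pointers. The sketch below only illustrates that style under invented names (`d_var`, `d_mul`, `derivative`); it is not code from AD-Rosetta-Stone or mercury-ad.

```python
import math

# A "dual number" is just a (value, tangent) pair. Every primitive returns a new
# pair, so differentiation stays purely functional: no tape, no mutation, no pointers.

def d_const(c):
    # Constant: its derivative with respect to the input variable is 0.
    return (c, 0.0)

def d_var(x):
    # The variable we differentiate with respect to: derivative 1.
    return (x, 1.0)

def d_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def d_mul(a, b):
    # Product rule: (uv)' = u'v + uv'.
    return (a[0] * b[0], a[1] * b[0] + a[0] * b[1])

def d_sin(a):
    # Chain rule through sin.
    return (math.sin(a[0]), math.cos(a[0]) * a[1])

def derivative(f, x):
    """Forward-mode derivative of a function built from the d_* primitives."""
    _, tangent = f(d_var(x))
    return tangent

# f(x) = x * sin(x) + 2, so f'(x) = sin(x) + x * cos(x).
f = lambda x: d_add(d_mul(x, d_sin(x)), d_const(2.0))
print(derivative(f, 1.5))                    # forward-mode result
print(math.sin(1.5) + 1.5 * math.cos(1.5))   # analytic check, should match
```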
SmallPebble
- Fastest Autograd in the West
  You can implement autograd as a library. Just take a look at this: https://github.com/sradc/SmallPebble. The first line of the description is:
  > SmallPebble is a minimal automatic differentiation and deep learning library written from scratch in Python, using NumPy/CuPy.
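To illustrate the "autograd as a library" point, here is a minimal reverse-mode sketch in the same spirit. The `Var` class and its methods are hypothetical names for this example, not SmallPebble's actual API: operator overloading records each operation's inputs and local derivatives, and `backward` walks the recorded graph in reverse.

```python
class Var:
    """Scalar value with reverse-mode autodiff, done entirely in library code."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # tuples of (parent Var, local derivative)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.value * other.value, ((self, other.value), (other, self.value)))

    def backward(self):
        # Order nodes so each node is processed before its parents, then
        # push gradients backwards with the chain rule.
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for parent, _ in v.parents:
                    visit(parent)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for parent, local in v.parents:
                parent.grad += local * v.grad

# y = x1 * x2 + x1, so dy/dx1 = x2 + 1 and dy/dx2 = x1.
x1, x2 = Var(2.0), Var(3.0)
y = x1 * x2 + x1
y.backward()
print(y.value, x1.grad, x2.grad)   # 8.0 4.0 2.0
```

Everything here is ordinary Python with operator overloading, which is why a system like this can ship as a small library rather than a compiler or framework extension.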
- Compiling ML models to C for fun
  Thanks for this. My approach to speeding up an autodiff system like this was to write it in terms of nd-arrays rather than scalars, using numpy/cupy [1]. But it's still slower than deep learning frameworks that compile / fuse operations. Wondering how it compares to the approach in this post. (Might try to benchmark at some point.)
  [1] https://github.com/sradc/SmallPebble
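The nd-array point above is that each recorded operation, and its gradient rule, covers a whole array (one matrix multiply, one broadcasted add) rather than thousands of scalar nodes, so the Python-level graph stays small and NumPy/CuPy does the numeric work. A rough sketch of that idea, using invented names (`Tensor`, `matmul`, `backward`) rather than SmallPebble's real API:

```python
import numpy as np

class Tensor:
    """Array-valued node: one graph node per array operation, not per scalar."""
    def __init__(self, value):
        self.value = value                 # an ndarray
        self.parents = ()                  # (parent, fn: upstream grad -> parent grad)
        self.grad = np.zeros_like(value)

def matmul(a, b):
    out = Tensor(a.value @ b.value)
    out.parents = (
        (a, lambda g: g @ b.value.T),      # dL/da = dL/dout @ b^T, one array op
        (b, lambda g: a.value.T @ g),      # dL/db = a^T @ dL/dout
    )
    return out

def backward(out):
    # Simple traversal; sufficient for the tree-shaped graph in this example.
    out.grad = np.ones_like(out.value)
    stack = [out]
    while stack:
        node = stack.pop()
        for parent, fn in node.parents:
            parent.grad = parent.grad + fn(node.grad)
            stack.append(parent)

# One (100, 200) @ (200, 50) product is a single node here; a scalar-level
# autodiff system would record on the order of a million nodes for the same result.
a = Tensor(np.random.randn(100, 200))
b = Tensor(np.random.randn(200, 50))
y = matmul(a, b)
backward(y)
print(a.grad.shape, b.grad.shape)   # (100, 200) (200, 50)
```

The traversal here assumes the example's simple graph; handling shared nodes, broadcasting, and many more operations is part of what a real library adds on top of this pattern.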
- Understanding Automatic Differentiation in 30 lines of Python
- [P] SmallPebble - minimal(/toy) deep learning framework written from scratch in Python, using NumPy/CuPy. <700 loc.
  Located here: https://github.com/sradc/SmallPebble
- Show HN: I wrote a minimal(/toy) deep learning library from scratch in Python
- SmallPebble – Minimal automatic differentiation implementation in Python, NumPy
What are some alternatives?
mercury-ad - Mercury library for automatic differentiation
MyGrad - Drop-in autodiff for NumPy.
autograd - Efficiently computes derivatives of numpy code.
chainer - A flexible framework of neural networks for deep learning
autodidact - A pedagogical implementation of Autograd
memoized_coduals - Shows that it is possible to implement reverse mode autodiff using a variation on the dual numbers called the codual numbers
Tensor-Puzzles - Solve puzzles. Improve your pytorch.
owl - Owl - OCaml Scientific Computing @ https://ocaml.xyz
GPU-Puzzles - Solve puzzles. Learn CUDA.