Tensor-Puzzles
mercury-ad
| | Tensor-Puzzles | mercury-ad |
|---|---|---|
| Mentions | 9 | 2 |
| Stars | 2,481 | 6 |
| Growth | - | - |
| Activity | 4.4 | 10.0 |
| Last commit | about 2 months ago | over 1 year ago |
| Language | Jupyter Notebook | Mercury |
| License | MIT License | GNU General Public License v3.0 only |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Tensor-Puzzles
- Tensor Puzzles
- Understanding Automatic Differentiation in 30 lines of Python
- PyTorch Broadcasting Puzzles
- Srush/Tensor-Puzzles: Solve puzzles. Improve your PyTorch
- [D] What are some resources to brush up on my PyTorch skills?
  "Probably not a 1:1 to a koan but these are some neat PyTorch puzzles: https://github.com/srush/Tensor-Puzzles"
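To give a flavor of what the puzzles linked above ask for: each one has you reimplement a standard tensor operation using only indexing, `arange`, and broadcasting, with no loops. A minimal sketch of that style (the puzzles themselves use PyTorch; NumPy is used here because its broadcasting rules are the same, and these particular functions are illustrative, not the repository's own solutions):

```python
import numpy as np

def outer(a, b):
    # a[:, None] has shape (len(a), 1); broadcasting it against b[None, :]
    # (shape (1, len(b))) fills out the full outer-product table with no loop.
    return a[:, None] * b[None, :]

def eye(n):
    # Compare a column of indices against a row of indices: the comparison
    # broadcasts to an (n, n) boolean grid that is True only on the diagonal.
    i = np.arange(n)
    return (i[:, None] == i[None, :]).astype(int)

print(outer(np.array([1, 2, 3]), np.array([4, 5])))
print(eye(3))
```

The constraint of expressing everything through broadcasting, rather than explicit iteration, is what makes the puzzles useful practice for idiomatic PyTorch.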
mercury-ad
- Understanding Automatic Differentiation in 30 lines of Python
  "I wrote a purely functional AD library in Mercury [0], which adapts a general approach from [1]. I believe that Owl provides a similar approach [2]. [0] https://github.com/mclements/mercury-ad"
- Autodidax: Jax Core from Scratch (In Python)
  "I find the solutions from https://github.com/qobi/AD-Rosetta-Stone/ to be very helpful, particularly for representing forward and backward mode automatic differentiation using a functional approach. I used this code as inspiration for a functional-only implementation (without references/pointers) in Mercury: https://github.com/mclements/mercury-ad"
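The idea the commenter describes, forward-mode AD in a purely functional style without mutable references or pointers, can be sketched in a few lines of Python using dual numbers (an illustration of the general technique only, not mercury-ad's actual API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Dual:
    # Carry a value and its derivative together; frozen=True makes the
    # pair immutable, so all computation stays purely functional.
    val: float
    der: float

def add(x, y):
    # Derivatives add componentwise: (f + g)' = f' + g'
    return Dual(x.val + y.val, x.der + y.der)

def mul(x, y):
    # Product rule: (f * g)' = f' * g + f * g'
    return Dual(x.val * y.val, x.der * y.val + x.val * y.der)

def deriv(f, x):
    # Seed the tangent with 1.0 at the point of interest and read off
    # the derivative that propagated through f.
    return f(Dual(x, 1.0)).der

# d/dx (x * x + x) at x = 3 is 2*3 + 1 = 7
print(deriv(lambda x: add(mul(x, x), x), 3.0))
```

Reverse mode requires more machinery (a recorded computation graph or continuation-passing), which is where the AD-Rosetta-Stone examples cited above come in.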
What are some alternatives?
- GPU-Puzzles - Solve puzzles. Learn CUDA.
- picoGPT - An unnecessarily tiny implementation of GPT-2 in NumPy.
- AD-Rosetta-Stone - Examples of Automatic Differentiation (AD) in many different languages and systems
- owl - Owl - OCaml Scientific Computing @ https://ocaml.xyz
- autodidact - A pedagogical implementation of Autograd
- SmallPebble - Minimal deep learning library written from scratch in Python, using NumPy/CuPy.
- autograd - Efficiently computes derivatives of numpy code.
- micrograd - A tiny scalar-valued autograd engine and a neural net library on top of it with PyTorch-like API