MyGrad
Drop-in autodiff for NumPy. (by rsokl)
SmallPebble
Minimal deep learning library written from scratch in Python, using NumPy/CuPy. (by sradc)
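As a quick illustration of MyGrad's "drop-in" claim, a minimal sketch of its tensor → arithmetic → backward → grad workflow (based on the documented interface; exact outputs may differ by version):

```python
# Minimal sketch of MyGrad's drop-in workflow; treat exact values as illustrative.
import mygrad as mg

x = mg.tensor([1.0, 2.0, 3.0])  # behaves like a NumPy array
f = (x ** 2).sum()              # ordinary array arithmetic builds the graph
f.backward()                    # reverse-mode autodiff
print(x.grad)                   # [2. 4. 6.]  (d/dx of sum(x**2) is 2x)
```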
| | MyGrad | SmallPebble |
|---|---|---|
| Mentions | 1 | 6 |
| Stars | 186 | 112 |
| Growth | - | - |
| Activity | 3.0 | 0.0 |
| Latest commit | 25 days ago | over 1 year ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Mentions is the total number of mentions of a project that we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
MyGrad
Posts with mentions or reviews of MyGrad. We have used some of these posts to build our list of alternatives and similar projects.
SmallPebble
Posts with mentions or reviews of SmallPebble. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-08-24.
- Fastest Autograd in the West
You can implement autograd as a library. Just take a look at this
https://github.com/sradc/SmallPebble
The first line of the description is:
> SmallPebble is a minimal automatic differentiation and deep learning library written from scratch in Python, using NumPy/CuPy.
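To make the "autograd as a library" point concrete, here is a minimal sketch of reverse-mode autodiff over scalars in the same spirit; the names (`Scalar`, `add`, `mul`, `get_gradients`) are illustrative, not SmallPebble's actual API:

```python
# Illustrative-only sketch of "autograd as a library": scalar reverse-mode autodiff.
from collections import defaultdict

class Scalar:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent node, local gradient)

def add(a, b):
    return Scalar(a.value + b.value, parents=((a, 1.0), (b, 1.0)))

def mul(a, b):
    return Scalar(a.value * b.value, parents=((a, b.value), (b, a.value)))

def get_gradients(output):
    """Accumulate d(output)/d(node) for every node reachable from output.

    Recursion is fine for small expression trees; a real library would
    traverse the graph in topological order instead.
    """
    grads = defaultdict(float)

    def visit(node, upstream):
        for parent, local_grad in node.parents:
            grads[parent] += upstream * local_grad
            visit(parent, upstream * local_grad)

    visit(output, 1.0)
    return grads

x, y = Scalar(3.0), Scalar(4.0)
z = add(mul(x, y), x)          # z = x*y + x
grads = get_gradients(z)
print(grads[x], grads[y])      # 5.0 3.0  (dz/dx = y + 1, dz/dy = x)
```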
- Compiling ML models to C for fun
Thanks for this. My approach to speeding up an autodiff system like this was to write it in terms of nd-arrays rather than scalars, using numpy/cupy [1]. But it's still slower than deep learning frameworks that compile / fuse operations. Wondering how it compares to the approach in this post. (Might try to benchmark at some point.)
[1] https://github.com/sradc/SmallPebble
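A rough sketch of the nd-array idea described in that comment (hypothetical code, not SmallPebble's): each node wraps a whole NumPy array and stores per-parent backward functions, so gradients are computed with vectorized NumPy calls and the Python overhead is paid per operation rather than per element:

```python
# Hypothetical sketch of reverse-mode autodiff over nd-arrays using NumPy.
import numpy as np

class Var:
    def __init__(self, array, parents=()):
        self.array = np.asarray(array, dtype=float)
        self.parents = parents  # pairs of (parent node, backward function)

def matmul(a, b):
    return Var(a.array @ b.array,
               parents=((a, lambda g: g @ b.array.T),
                        (b, lambda g: a.array.T @ g)))

def relu(a):
    mask = a.array > 0
    return Var(a.array * mask, parents=((a, lambda g: g * mask),))

def backward(out):
    # Simplified traversal: correct for tree-shaped graphs like the one below;
    # graphs with shared nodes would need a topological ordering.
    grads = {id(out): np.ones_like(out.array)}
    stack = [out]
    while stack:
        node = stack.pop()
        g = grads[id(node)]
        for parent, backward_fn in node.parents:
            grads[id(parent)] = grads.get(id(parent), 0) + backward_fn(g)
            stack.append(parent)
    return grads

x = Var(np.random.randn(32, 10))
w = Var(np.random.randn(10, 5))
out = relu(matmul(x, w))
grads = backward(out)
print(grads[id(w)].shape)  # (10, 5)
```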
- Understanding Automatic Differentiation in 30 lines of Python
- [P] SmallPebble - minimal(/toy) deep learning framework written from scratch in Python, using NumPy/CuPy. <700 loc.
Located here: https://github.com/sradc/SmallPebble
- Show HN: I wrote a minimal(/toy) deep learning library from scratch in Python
- SmallPebble – Minimal automatic differentiation implementation in Python, NumPy
What are some alternatives?
When comparing MyGrad and SmallPebble you can also consider the following projects:
pytorch_sparse - PyTorch Extension Library of Optimized Autograd Sparse Matrix Operations
chainer - A flexible framework of neural networks for deep learning
memoized_coduals - Shows that it is possible to implement reverse mode autodiff using a variation on the dual numbers called the codual numbers
Tensor-Puzzles - Solve puzzles. Improve your pytorch.
GPU-Puzzles - Solve puzzles. Learn CUDA.