| | owl | SmallPebble |
|---|---|---|
| Mentions | 5 | 6 |
| Stars | 1,179 | 112 |
| Growth | 0.8% | - |
| Activity | 8.2 | 0.0 |
| Last Commit | 4 days ago | over 1 year ago |
| Language | OCaml | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
owl
- Owl project (OCaml scientific computing) formally concluded
- Understanding Automatic Differentiation in 30 lines of Python
- I Wrote an Activitypub Server in OCaml: Lessons Learnt, Weekends Lost
- Julia 1.6 addresses latency issues
> after some consideration of OCaml, but unfortunately the multi-core story still isn't there yet
Multicore is supposed to land in the release after 4.13, which is the next one.
Regarding scientific computing libraries, there is Owl[1][2], which now has an almost-finished book[3].
[1] https://ocaml.xyz/
[2] https://github.com/owlbarn/owl
[3] https://ocaml.xyz/book/
- A Comparison of Futhark and Dex
The Owl lib for OCaml is pretty interesting
https://github.com/owlbarn/owl
SmallPebble
- Fastest Autograd in the West
You can implement autograd as a library. Take a look at this:
https://github.com/sradc/SmallPebble
The first line of the description is:
> SmallPebble is a minimal automatic differentiation and deep learning library written from scratch in Python, using NumPy/CuPy.
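The point that autograd fits in a small library can be made concrete. Below is a minimal sketch of reverse-mode autodiff in plain Python: build a graph of operations, then sweep it in reverse topological order, accumulating gradients. It is illustrative only and does not reflect SmallPebble's actual API; the `Var` class and `backward` function are invented for this example.

```python
class Var:
    """A scalar value that records how it was computed."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent Var, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(out):
    """Accumulate d(out)/d(node) into node.grad for every node in the graph."""
    topo, visited = [], set()
    def build(v):  # reverse topological order so each grad is complete before use
        if v not in visited:
            visited.add(v)
            for parent, _ in v.parents:
                build(parent)
            topo.append(v)
    build(out)
    out.grad = 1.0
    for node in reversed(topo):
        for parent, local_grad in node.parents:
            parent.grad += node.grad * local_grad

x = Var(3.0)
y = Var(4.0)
z = x * y + x       # z = x*y + x = 15
backward(z)
# dz/dx = y + 1 = 5.0, dz/dy = x = 3.0
```

The topological sort matters: a naive stack-based sweep would propagate a node's gradient before all of its contributions have arrived, double-counting whenever a value is reused.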
- Compiling ML models to C for fun
Thanks for this. My approach to speeding up an autodiff system like this was to write it in terms of nd-arrays rather than scalars, using numpy/cupy [1]. But it's still slower than deep learning frameworks that compile / fuse operations. Wondering how it compares to the approach in this post. (Might try to benchmark at some point.)
[1] https://github.com/sradc/SmallPebble
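The nd-array approach the comment describes can be sketched as follows: each graph node holds a whole NumPy array, and each operation records a vector-Jacobian product, so one node covers many scalar operations at once. This is a toy illustration under assumed names (`Tensor`, `backward`), not SmallPebble's actual code.

```python
import numpy as np

class Tensor:
    """An nd-array value that records vector-Jacobian products for backprop."""
    def __init__(self, value, parents=()):
        self.value = np.asarray(value, dtype=float)
        self.parents = parents  # pairs of (parent Tensor, vjp function)
        self.grad = np.zeros_like(self.value)

    def __mul__(self, other):
        # Elementwise product; the VJPs scale the incoming gradient array.
        return Tensor(self.value * other.value,
                      [(self, lambda g: g * other.value),
                       (other, lambda g: g * self.value)])

    def sum(self):
        # Reduction; the VJP broadcasts the incoming gradient back out.
        return Tensor(self.value.sum(),
                      [(self, lambda g: g * np.ones_like(self.value))])

def backward(out):
    """Reverse-topological sweep, accumulating gradients into .grad."""
    topo, seen = [], set()
    def build(v):
        if id(v) not in seen:
            seen.add(id(v))
            for parent, _ in v.parents:
                build(parent)
            topo.append(v)
    build(out)
    out.grad = np.ones_like(out.value)
    for node in reversed(topo):
        for parent, vjp in node.parents:
            parent.grad = parent.grad + vjp(node.grad)

x = Tensor([1.0, 2.0, 3.0])
y = Tensor([4.0, 5.0, 6.0])
loss = (x * y).sum()   # 4 + 10 + 18 = 32
backward(loss)
# d(loss)/dx = y, d(loss)/dy = x
```

This is why the array-level style is faster than a scalar graph: the graph has three nodes here regardless of array size, and the per-element arithmetic stays inside NumPy. The remaining gap to compiled frameworks is that each operation still materializes its result instead of being fused.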
- Understanding Automatic Differentiation in 30 lines of Python
- [P] SmallPebble - minimal(/toy) deep learning framework written from scratch in Python, using NumPy/CuPy. <700 loc.
Located here: https://github.com/sradc/SmallPebble
- Show HN: I wrote a minimal(/toy) deep learning library from scratch in Python
- SmallPebble – Minimal automatic differentiation implementation in Python, NumPy
What are some alternatives?
Octavian.jl - Multi-threaded BLAS-like library that provides pure Julia matrix multiplication
MyGrad - Drop-in autodiff for NumPy.
Arraymancer - A fast, ergonomic and portable tensor library in Nim with a deep learning focus for CPU, GPU and embedded devices via OpenMP, Cuda and OpenCL backends
chainer - A flexible framework of neural networks for deep learning
Peroxide - Rust numeric library with R, MATLAB & Python syntax
memoized_coduals - Shows that it is possible to implement reverse mode autodiff using a variation on the dual numbers called the codual numbers
symengine.rs - (Unofficial) Rust wrappers to SymEngine, a fast C++ symbolic manipulation library.
Tensor-Puzzles - Solve puzzles. Improve your pytorch.
db-benchmark - reproducible benchmark of database-like ops
GPU-Puzzles - Solve puzzles. Learn CUDA.
micrograd - A tiny scalar-valued autograd engine and a neural net library on top of it with PyTorch-like API
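For contrast with the reverse-mode libraries above, the dual-number idea behind the memoized_coduals entry can be shown in its plain forward-mode form. Each number carries its derivative alongside its value, and arithmetic propagates both at once. This is a generic sketch of forward-mode dual numbers, not that project's codual variant or its code.

```python
class Dual:
    """A value paired with its derivative with respect to one chosen input."""
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        # (a + b)' = a' + b'
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):
        # product rule: (a * b)' = a * b' + a' * b
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

x = Dual(3.0, 1.0)   # seed dx/dx = 1
y = Dual(4.0, 0.0)   # y is treated as a constant here
z = x * y + x * x    # z = x*y + x^2 = 21
# dz/dx = y + 2x = 10
```

Forward mode computes the derivative with respect to one input per pass, which is why reverse mode (and the codual trick) is preferred when there are many inputs and one output, as in deep learning.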