Tullio.jl vs TensorOperations.jl

| | Tullio.jl | TensorOperations.jl |
|---|---|---|
| Mentions | 4 | 3 |
| Stars | 583 | 413 |
| Growth | - | - |
| Activity | 5.2 | 8.5 |
| Last Commit | 5 months ago | 12 days ago |
| Language | Julia | Julia |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Tullio.jl
- A basic introduction to NumPy's einsum
- Generic GPU Kernels
- Julia: Faster than Fortran, cleaner than Numpy

Julia ships with OpenBLAS, but in some cases there are pure-Julia "BLAS-like" routines that can be just as fast:
https://github.com/mcabbott/Tullio.jl
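To make the einsum-style notation these posts discuss concrete, here is a minimal Tullio.jl sketch (an illustrative assumption that the package is installed; `@tullio` with `:=` allocates a new output array):

```julia
using Tullio

A = rand(3, 4)
B = rand(4, 5)

# Einstein notation: the repeated index j is summed over,
# so this is an ordinary matrix multiplication C = A * B.
@tullio C[i, k] := A[i, j] * B[j, k]

C ≈ A * B   # the macro-generated loops agree with BLAS
```

The macro writes out (and can multithread) the nested loops implied by the index expression, which is where the "pure-Julia BLAS-like" speed claims come from.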
TensorOperations.jl
- Einsum in 40 Lines of Python
- Absolutely suck at tech stuff, but Julia makes me want to learn coding. Wish me luck.
Sometimes broadcasting feels like magic to me. It just works more often than not even when I am confused with the dimensions. If you do a lot of Tensor stuff it's also worth checking out Einstein notation (https://github.com/Jutho/TensorOperations.jl)
- Programming Languages where element-wise matrix notation is possible
There are some libraries and macros for Einstein notation and related ideas, like TensorOperations.jl in Julia, einsum in numpy which someone already mentioned, and some small-scale/research languages like Diderot and Egison. In the mainstream, I guess languages generally use for loops or list comprehensions and try to recover vectorisation from that after the fact, but don’t guarantee it. Those that do make guarantees tend to use combinators that are matrixwise/function-level. I admit I pretty much categorically prefer the latter so I’m not as aware of the state of this as I’d like to be able to help.
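As a hedged sketch of the Einstein-notation macros mentioned above, here is a small TensorOperations.jl example (assuming the package is installed; `@tensor` contracts every repeated index):

```julia
using TensorOperations
using LinearAlgebra: tr

A = rand(3, 4)
B = rand(4, 5)

# Repeated index j is contracted: this is matrix multiplication.
@tensor C[i, k] := A[i, j] * B[j, k]

# A repeated index on a single tensor yields a scalar: the trace.
M = rand(5, 5)
@tensor t = M[i, i]
```

Unlike broadcasting, the index letters state explicitly which dimensions pair up, which is why this style "just works" even when the dimension bookkeeping is confusing.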
What are some alternatives?
Zygote.jl - 21st century AD
Halide - a language for fast, portable data-parallel computation
CUDA.jl - CUDA programming in Julia.
NDTensors.jl - A Julia package for n-dimensional sparse tensors.
ForwardDiff.jl - Forward Mode Automatic Differentiation for Julia
Grassmann.jl - ⟨Grassmann-Clifford-Hodge⟩ multilinear differential geometric algebra
JuliaInterpreter.jl - Interpreter for Julia code
TensorFlock - A small functional tensor language with Einstein summation notation convention and shape-checking at compile-time.
futhark - A data-parallel functional programming language
ThinkJuliaFR.jl - Introduction to programming in Julia (book, in French)
DaemonMode.jl - Client-Daemon workflow to run faster scripts in Julia
TensorComprehensions - A domain specific language to express machine learning workloads.