einops vs torchtyping
| Metric | einops | torchtyping |
|---|---|---|
| Mentions | 17 | 7 |
| Stars | 7,809 | 1,328 |
| Growth | - | - |
| Activity | 8.2 | 3.2 |
| Latest commit | about 2 months ago | 9 months ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
einops
-
Yorick is an interpreted programming language for scientific simulations
Thanks for the pointer. I can believe that a language that looks so different will find that different patterns and primitives are natural for it.
My experience from writing a lot of array-based code in NumPy/Matlab is that broadcasting absolutely has made it easier to write my code in those ecosystems. Axes of length 1 have often been in the right places already, or have been easy to insert. It's of course possible to create a big mess in any language; it seems likely that the NumPy code you saw could have been neater too.
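The length-1-axis pattern described above can be sketched in a few lines of NumPy (the shapes here are hypothetical, chosen only to illustrate broadcasting):

```python
import numpy as np

# Hypothetical shapes: a batch of feature vectors and a set of per-head weights.
x = np.ones((8, 1, 5))   # (batch, 1, features) -- length-1 axis ready to broadcast
w = np.ones((1, 3, 5))   # (1, heads, features)

# Broadcasting expands the length-1 axes against each other,
# so no explicit tiling or looping is needed:
y = x * w                # shape (8, 3, 5)
```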
In machine learning there can be many array dimensions floating around: batch-dims, sequence and/or channel-dims, weight matrices, and so on. It can be necessary to expand two or more dimensions, and/or line up dimensions quite carefully. Einops[1] has emerged from that community as a tool to succinctly express many operations that involve lots of array dimensions. You're likely to bump into more and more people who've used it, and again it seems there's some overlap with what Rank does. (And again, you'll see uses of Einops in the wild that are unnecessarily convoluted.)
[1] https://einops.rocks/ -- It works with all of the existing major array-based frameworks for Python (NumPy/PyTorch/Jax/etc), and the emerging array API standard for Python.
-
[D] Have there been any attempts to create a programming language specifically for machine learning?
Einops all the things! https://einops.rocks/
- Delimiter-First Code
-
[D] Any independent researchers ever get published/into conferences?
It depends on what their main purposes are. I know some people who have done amazing work in this field without ever publishing, e.g. https://github.com/lucidrains, https://github.com/rwightman, https://einops.rocks/
-
[D] Anyone using named tensors or a tensor annotation lib productively?
On tsalib's warp: this is very similar to einops. I think it might even be slightly more general. However, I'm honestly not sure to what extent tsalib is still maintained, as it looks like the most recent commit was about two years ago. (Which is a shame.)
-
A basic introduction to NumPy's einsum
Also see Einops: https://github.com/arogozhnikov/einops, which uses an einsum-like notation for various tensor operations used in deep learning.
https://einops.rocks/pytorch-examples.html shows how it can be used to implement various neural network architectures in a simpler manner.
I would like to second that! I converted several of my lab mates to Einops just by having them browse the tutorial at https://einops.rocks/ :)
-
Ask HN: What technologies greatly improve the efficiency of development?
This combined with something like einops [1] (an intuitive reshaping library) can be a huge time saver.
-
[D] What are your favorite tools to visualize/explain tensor operations?
einops: just look at the pretty visual GIF and be amazed
torchtyping
-
[D] Have there been any attempts to create a programming language specifically for machine learning?
Not really an answer to your question, but there are Python packages that try to solve the problem of tensor shapes that you mentioned, e.g. https://github.com/patrick-kidger/torchtyping or https://github.com/deepmind/tensor_annotations
-
What's New in Python 3.11?
I disagree. I've made a serious attempt at array typing using variadic generics and I'm not impressed. Python's type system has numerous issues... and now they just apply to any "ArrayWithNDimensions" type as well as any "ArrayWith2Dimensions" type.
Variadic protocols don't exist; many operations like stacking are inexpressible; the syntax is awful and verbose; etc. etc.
I've written more about this here as part of my TorchTyping project: [0]
[0] https://github.com/patrick-kidger/torchtyping/issues/37#issu...
-
[D] Anyone using named tensors or a tensor annotation lib productively?
FWIW I'm the author of torchtyping so happy to answer any questions about that. :) I think people are using it!
-
[D] Ideal deep learning library
The one thing I really *really* wish got more attention was named tensors and the tensor type system. Tensor misalignment errors are a constant source of silently-failing bugs. While 3rd party libraries have attempted to fill this gap, it really needs better native support. In particular it seems like bad form to me for programmers to have to remember the specific alignment and broadcasting rules, and then have to apply them to an often poorly documented order of tensor indices. I'd really like to see something like tsalib's warp operator made part of the main library and generalized to arbitrary function application, like a named-tensor version of fold. But preferably using notation closer to that of torchtyping.
-
[P] torchtyping -- documentation + runtime type checking of tensor shapes (and dtypes, ...)
Hello everyone. I'm excited to announce torchtyping, as a way to document -- and check -- that PyTorch tensors have the correct shape (dtype, names, layout, ...).
Yes, it does work with numerical literals! It supports using integers to specify an absolute size, strings to name dimensions that should all be consistently sized (and optionally also checks named tensors), "..." to indicate batch dimensions, and so on. See the full list here.
What are some alternatives?
extending-jax - Extending JAX with custom C++ and CUDA code
opt_einsum - ⚡️Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization.
jaxtyping - Type annotations and runtime checking for shape and dtype of JAX/NumPy/PyTorch/etc. arrays. https://docs.kidger.site/jaxtyping/
equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
kymatio - Wavelet scattering transforms in Python with GPU acceleration
tsalib - Tensor Shape Annotation Library (numpy, tensorflow, pytorch, ...)
data-science-ipython-notebooks - Data science Python notebooks: Deep learning (TensorFlow, Theano, Caffe, Keras), scikit-learn, Kaggle, big data (Spark, Hadoop MapReduce, HDFS), matplotlib, pandas, NumPy, SciPy, Python essentials, AWS, and various command lines.
horovod - Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
d2l-en - Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.
3d-ken-burns - an implementation of 3D Ken Burns Effect from a Single Image using PyTorch
jaxopt - Hardware accelerated, batchable and differentiable optimizers in JAX.
best-of-ml-python - 🏆 A ranked list of awesome machine learning Python libraries. Updated weekly.