
einops
Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others)
Also see einops: https://github.com/arogozhnikov/einops, which uses an einsum-like notation for various tensor operations used in deep learning.
https://einops.rocks/pytorchexamples.html shows how it can be used to implement various neural network architectures in a simpler manner.
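For readers new to the notation, here is a minimal sketch in plain NumPy, with the corresponding einops one-liners shown as comments (the patterns follow the standard `rearrange` syntax):

```python
import numpy as np

x = np.zeros((32, 28, 28, 3))  # a batch of 28x28 RGB images: b h w c

# einops: rearrange(x, "b h w c -> b (h w) c")  -- flatten the spatial grid
y = x.reshape(32, 28 * 28, 3)

# einops: rearrange(x, "b h w c -> b c h w")    -- channels-first layout
z = np.transpose(x, (0, 3, 1, 2))
```

The einops patterns read left-to-right as "input axes -> output axes", so the intent survives in the code instead of living in a comment next to a bare `reshape`.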


If you are looking for something like this in C++, here's my attempt at implementing it: https://github.com/dsharlet/array#einsteinreductions
It doesn't do any automatic optimization of the loops like some of the projects linked in this thread, but it provides all the tools needed for humans to express the code in a way that a good compiler can turn it into really good code.

If I understand https://github.com/numpy/numpy/blob/v1.22.0/numpy/core/einsu... and https://github.com/numpy/numpy/blob/v1.22.0/numpy/core/src/m... correctly, using einsum without the optimize flag seems to use a for loop in C to do the multiplication.
The optimizer clearly tries to improve performance, but in many cases it doesn't seem to change anything. Let's simply multiply some matrices:
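For example (a sketch; for a single two-operand product there is no contraction order to optimize, so `optimize=True` mainly pays off for chains of three or more operands):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((64, 64))
b = rng.standard_normal((64, 64))

c_naive = np.einsum("ij,jk->ik", a, b)               # default C-loop path
c_opt = np.einsum("ij,jk->ik", a, b, optimize=True)  # optimizer enabled
c_ref = a @ b                                        # BLAS reference

# All three agree; whether the optimized path is any faster here is
# exactly the kind of case being questioned above.
```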


alphafold2
To eventually become an unofficial Pytorch implementation / replication of Alphafold2, as details of the architecture get released
* Example: https://github.com/lucidrains/alphafold2/blob/d59cb1ea536bc5...



Somebody also realized that much of the time a single function can describe all three of the einops operations. I present to you, einop: https://github.com/cgarciae/einop
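To illustrate the idea (this is a toy dispatcher, not einop's actual implementation: it infers rearrange/reduce/repeat from which axis names appear on each side of the pattern, and only handles flat patterns without parentheses):

```python
import numpy as np

def einop(x, pattern, reduction="sum", **sizes):
    """Toy single-function dispatcher: picks rearrange, reduce, or
    repeat by comparing the axis names on the two sides of the pattern.
    Handles only flat patterns like "a b -> b a"; no parentheses."""
    lhs, rhs = (side.split() for side in pattern.split("->"))
    dropped = [a for a in lhs if a not in rhs]  # axes only on the left
    added = [a for a in rhs if a not in lhs]    # axes only on the right
    if dropped:  # reduce: collapse the dropped axes, then reorder
        y = getattr(np, reduction)(x, axis=tuple(lhs.index(a) for a in dropped))
        kept = [a for a in lhs if a in rhs]
        return np.transpose(y, [kept.index(a) for a in rhs])
    if added:    # repeat: reorder kept axes, then broadcast the new ones
        y = np.transpose(x, [lhs.index(a) for a in rhs if a in lhs])
        for i, a in enumerate(rhs):
            if a in added:
                y = np.expand_dims(y, i)
        target = [sizes[a] if a in added else y.shape[i]
                  for i, a in enumerate(rhs)]
        return np.broadcast_to(y, target)
    return np.transpose(x, [lhs.index(a) for a in rhs])  # pure rearrange
```

So `einop(x, "a b -> b a")` transposes, `einop(x, "a b -> a")` reduces, and `einop(x, "a -> a b", b=4)` repeats, all through one entry point.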

Einops looks nice! It reminds me of https://github.com/deepmind/einshape which is another attempt at unifying reshape, squeeze, expand_dims, transpose, tile, flatten, etc. under an einsum-inspired DSL.
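For example, splitting a flattened axis back into two, which in plain NumPy takes a shape computation, becomes a single pattern in this style (the einshape call in the comment is my reading of its README and may not match the exact per-backend signature):

```python
import numpy as np

# einshape-style: einshape("n(hw)c->nhwc", x, h=2)
x = np.arange(2 * 6 * 3).reshape(2, 6, 3)  # n=2, combined hw=6, c=3
y = x.reshape(2, 2, 3, 3)                  # split hw into h=2, w=3

# the element at (n, h, w, c) is the element at (n, h*3 + w, c) in x
```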

If you're into tensor algebra, I can only recommend Cadabra, a beautiful piece of software:
https://cadabra.science/
We once wrote an article with it: 40th order in the Lagrangian, perhaps 50k pages of calculations if all printed out. Amazing tool! Thanks Kasper!