miniF2F vs einops

                 miniF2F         einops
Mentions         4               19
Stars            258             7,971
Growth           3.5%            -
Activity         0.0             7.4
Latest commit    9 months ago    9 days ago
Language         Objective-C++   Python
License          -               MIT License
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
miniF2F

[D] Have there been any attempts to create a programming language specifically for machine learning?
That said, you *can* write down a desired type and have a system write a ton of type annotations, or generate a bunch of code, to prove that the type you wrote down is satisfied by your program. There's been recent work on this in deep learning for theorem proving, such as this work which uses GPT for proving theorems in Lean, a dependently typed programming language and theorem prover. A better approach, though, would be to combine this with an actual tree search algorithm to allow a more structured search over the space of proofs, instead of trying to generate full correct proofs in one shot. HyperTree Proof Search does this, using a variant of AlphaZero to search and fine-tune the neural net. Unfortunately it hasn't been open-sourced, and it's pretty compute-intensive, so we can't use this for actual type inference yet. But yes, there's active interest in doing this kind of thing, both as a proving ground for using RL on reasoning tasks and from mathematicians for theorem proving.
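As a toy illustration of the propositions-as-types idea the comment leans on (a minimal Lean 4 sketch, not taken from the linked work; `my_add_comm` is a made-up name):

```lean
-- In a dependently typed language like Lean, a theorem is a type and a
-- proof is a program of that type. A prover (human or neural) searches
-- for such a program; the kernel then type-checks it, so any proof it
-- finds is guaranteed correct.
theorem my_add_comm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

This is exactly the setting those systems operate in: the statement is the type, and proof search is a search for an inhabiting term.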
 [D] First Author Interview: AI & formal math (Formal Mathematics Statement Curriculum Learning)
 [D] OpenAI tackles Math - Formal Mathematics Statement Curriculum Learning (Paper Explained Video)
 MiniF2F
einops

Einsum in 40 Lines of Python
Not sure if the wrapper you’re talking about is your own custom code, but I really like using einops lately. It’s got similar axis-naming capabilities, and it dispatches to both numpy and pytorch.
http://einops.rocks/
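To give a concrete sense of the axis-naming style being praised here, a small sketch (a hypothetical example; the einops call is shown in a comment so the snippet runs with NumPy alone, in case einops isn't installed):

```python
import numpy as np

# Toy batch of 2 RGB images, 4x4 pixels, channels-last: (batch, h, w, c).
x = np.arange(2 * 4 * 4 * 3).reshape(2, 4, 4, 3)

# With einops the channels-first conversion is a named-axis pattern:
#   from einops import rearrange
#   y = rearrange(x, "b h w c -> b c h w")
# which here is equivalent to a plain transpose:
y = x.transpose(0, 3, 1, 2)
print(y.shape)  # (2, 3, 4, 4)
```

The named-axis string documents intent in a way the bare tuple `(0, 3, 1, 2)` does not, which is the readability point the comment is making.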
 Einops: Flexible and powerful tensor operations for readable and reliable code

Yorick is an interpreted programming language for scientific simulations
Thanks for the pointer. I can believe that a language that looks so different will find that different patterns and primitives are natural for it.
My experience from writing a lot of array-based code in NumPy/Matlab is that broadcasting has absolutely made it easier to write my code in those ecosystems. Axes of length 1 have often been in the right places already, or have been easy to insert. It's of course possible to create a big mess in any language; it seems likely that the NumPy code you saw could have been neater too.
In machine learning there can be many array dimensions floating around: batch dims, sequence and/or channel dims, weight matrices, and so on. It can be necessary to expand two or more dimensions, and/or line up dimensions quite carefully. Einops[1] has emerged from that community as a tool to succinctly express many operations that involve lots of array dimensions. You're likely to bump into more and more people who've used it, and again it seems there's some overlap with what Rank does. (And again, you'll see uses of Einops in the wild that are unnecessarily convoluted.)
[1] https://einops.rocks/ - It works with all of the existing major array-based frameworks for Python (NumPy/PyTorch/Jax/etc), and the emerging array API standard for Python.
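The "axes of length 1" remark above can be made concrete with a minimal NumPy sketch (a hypothetical example, not from the thread):

```python
import numpy as np

# Per-sample scaling: data is (batch, features); the per-sample scales
# need a length-1 axis inserted so the shapes line up for broadcasting.
data = np.ones((4, 3))                   # 4 samples, 3 features
scale = np.array([1.0, 2.0, 4.0, 8.0])   # one scale per sample

scaled = data * scale[:, None]           # (4, 1) broadcasts over (4, 3)
print(scaled.shape)  # (4, 3)
```

Inserting the length-1 axis with `[:, None]` is exactly the kind of small alignment step the comment describes as usually being easy in NumPy.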

Torch equivalent to image_to_array (keras)
this is definitely what you're looking for: https://github.com/arogozhnikov/einops

[D] Have there been any attempts to create a programming language specifically for machine learning?
Einops all the things! https://einops.rocks/
 Delimiter-First Code

[D] Any independent researchers ever get published/into conferences?
It depends on what their main purposes are. I know some figures who have done an amazing job in this field, but never because of publications, e.g. https://github.com/lucidrains, https://github.com/rwightman, and https://einops.rocks/

[D] Anyone using named tensors or a tensor annotation lib productively?
On tsalib's warp: this is very similar to einops. I think it might even be slightly more general. However, I'm honestly not sure to what extent tsalib is still maintained, as it looks like the most recent commit was about two years ago. (Which is a shame.)

A basic introduction to NumPy's einsum
Also see Einops: https://github.com/arogozhnikov/einops, which uses an einsum-like notation for various tensor operations used in deep learning.
https://einops.rocks/pytorch-examples.html shows how it can be used to implement various neural network architectures in a more simplified manner.
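As a short sketch of the einsum notation this comment builds on (plain NumPy, a hypothetical example rather than anything from the linked article):

```python
import numpy as np

# Batched matrix multiplication with einsum: for every batch index b,
# contract the shared axis k of A[b] (m x k) with B[b] (k x n).
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 2, 3))
B = rng.normal(size=(5, 3, 4))

C = np.einsum("bmk,bkn->bmn", A, B)
print(C.shape)  # (5, 2, 4)
assert np.allclose(C, A @ B)  # matches NumPy's batched @ operator
```

The subscript string names each axis once, so the contraction is spelled out explicitly; einops applies the same named-axis idea to reshapes, reductions, and repeats.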

Ask HN: What technologies greatly improve the efficiency of development?
This combined with something like einops [1] (an intuitive reshaping library) can be a huge time saver.
[1] https://github.com/arogozhnikov/einops
What are some alternatives?
tensor_annotations - Annotating tensor shapes using Python types
extending-jax - Extending JAX with custom C++ and CUDA code
torchtyping - Type annotations and dynamic checking for a tensor's shape, dtype, names, etc.
opt_einsum - ⚡️Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization.
FL - FL language specification and reference implementations
kymatio - Wavelet scattering transforms in Python with GPU acceleration
dex-lang - Research language for array processing in the Haskell/ML family
d2l-en - Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.
jaxtyping - Type annotations and runtime checking for shape and dtype of JAX/NumPy/PyTorch/etc. arrays. https://docs.kidger.site/jaxtyping/
data-science-ipython-notebooks - Data science Python notebooks: Deep learning (TensorFlow, Theano, Caffe, Keras), scikit-learn, Kaggle, big data (Spark, Hadoop MapReduce, HDFS), matplotlib, pandas, NumPy, SciPy, Python essentials, AWS, and various command lines.
hasktorch - Tensors and neural networks in Haskell
horovod - Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.