jaxtyping vs torchtyping

| | jaxtyping | torchtyping |
|---|---|---|
| Mentions | 8 | 7 |
| Stars | 1,305 | 1,413 |
| Growth | 5.1% | 0.5% |
| Activity | 8.2 | 3.3 |
| Last commit | 1 day ago | 6 months ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 or later | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
jaxtyping
-
Python type hints may not be for me in practice
You want runtime typechecking.
See either beartype [1] or typeguard [2]. And if you're doing any kind of array-based programming (JAX or not), then jaxtyping [3].
[1] https://github.com/beartype/beartype/
[2] https://github.com/agronholm/typeguard
[3] https://github.com/patrick-kidger/jaxtyping
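For instance, here is a minimal sketch of how jaxtyping and beartype combine for runtime shape checking (assuming a recent jaxtyping with the `typechecker=` argument, plus beartype and numpy installed; the function and dimension names are illustrative):

```python
import numpy as np
from beartype import beartype
from jaxtyping import Float, jaxtyped

# jaxtyped wires the shape/dtype annotations into the beartype runtime checker.
@jaxtyped(typechecker=beartype)
def matvec(m: Float[np.ndarray, "rows cols"],
           v: Float[np.ndarray, "cols"]) -> Float[np.ndarray, "rows"]:
    return m @ v

matvec(np.ones((3, 4)), np.ones(4))  # passes: "cols" is 4 in both arguments
matvec(np.ones((3, 4)), np.ones(5))  # fails at call time: "cols" mismatch
```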
-
Writing Python like it's Rust
Try using [jaxtyping](https://github.com/google/jaxtyping).
It also supports numpy/pytorch/etc.
-
Writing Python like it’s Rust
Since you mention ML use-cases, you might like jaxtyping.
-
Scientific computing in JAX
jaxtyping: rich shape & dtype annotations for arrays and tensors (also supports PyTorch/TensorFlow/NumPy);
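As an illustration of the cross-library support, the same annotation syntax works on PyTorch tensors (a sketch; the function and dimension names are illustrative):

```python
import torch
from jaxtyping import Float

# Shape and dtype are documented in the signature; "batch" and "dim" must
# agree across arguments once a runtime checker is attached.
def scale(x: Float[torch.Tensor, "batch dim"],
          w: Float[torch.Tensor, "dim"]) -> Float[torch.Tensor, "batch dim"]:
    return x * w
```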
-
[D] Have there been any attempts to create a programming language specifically for machine learning?
Heads-up that my newer jaxtyping project now exists.
-
Returning to snake's nest after a long journey, any major advances in Python for science?
As other folks have commented, type hints are now a big deal. For static typing the best checker is pyright. For runtime checking there is typeguard and beartype. These can be integrated with array libraries through jaxtyping. (Which also works for PyTorch/numpy/etc., despite the name.)
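As a sketch of that integration, jaxtyping also accepts typeguard as the runtime checker (assuming jaxtyping and typeguard are installed; `normalise` is an illustrative example, not part of either library):

```python
import numpy as np
from jaxtyping import Float, jaxtyped
from typeguard import typechecked

@jaxtyped(typechecker=typechecked)
def normalise(x: Float[np.ndarray, "n"]) -> Float[np.ndarray, "n"]:
    # Dividing by the Euclidean norm preserves the shape "n".
    return x / np.linalg.norm(x)
```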
- Type annotations and runtime checking for shape and dtype
torchtyping
-
[D] Have there been any attempts to create a programming language specifically for machine learning?
Not really an answer to your question, but there are Python packages that try to solve the problem of tensor shapes that you mentioned, e.g. https://github.com/patrick-kidger/torchtyping or https://github.com/deepmind/tensor_annotations
-
What's New in Python 3.11?
I disagree. I've had a serious attempt at array typing using variadic generics and I'm not impressed. Python's type system has numerous issues... and now they just apply to any "ArrayWithNDimensions" type as well as any "ArrayWith2Dimensions" type.
Variadic protocols don't exist; many operations like stacking are inexpressible; the syntax is awful and verbose; etc. etc.
I've written more about this here as part of my TorchTyping project: [0]
[0] https://github.com/patrick-kidger/torchtyping/issues/37#issu...
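To make the stacking point concrete, here is an illustrative sketch using PEP 646 variadic generics (Python 3.11+; the `Array` class is hypothetical, purely to show what can and cannot be expressed):

```python
from typing import Generic, TypeVarTuple, Unpack

Shape = TypeVarTuple("Shape")

class Array(Generic[Unpack[Shape]]):  # hypothetical shape-generic array type
    ...

# Expressible: an elementwise op that returns the same shape it was given.
def relu(x: Array[Unpack[Shape]]) -> Array[Unpack[Shape]]:
    ...

# Not expressible: stacking n arrays prepends a new dimension of size n.
# TypeVarTuple has no way to say "the input shape plus one leading axis whose
# size is len(xs)", so the best available signature loses that information:
def stack(xs: list[Array[Unpack[Shape]]]) -> Array[int, Unpack[Shape]]:
    ...
```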
-
Can anyone point out the mistakes in my input layer or dimension?
also https://github.com/patrick-kidger/torchtyping
-
[D] Anyone using named tensors or a tensor annotation lib productively?
FWIW I'm the author of torchtyping so happy to answer any questions about that. :) I think people are using it!
-
[D] Ideal deep learning library
The one thing I really *really* wish got more attention was named tensors and the tensor type system. Tensor misalignment errors are a constant source of silently-failing bugs. While 3rd party libraries have attempted to fill this gap, it really needs better native support. In particular it seems like bad form to me for programmers to have to remember the specific alignment and broadcasting rules, and then have to apply them to an often poorly documented order of tensor indices. I'd really like to see something like tsalib's warp operator made part of the main library and generalized to arbitrary function application, like a named-tensor version of fold. But preferably using notation closer to that of torchtyping.
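For flavour, einops (listed under the alternatives below) gets part of the way towards the notation the commenter wants, with named axes made explicit in every operation (a sketch, assuming einops and torch are installed):

```python
import torch
from einops import rearrange, reduce

x = torch.rand(8, 3, 32, 32)                 # batch, channel, height, width
y = rearrange(x, "b c h w -> b (h w) c")     # named axes make the reshape explicit
m = reduce(x, "b c h w -> b c", "mean")      # mean over the spatial axes
```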
-
[P] torchtyping -- documentation + runtime type checking of tensor shapes (and dtypes, ...)
Yes, it does work with numerical literals! It supports using integers to specify an absolute size, strings to name dimensions that should all be consistently sized (and optionally also checks named tensors), "..." to indicate batch dimensions, and so on. See the full list here.
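A minimal sketch of what that looks like in practice (assuming torchtyping with a compatible typeguard, i.e. typeguard < 3; the function and dimension names are illustrative):

```python
import torch
from torchtyping import TensorType, patch_typeguard
from typeguard import typechecked

patch_typeguard()  # teach typeguard about TensorType annotations

@typechecked
def batched_matmul(x: TensorType["batch", "a", "b"],
                   y: TensorType["batch", "b", "c"]) -> TensorType["batch", "a", "c"]:
    return x @ y

batched_matmul(torch.rand(8, 3, 4), torch.rand(8, 4, 5))  # passes
batched_matmul(torch.rand(8, 3, 4), torch.rand(8, 5, 5))  # fails: "b" mismatch
```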
What are some alternatives?
plum - Multiple dispatch in Python
equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
pytype - A static type analyzer for Python code
functorch - functorch is JAX-like composable function transforms for PyTorch.
madtypes - Python Type that raise TypeError at runtime
tsalib - Tensor Shape Annotation Library (numpy, tensorflow, pytorch, ...)
tiny-cuda-nn - Lightning fast C++/CUDA neural network framework
tensor_annotations - Annotating tensor shapes using Python types
diffrax - Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/
miniF2F - Formal to Formal Mathematics Benchmark
MindsDB - AGI's query engine - Platform for building AI that can learn and answer questions over federated data.
einops - Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others)