equinox VS torchtyping

Compare equinox and torchtyping to see how they differ.

equinox

Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/ (by patrick-kidger)

torchtyping

Type annotations and dynamic checking for a tensor's shape, dtype, names, etc. (by patrick-kidger)

                 equinox              torchtyping
Mentions         31                   7
Stars            1,819                1,337
Growth           -                    -
Activity         9.2                  3.2
Latest commit    13 days ago          11 months ago
Language         Python               Python
License          Apache License 2.0   Apache License 2.0
The number of mentions indicates the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

equinox

Posts with mentions or reviews of equinox. We have used some of these posts to build our list of alternatives and similar projects. The most recent was from 2023-12-03.
  • Ask HN: What side projects landed you a job?
    62 projects | news.ycombinator.com | 3 Dec 2023
    I wrote a JAX-based neural network library (Equinox [1]) and numerical differential equation solving library (Diffrax [2]).

    At the time I was just exploring some new research ideas in numerics -- and frankly, procrastinating from writing up my PhD thesis!

    But then one of the teams at Google started using them, so they offered me a job to keep developing them for their needs. Plus I'd get to work in biotech, which was a big interest of mine. This was a clear dream job offer, so I accepted.

    Since then both have grown steadily in popularity (~2.6k GitHub stars) and now see pretty widespread use! I've since started writing several other JAX libraries and we now have a bit of an ecosystem going.

    [1] https://github.com/patrick-kidger/equinox

  • [P] Optimistix, nonlinear optimisation in JAX+Equinox!
    3 projects | /r/MachineLearning | 14 Oct 2023
    The elevator pitch is that Optimistix is really fast, especially to compile. It plays nicely with Optax for first-order gradient-based methods, and takes a lot of design inspiration from Equinox, representing the state of all the solvers as standard JAX PyTrees.
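
    As a rough illustration of that design (a sketch written for this comparison, not taken from the post; it assumes the optx.minimise/optx.BFGS API and the Optax interop described in the Optimistix docs):

        import jax.numpy as jnp
        import optax
        import optimistix as optx

        # Objective functions take (y, args); a toy quadratic with minimum at y = 3.
        def loss(y, args):
            return jnp.sum((y - 3.0) ** 2)

        # A second-order solver:
        solver = optx.BFGS(rtol=1e-6, atol=1e-6)
        sol = optx.minimise(loss, solver, jnp.zeros(3))
        print(sol.value)  # approximately [3. 3. 3.]

        # First-order methods slot in by wrapping any Optax optimiser; in both
        # cases the solver state is an ordinary JAX pytree.
        first_order = optx.OptaxMinimiser(optax.adam(learning_rate=1e-2),
                                          rtol=1e-4, atol=1e-4)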
  • JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
    12 projects | news.ycombinator.com | 28 Sep 2023
    If you like PyTorch then you might like Equinox, by the way. (https://github.com/patrick-kidger/equinox ; 1.4k GitHub stars now!)
  • Equinox: Elegant easy-to-use neural networks in Jax
    1 project | news.ycombinator.com | 18 Sep 2023
  • Show HN: Equinox (1.3k stars), a JAX library for neural networks and sciML
    1 project | news.ycombinator.com | 5 Sep 2023
  • Pytrees
    2 projects | news.ycombinator.com | 22 May 2023
    You're thinking of `jax.closure_convert`. :)

    (Although technically that works by tracing and extracting all constants from the jaxpr, rather than introspecting the function's closure cells -- it sounds like your trick is the latter.)

    When you discuss dynamic allocation, I'm guessing you're mainly referring to not being able to backprop through `jax.lax.while_loop`. If so, you might find `equinox.internal.while_loop` interesting, which is an unbounded while loop that you can backprop through! The secret sauce is to use a treeverse-style checkpointing scheme.

    https://github.com/patrick-kidger/equinox/blob/f95a8ba13fb35...
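
    A rough sketch of what that looks like in practice (equinox.internal is, as the name suggests, not public API, so the exact signature may differ between versions):

        import jax
        import equinox.internal as eqxi

        def halve_until_small(x):
            cond = lambda val: val > 1.0
            body = lambda val: 0.5 * val
            # `kind="checkpointed"` selects the treeverse-style checkpointing
            # scheme mentioned above -- this is what makes reverse-mode autodiff
            # work where plain `jax.lax.while_loop` would fail.
            return eqxi.while_loop(cond, body, x, max_steps=100, kind="checkpointed")

        # 8 -> 4 -> 2 -> 1: three halvings, so the gradient is 0.5**3 = 0.125.
        print(jax.grad(halve_until_small)(8.0))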

  • Writing Python like it’s Rust
    4 projects | /r/rust | 20 May 2023
    I'm a big fan of using ABCs to declare interfaces -- so much so that I have an improved abc.ABCMeta that also handles abstract instance variables and abstract class variables: https://github.com/patrick-kidger/equinox/blob/main/equinox/_better_abstract.py
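
    That machinery is exposed as equinox.AbstractVar / equinox.AbstractClassVar. A minimal sketch of the pattern (assuming current Equinox; details may have shifted since the linked file was written):

        import equinox as eqx

        class AbstractInterval(eqx.Module):
            # Abstract instance variables: concrete subclasses must provide these.
            lower: eqx.AbstractVar[float]
            upper: eqx.AbstractVar[float]

            def width(self) -> float:
                return self.upper - self.lower

        class UnitInterval(AbstractInterval):
            lower: float = 0.0
            upper: float = 1.0

        print(UnitInterval().width())  # 1.0
        # AbstractInterval() itself raises: its abstract variables are unfilled.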
  • [D] JAX vs PyTorch in 2023
    5 projects | /r/MachineLearning | 9 Mar 2023
    For daily research, I use Equinox (https://github.com/patrick-kidger/equinox) as a DL library in JAX.
  • [Machinelearning] [D] Current state of JAX vs PyTorch?
    1 project | /r/enfrancais | 24 Feb 2023
  • Training Deep Networks with Data Parallelism in Jax
    6 projects | news.ycombinator.com | 24 Feb 2023
    It sounds like you're concerned about how downstream libraries tend to wrap JAX transformations to handle their own thing? (E.g. `haiku.grad`.)

    If so, then allow me to make my usual advert here for Equinox:

    https://github.com/patrick-kidger/equinox

    This actually works with JAX's native transformations. (There's no `equinox.vmap` for example.)

    On higher-order functions more generally, Equinox offers a way to control these quite carefully, by making ubiquitous use of callables that are also pytrees. E.g. a neural network is both a callable in that it has a forward pass, and a pytree in that it records its parameters in its tree structure.
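
    A minimal sketch of that "callable pytree" idea (written for this comparison; in the real library this is the eqx.Module pattern):

        import jax
        import jax.numpy as jnp
        import equinox as eqx

        class Linear(eqx.Module):
            weight: jax.Array
            bias: jax.Array

            def __call__(self, x):
                return self.weight @ x + self.bias

        wkey, bkey = jax.random.split(jax.random.PRNGKey(0))
        model = Linear(jax.random.normal(wkey, (2, 3)), jax.random.normal(bkey, (2,)))

        # The model is both a callable (its forward pass) and a pytree (its
        # parameters), so JAX's native transformations apply directly --
        # no `equinox.grad` or `equinox.vmap` needed.
        @jax.jit
        @jax.grad
        def loss(model, x, y):
            return jnp.mean((model(x) - y) ** 2)

        grads = loss(model, jnp.ones(3), jnp.zeros(2))  # `grads` is itself a Linear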

torchtyping

Posts with mentions or reviews of torchtyping. We have used some of these posts to build our list of alternatives and similar projects. The most recent was from 2023-02-11.
  • [D] Have there been any attempts to create a programming language specifically for machine learning?
    12 projects | /r/MachineLearning | 11 Feb 2023
    Not really an answer to your question, but there are Python packages that try to solve the problem of tensor shapes that you mentioned, e.g. https://github.com/patrick-kidger/torchtyping or https://github.com/deepmind/tensor_annotations
  • What's New in Python 3.11?
    14 projects | news.ycombinator.com | 26 Jun 2022
    I disagree. I've had a serious attempt at array typing using variadic generics and I'm not impressed. Python's type system has numerous issues... and now they just apply to any "ArrayWithNDimensions" type as well as any "ArrayWith2Dimensions" type.

    Variadic protocols don't exist; many operations like stacking are inexpressible; the syntax is awful and verbose; etc. etc.

    I've written more about this here as part of my TorchTyping project: [0]

    [0] https://github.com/patrick-kidger/torchtyping/issues/37#issu...

  • Can anyone point out the mistakes in my input layer or dimension?
    1 project | /r/learnmachinelearning | 6 Jun 2022
    also https://github.com/patrick-kidger/torchtyping
  • [D] Anyone using named tensors or a tensor annotation lib productively?
    2 projects | /r/MachineLearning | 18 Apr 2022
    FWIW I'm the author of torchtyping so happy to answer any questions about that. :) I think people are using it!
  • [D] Ideal deep learning library
    9 projects | /r/MachineLearning | 5 Jan 2022
    The one thing I really *really* wish got more attention was named tensors and the tensor type system. Tensor misalignment errors are a constant source of silently-failing bugs. While third-party libraries have attempted to fill this gap, it really needs better native support.

    In particular, it seems like bad form for programmers to have to remember the specific alignment and broadcasting rules, and then apply them to an often poorly documented order of tensor indices. I'd really like to see something like tsalib's warp operator made part of the main library and generalised to arbitrary function application, like a named-tensor version of fold -- but preferably using notation closer to that of torchtyping.
  • [P] torchtyping -- documentation + runtime type checking of tensor shapes (and dtypes, ...)
    2 projects | /r/MachineLearning | 7 Apr 2021
    Yes, it does work with numerical literals! It supports using integers to specify an absolute size, strings to name dimensions that must be consistently sized (and optionally also checks named tensors), "..." to indicate batch dimensions, and so on. See the full list here.
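
    A short sketch of those features (adapted for this comparison from the project's documented usage; runtime checking goes through typeguard):

        import torch
        from torchtyping import TensorType, patch_typeguard
        from typeguard import typechecked

        patch_typeguard()  # let typeguard understand TensorType annotations

        @typechecked
        def scale(x: TensorType["batch", 3],
                  y: TensorType["batch"]) -> TensorType["batch", 3]:
            # 3 is an absolute size; "batch" must be consistent across arguments.
            return x * y.unsqueeze(-1)

        scale(torch.rand(5, 3), torch.rand(5))  # passes
        try:
            scale(torch.rand(5, 3), torch.rand(4))  # "batch" mismatch: 5 vs 4
        except TypeError:
            print('caught inconsistent "batch" dimension')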

What are some alternatives?

When comparing equinox and torchtyping you can also consider the following projects:

flax - Flax is a neural network library for JAX that is designed for flexibility.

jaxtyping - Type annotations and runtime checking for shape and dtype of JAX/NumPy/PyTorch/etc. arrays. https://docs.kidger.site/jaxtyping/

dm-haiku - JAX-based neural network library

tsalib - Tensor Shape Annotation Library (numpy, tensorflow, pytorch, ...)

treex - A Pytree Module system for Deep Learning in JAX

mypy - Optional static typing for Python

extending-jax - Extending JAX with custom C++ and CUDA code

functorch - functorch is JAX-like composable function transforms for PyTorch.

diffrax - Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/

tensor_annotations - Annotating tensor shapes using Python types

elegy - A High Level API for Deep Learning in JAX

Roslyn - The Roslyn .NET compiler provides C# and Visual Basic languages with rich code analysis APIs.