autograd vs equinox

Compare autograd vs equinox and see what their differences are.

autograd

Efficiently computes derivatives of numpy code. (by HIPS)

equinox

Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/ (by patrick-kidger)
               autograd        equinox
Mentions       6               31
Stars          6,797           1,819
Growth         0.7%            -
Activity       6.0             9.2
Last commit    7 days ago      14 days ago
Language       Python          Python
License        MIT License     Apache License 2.0
The number of mentions indicates the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

autograd

Posts with mentions or reviews of autograd. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-09-28.
  • JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
    12 projects | news.ycombinator.com | 28 Sep 2023
    Actually, that's never been a constraint for JAX autodiff. JAX grew out of the original Autograd (https://github.com/hips/autograd), so differentiating through Python control flow has always worked. It's jax.jit and jax.vmap which place constraints on control flow, requiring structured control flow combinators such as jax.lax.cond and jax.lax.scan.
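
    A minimal sketch of that distinction (the function names are illustrative, not from the thread): jax.grad happily traces plain Python branching, while jax.jit needs the structured combinator.

      import jax

      def f(x):
          # Plain Python control flow: fine under jax.grad, which traces
          # with concrete values.
          if x > 0:
              return x ** 2
          else:
              return -x

      print(jax.grad(f)(3.0))  # 6.0 -- works

      # Under jax.jit the predicate is abstract, so the Python `if` fails;
      # the structured combinator jax.lax.cond is used instead.
      @jax.jit
      def g(x):
          return jax.lax.cond(x > 0, lambda x: x ** 2, lambda x: -x, x)

      print(g(3.0))  # 9.0
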
  • Autodidax: Jax Core from Scratch (In Python)
    4 projects | news.ycombinator.com | 11 Feb 2023
    I'm sure there's a lot of good material around, but here are some links that are conceptually very close to the linked Autodidax.

    There's [Autodidact](https://github.com/mattjj/autodidact), a predecessor to Autodidax, which was a simplified implementation of [the original Autograd](https://github.com/hips/autograd). It focuses on reverse-mode autodiff, not building an open-ended transformation system like Autodidax. It's also pretty close to the content in [these lecture slides](https://www.cs.toronto.edu/~rgrosse/courses/csc321_2018/slid...) and [this talk](http://videolectures.net/deeplearning2017_johnson_automatic_...). But the autodiff in Autodidax is more sophisticated and reflects clearer thinking. In particular, Autodidax shows how to implement forward- and reverse-modes using only one set of linearization rules (like in [this paper](https://arxiv.org/abs/2204.10923)).

    Here's [an even smaller and more recent variant](https://gist.github.com/mattjj/52914908ac22d9ad57b76b685d19a...), a single ~100 line file for reverse-mode AD on top of NumPy, which was live-coded during a lecture. There's no explanatory material to go with it though.
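
    In the same spirit, here is a minimal tape-based reverse-mode sketch (an illustration, not the gist's actual code): wrap values, record local gradients as operations happen, then sweep the graph once in reverse topological order.

      import numpy as np

      class Box:
          """Wraps a value and records the operations applied to it."""
          def __init__(self, value, parents=()):
              self.value = value
              self.parents = parents  # (input Box, local gradient) pairs

          def __add__(self, other):
              return Box(self.value + other.value, [(self, 1.0), (other, 1.0)])

          def __mul__(self, other):
              return Box(self.value * other.value,
                         [(self, other.value), (other, self.value)])

      def grad(f):
          def gradfun(x):
              box = Box(np.float64(x))
              out = f(box)
              # Topologically order the graph, then sweep backwards once,
              # accumulating each node's contribution to its parents.
              order, seen = [], set()
              def toposort(node):
                  if id(node) not in seen:
                      seen.add(id(node))
                      for parent, _ in node.parents:
                          toposort(parent)
                      order.append(node)
              toposort(out)
              grads = {id(out): 1.0}
              for node in reversed(order):
                  g = grads.get(id(node), 0.0)
                  for parent, local in node.parents:
                      grads[id(parent)] = grads.get(id(parent), 0.0) + g * local
              return grads.get(id(box), 0.0)
          return gradfun

      print(grad(lambda x: x * x + x)(3.0))  # d/dx (x^2 + x) at 3 -> 7.0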

  • Numba: A High Performance Python Compiler
    11 projects | news.ycombinator.com | 27 Dec 2022
    XLA is "higher level" than what Numba produces.

    You may be able to get the equivalent of jax via numba+numpy+autograd[1], but I haven't tried it before.

    IMHO, jax is best thought of as a numerical computation library that happens to include autograd, vmapping, and pmapping, and that provides a high-level interface to XLA.

    I have built a numerical optimisation library with it, and although a few things became verbose, it was a rather pleasant experience: the natural vmapping made everything a breeze, and I didn't have to write gradients for my test functions, except for special cases involving exponents and logs that needed a bit of delicate care.

    [1] https://github.com/HIPS/autograd
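
    As a sketch of that workflow (the Rosenbrock test function is just an illustrative choice):

      import jax
      import jax.numpy as jnp

      def rosenbrock(p):
          # Classic 2-D optimisation test function; minimum at (1, 1).
          x, y = p
          return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

      # Gradients come for free -- no hand-derived formulas needed.
      grad_fn = jax.grad(rosenbrock)
      print(grad_fn(jnp.array([0.0, 0.0])))      # [-2.  0.]

      # vmap evaluates a whole batch of candidate points at once.
      batch = jnp.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
      print(jax.vmap(rosenbrock)(batch))         # [  1.   0. 401.]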

  • Run Your Own DALL·E Mini (Craiyon) Server on EC2
    16 projects | dev.to | 26 Jul 2022
    Next, we want the code in the https://github.com/hrichardlee/dalle-playground repo, and we want to construct a pip environment from the backend/requirements.txt file in that repo. We were almost able to use the saharmor/dalle-playground repo as-is, but we had to make one change to add the jax[cuda] package to the requirements.txt file. In case you haven’t seen jax before, jax is a machine-learning library from Google, roughly equivalent to Tensorflow or PyTorch. It combines Autograd for automatic differentiation and XLA (accelerated linear algebra) for JIT-compiling numpy-like code for Google’s TPUs or Nvidia’s CUDA API for GPUs. The CUDA support requires explicitly selecting the [cuda] option when we install the package.
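
    The change amounts to a single extra line in that file (version pins and any extra package-index URLs are omitted here):

      # backend/requirements.txt (excerpt)
      jax[cuda]
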
  • Trade-Offs in Automatic Differentiation: TensorFlow, PyTorch, Jax, and Julia
    7 projects | news.ycombinator.com | 25 Dec 2021
    > fun fact, the Jax folks at Google Brain did have a Python source code transform AD at one point but it was scrapped essentially because of these difficulties

    I assume you mean autograd?

    https://github.com/HIPS/autograd

  • JAX - COMPARING WITH THE BIG ONES
    2 projects | /r/CryptocurrencyICO | 6 Sep 2021
    These four points lead to an enormous differentiation in the ecosystem: Keras, for example, was originally conceived to focus almost completely on point (4), leaving the other tasks to a backend engine. In 2015, on the other hand, Autograd focused on the first two points, allowing users to write code using only "classic" Python and NumPy constructs, subsequently providing many options for point (2). Autograd's simplicity greatly influenced the development of the libraries that followed, but it was penalized by a clear lack of points (3) and (4), i.e. adequate techniques to speed up the code and sufficiently abstract modules for neural network development.
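
    That "classic Python and NumPy" style reads like Autograd's documented tanh example, sketched here:

      import autograd.numpy as np   # thin NumPy wrapper that records operations
      from autograd import grad

      def tanh(x):
          # Ordinary NumPy-style code; no explicit graph-building API.
          y = np.exp(-2.0 * x)
          return (1.0 - y) / (1.0 + y)

      dtanh = grad(tanh)  # the derivative is just another Python function
      print(dtanh(1.0))   # ~0.41997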

equinox

Posts with mentions or reviews of equinox. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-03.
  • Ask HN: What side projects landed you a job?
    62 projects | news.ycombinator.com | 3 Dec 2023
    I wrote a JAX-based neural network library (Equinox [1]) and numerical differential equation solving library (Diffrax [2]).

    At the time I was just exploring some new research ideas in numerics -- and frankly, procrastinating from writing up my PhD thesis!

    But then one of the teams at Google started using them, so they offered me a job to keep developing them for their needs. Plus I'd get to work in biotech, which was a big interest of mine. This was a clear dream-job offer, so I accepted.

    Since then both have grown steadily in popularity (~2.6k GitHub stars) and now see pretty widespread use! I've since started writing several other JAX libraries and we now have a bit of an ecosystem going.

    [1] https://github.com/patrick-kidger/equinox

  • [P] Optimistix, nonlinear optimisation in JAX+Equinox!
    3 projects | /r/MachineLearning | 14 Oct 2023
    The elevator pitch is that Optimistix is really fast, especially to compile. It plays nicely with Optax for first-order gradient-based methods, and it takes a lot of design inspiration from Equinox, representing the state of all its solvers as standard JAX PyTrees.
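
    A minimal sketch of that API, going by the names in the Optimistix docs (optx.minimise, optx.BFGS):

      import jax.numpy as jnp
      import optimistix as optx

      def fn(y, args):
          # Scalar objective; Optimistix solvers expect the fn(y, args) form.
          return jnp.sum((y - 1.0) ** 2)

      solver = optx.BFGS(rtol=1e-6, atol=1e-6)
      sol = optx.minimise(fn, solver, jnp.zeros(3))
      print(sol.value)  # ~[1. 1. 1.]
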
  • JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
    12 projects | news.ycombinator.com | 28 Sep 2023
    If you like PyTorch then you might like Equinox, by the way. (https://github.com/patrick-kidger/equinox ; 1.4k GitHub stars now!)
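
    The PyTorch-like flavour comes from eqx.Module: a model is a class whose parameters are fields, as in this small sketch adapted from the Equinox docs:

      import jax
      import equinox as eqx

      class Linear(eqx.Module):
          weight: jax.Array
          bias: jax.Array

          def __init__(self, in_size, out_size, key):
              wkey, bkey = jax.random.split(key)
              self.weight = jax.random.normal(wkey, (out_size, in_size))
              self.bias = jax.random.normal(bkey, (out_size,))

          def __call__(self, x):
              return self.weight @ x + self.bias

      layer = Linear(2, 3, jax.random.PRNGKey(0))
      print(layer(jax.numpy.ones(2)))  # a callable that is also a pytree
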
  • Equinox: Elegant easy-to-use neural networks in Jax
    1 project | news.ycombinator.com | 18 Sep 2023
  • Show HN: Equinox (1.3k stars), a JAX library for neural networks and sciML
    1 project | news.ycombinator.com | 5 Sep 2023
  • Pytrees
    2 projects | news.ycombinator.com | 22 May 2023
    You're thinking of `jax.closure_convert`. :)

    (Although technically that works by tracing and extracting all constants from the jaxpr, rather than introspecting the function's closure cells -- it sounds like your trick is the latter.)

    When you discuss dynamic allocation, I'm guessing you're mainly referring to not being able to backprop through `jax.lax.while_loop`. If so, you might find `equinox.internal.while_loop` interesting, which is an unbounded while loop that you can backprop through! The secret sauce is to use a treeverse-style checkpointing scheme.

    https://github.com/patrick-kidger/equinox/blob/f95a8ba13fb35...
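
    A sketch of the idea (equinox.internal is, as the name says, internal, so treat the exact signature here as an assumption):

      import jax
      import equinox.internal as eqxi

      def cond(carry):
          x, _ = carry
          return x < 0.9

      def body(carry):
          x, theta = carry
          return (x + theta * x * (1.0 - x), theta)

      def loss(theta):
          # jax.lax.while_loop(cond, body, (0.1, theta)) would not be
          # reverse-mode differentiable; the checkpointed variant is.
          x, _ = eqxi.while_loop(cond, body, (0.1, theta),
                                 max_steps=10_000, kind="checkpointed")
          return x

      print(jax.grad(loss)(0.5))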

  • Writing Python like it’s Rust
    4 projects | /r/rust | 20 May 2023
    I'm a big fan of using ABCs to declare interfaces -- so much so that I have an improved abc.ABCMeta that also handles abstract instance variables and abstract class variables: https://github.com/patrick-kidger/equinox/blob/main/equinox/_better_abstract.py
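
    Equinox exposes this as eqx.AbstractVar / eqx.AbstractClassVar; a small sketch of the intended usage:

      from typing import ClassVar
      import equinox as eqx

      class AbstractSolver(eqx.Module):
          # Declared but not defined: concrete subclasses must supply these,
          # and forgetting one raises an error at instantiation time.
          order: eqx.AbstractClassVar[int]
          rtol: eqx.AbstractVar[float]

      class Euler(AbstractSolver):
          order: ClassVar[int] = 1
          rtol: float

      solver = Euler(rtol=1e-3)  # OK; a subclass missing either would fail
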
  • [D] JAX vs PyTorch in 2023
    5 projects | /r/MachineLearning | 9 Mar 2023
    For daily research, I use Equinox (https://github.com/patrick-kidger/equinox) as a DL library in JAX.
  • [Machinelearning] [D] Current state of JAX vs PyTorch?
    1 project | /r/enfrancais | 24 Feb 2023
  • Training Deep Networks with Data Parallelism in Jax
    6 projects | news.ycombinator.com | 24 Feb 2023
    It sounds like you're concerned about how downstream libraries tend to wrap JAX transformations to handle their own thing? (E.g. `haiku.grad`.)

    If so, then allow me to make my usual advert here for Equinox:

    https://github.com/patrick-kidger/equinox

    This actually works with JAX's native transformations. (There's no `equinox.vmap` for example.)

    On higher-order functions more generally, Equinox offers a way to control these quite carefully, by making ubiquitous use of callables that are also pytrees. E.g. a neural network is both a callable in that it has a forward pass, and a pytree in that it records its parameters in its tree structure.
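
    For example (a minimal sketch; eqx.nn.MLP and filter_grad are part of Equinox's documented API):

      import jax
      import jax.numpy as jnp
      import equinox as eqx

      model = eqx.nn.MLP(in_size=2, out_size=1, width_size=32, depth=2,
                         key=jax.random.PRNGKey(0))

      def loss(model, x, y):
          # Plain jax.vmap over the model itself: the model is a callable,
          # and as a pytree its parameters flow through the transformation.
          pred = jax.vmap(model)(x)
          return jnp.mean((pred - y) ** 2)

      # filter_grad differentiates w.r.t. the array leaves of the model pytree.
      grads = eqx.filter_grad(loss)(model, jnp.ones((8, 2)), jnp.zeros((8, 1)))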

What are some alternatives?

When comparing autograd and equinox you can also consider the following projects:

Enzyme - High-performance automatic differentiation of LLVM and MLIR.

flax - Flax is a neural network library for JAX that is designed for flexibility.

SwinIR - SwinIR: Image Restoration Using Swin Transformer (official repository)

dm-haiku - JAX-based neural network library

jaxonnxruntime - A user-friendly tool chain that enables the seamless execution of ONNX models using JAX as the backend.

torchtyping - Type annotations and dynamic checking for a tensor's shape, dtype, names, etc.

autodidact - A pedagogical implementation of Autograd

treex - A Pytree Module system for Deep Learning in JAX

fbpic - Spectral, quasi-3D Particle-In-Cell code, for CPU and GPU

extending-jax - Extending JAX with custom C++ and CUDA code

pure_numba_alias_sampling - Pure numba version of the alias sampling algorithm from L. Devroye's "Non-Uniform Random Variate Generation"

diffrax - Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/