diffrax VS equinox

Compare diffrax vs equinox and see what their differences are.

diffrax

Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/ (by patrick-kidger)

equinox

Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/ (by patrick-kidger)
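
Both projects are by the same author and are designed to compose: an Equinox module is both a callable and a pytree, so it can serve directly as the vector field of a Diffrax ODE. A minimal sketch of that composition, using the documented eqx.nn.MLP and diffrax.diffeqsolve entry points (illustrative, not canonical):

    import diffrax
    import equinox as eqx
    import jax
    import jax.numpy as jnp

    # An Equinox module is a callable pytree, so it can be passed straight
    # to Diffrax as the vector field f(t, y, args).
    class VectorField(eqx.Module):
        mlp: eqx.nn.MLP

        def __init__(self, key):
            self.mlp = eqx.nn.MLP(in_size=2, out_size=2, width_size=16, depth=2, key=key)

        def __call__(self, t, y, args):
            return self.mlp(y)

    vf = VectorField(jax.random.PRNGKey(0))
    sol = diffrax.diffeqsolve(
        diffrax.ODETerm(vf), diffrax.Tsit5(),
        t0=0.0, t1=1.0, dt0=0.1, y0=jnp.array([1.0, 0.0]),
    )
    print(sol.ys)  # state at t1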
             diffrax              equinox
Mentions     21                   31
Stars        1,230                1,809
Growth       -                    -
Activity     8.3                  9.2
Last commit  4 days ago           8 days ago
Language     Python               Python
License      Apache License 2.0   Apache License 2.0
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

diffrax

Posts with mentions or reviews of diffrax. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-03.
  • Ask HN: What side projects landed you a job?
    62 projects | news.ycombinator.com | 3 Dec 2023
  • [P] Optimistix, nonlinear optimisation in JAX+Equinox!
    3 projects | /r/MachineLearning | 14 Oct 2023
    Optimistix has high-level APIs for minimisation, least-squares, root-finding, and fixed-point iteration and was written to take care of these kinds of subroutines in Diffrax.
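    (A runnable sketch of these entry points is given after this list.)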
  • Show HN: Optimistix: Nonlinear Optimisation in Jax+Equinox
    2 projects | news.ycombinator.com | 10 Oct 2023
    Diffrax (https://github.com/patrick-kidger/diffrax).

    Here is the GitHub: https://github.com/patrick-kidger/optimistix

    The elevator pitch is that Optimistix is really fast, especially to compile. It plays nicely with Optax for first-order gradient-based methods, and takes a lot of design inspiration from Equinox, representing the state of all the solvers as standard JAX PyTrees.

  • Scientific computing in JAX
    4 projects | /r/ScientificComputing | 4 Apr 2023
    Sure. So I've got some PyTorch benchmarks here. The main take-away so far has been that for a neural ODE, the backward pass takes about 50% longer in PyTorch, and the forward (inference) pass takes an incredible 100x longer.
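    (A sketch of differentiating through a Diffrax solve is given after this list.)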
  • [D] JAX vs PyTorch in 2023
    5 projects | /r/MachineLearning | 9 Mar 2023
    FWIW this worked for me. :D My full-time job is now writing JAX libraries at Google. Equinox for neural networks, Diffrax for differential equation solvers, etc.
  • Returning to snake's nest after a long journey, any major advances in python for science ?
    7 projects | /r/Python | 24 Jan 2023
    It's relatively early days yet, but JAX is in the process of developing its nascent scientific computing / scientific machine learning ecosystem. Mostly because of its strong autodifferentiation capabilities, excellent JIT compiler etc. (E.g. to show off one of my own projects, Diffrax is the library of diffeq solvers for JAX.)
  • What's the best thing/library you learned this year ?
    12 projects | /r/Python | 16 Dec 2022
    Diffrax - solving ODEs with JAX and computing its derivatives automatically
    functools - love partial and lru_cache
    fastprogress - a simpler progress bar than tqdm
  • PyTorch 2.0
    4 projects | news.ycombinator.com | 2 Dec 2022
    At least prior to this announcement: JAX was much faster than PyTorch for differentiable physics. (Better JIT compiler; reduced Python-level overhead.)

    E.g. for numerical ODE simulation, I've found that Diffrax (https://github.com/patrick-kidger/diffrax) is ~100 times faster than torchdiffeq on the forward pass. The backward pass is much closer, and for this Diffrax is about 1.5 times faster.

    It remains to be seen how PyTorch 2.0 will compare, of course!

    Right now my job is actually building out the scientific computing ecosystem in JAX, so feel free to ping me with any other questions.

  • Python 3.11 is much faster than 3.8
    11 projects | news.ycombinator.com | 26 Oct 2022
    https://github.com/patrick-kidger/equinox

    https://github.com/patrick-kidger/diffrax

    Which are neural network and differential equation libraries for JAX.

    [Obligatory I-am-a-googler-my-opinions-do-not-represent-my-employer...]

  • Ask HN: What's your favorite programmer niche?
    8 projects | news.ycombinator.com | 15 Oct 2022
    Autodifferentiable programming!

    Neural networks are the famous example of this, of course -- but this can be extended to all of scientific computing. ODE/SDE solvers, root-finding algorithms, LQP, molecular dynamics, ...

    These days I'm doing all my work in JAX. (E.g. see Equinox or Diffrax: https://github.com/patrick-kidger/equinox, https://github.com/patrick-kidger/diffrax). A lot of modern work is now based around hybridising such techniques with neural networks.

    I'd really encourage anyone interested to learn how JAX works under the hood as well. (Look up "autodidax".) Lots of clever/novel ideas in its design.
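
The benchmark posts above distinguish a "forward pass" (solving the ODE) from a "backward pass" (differentiating through the solver). A minimal sketch of both, using the documented diffrax.diffeqsolve API; the ~100x and ~1.5x figures quoted above are the posters' own measurements and are not reproduced here:

    import diffrax
    import jax
    import jax.numpy as jnp

    def terminal_value(y0):
        # Exponential decay dy/dt = -y, solved with an adaptive RK method (Tsit5).
        term = diffrax.ODETerm(lambda t, y, args: -y)
        sol = diffrax.diffeqsolve(
            term, diffrax.Tsit5(), t0=0.0, t1=1.0, dt0=0.1, y0=y0,
            stepsize_controller=diffrax.PIDController(rtol=1e-5, atol=1e-5),
        )
        return sol.ys[-1]

    value = terminal_value(jnp.array(1.0))           # forward (inference) pass
    grad = jax.grad(terminal_value)(jnp.array(1.0))  # backward pass, through the solver

The Optimistix posts mention high-level APIs for minimisation, least-squares, root-finding, and fixed-point iteration. A hedged sketch of the minimisation entry point, assuming the optx.minimise and optx.BFGS names from the Optimistix documentation:

    import jax.numpy as jnp
    import optimistix as optx

    # Minimise the Rosenbrock function with BFGS; the solver state is a PyTree.
    def rosenbrock(y, args):
        x1, x2 = y
        return (1 - x1) ** 2 + 100 * (x2 - x1 ** 2) ** 2

    solver = optx.BFGS(rtol=1e-8, atol=1e-8)
    sol = optx.minimise(rosenbrock, solver, jnp.array([2.0, 2.0]))
    print(sol.value)  # converges to approximately [1.0, 1.0]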

equinox

Posts with mentions or reviews of equinox. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-03.
  • Ask HN: What side projects landed you a job?
    62 projects | news.ycombinator.com | 3 Dec 2023
    I wrote a JAX-based neural network library (Equinox [1]) and numerical differential equation solving library (Diffrax [2]).

    At the time I was just exploring some new research ideas in numerics -- and frankly, procrastinating from writing up my PhD thesis!

    But then one of the teams at Google started using them, so they offered me a job to keep developing them for their needs. Plus I'd get to work in biotech, which was a big interest of mine. This was a clear dream job offer, so I accepted.

    Since then both have grown steadily in popularity (~2.6k GitHub stars) and now see pretty widespread use! I've since started writing several other JAX libraries and we now have a bit of an ecosystem going.

    [1] https://github.com/patrick-kidger/equinox

    [2] https://github.com/patrick-kidger/diffrax

  • [P] Optimistix, nonlinear optimisation in JAX+Equinox!
    3 projects | /r/MachineLearning | 14 Oct 2023
    The elevator pitch is that Optimistix is really fast, especially to compile. It plays nicely with Optax for first-order gradient-based methods, and takes a lot of design inspiration from Equinox, representing the state of all the solvers as standard JAX PyTrees.
  • JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
    12 projects | news.ycombinator.com | 28 Sep 2023
    If you like PyTorch then you might like Equinox, by the way. (https://github.com/patrick-kidger/equinox ; 1.4k GitHub stars now!)
  • Equinox: Elegant easy-to-use neural networks in Jax
    1 project | news.ycombinator.com | 18 Sep 2023
  • Show HN: Equinox (1.3k stars), a JAX library for neural networks and sciML
    1 project | news.ycombinator.com | 5 Sep 2023
  • Pytrees
    2 projects | news.ycombinator.com | 22 May 2023
    You're thinking of `jax.closure_convert`. :)

    (Although technically that works by tracing and extracting all constants from the jaxpr, rather than introspecting the function's closure cells -- it sounds like your trick is the latter.)

    When you discuss dynamic allocation, I'm guessing you're mainly referring to not being able to backprop through `jax.lax.while_loop`. If so, you might find `equinox.internal.while_loop` interesting, which is an unbounded while loop that you can backprop through! The secret sauce is to use a treeverse-style checkpointing scheme.

    https://github.com/patrick-kidger/equinox/blob/f95a8ba13fb35...

  • Writing Python like it’s Rust
    4 projects | /r/rust | 20 May 2023
    I'm a big fan of using ABCs to declare interfaces -- so much so that I have an improved abc.ABCMeta that also handles abstract instance variables and abstract class variables: https://github.com/patrick-kidger/equinox/blob/main/equinox/_better_abstract.py
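    (A sketch of this abstract-attribute pattern is given after this list.)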
  • [D] JAX vs PyTorch in 2023
    5 projects | /r/MachineLearning | 9 Mar 2023
    For daily research, I use Equinox (https://github.com/patrick-kidger/equinox) as a DL library in JAX.
  • [Machinelearning] [D] Current state of JAX vs Pytorch?
    1 project | /r/enfrancais | 24 Feb 2023
  • Training Deep Networks with Data Parallelism in Jax
    6 projects | news.ycombinator.com | 24 Feb 2023
    It sounds like you're concerned about how downstream libraries tend to wrap JAX transformations to handle their own thing? (E.g. `haiku.grad`.)

    If so, then allow me to make my usual advert here for Equinox:

    https://github.com/patrick-kidger/equinox

    This actually works with JAX's native transformations. (There's no `equinox.vmap` for example.)

    On higher-order functions more generally, Equinox offers a way to control these quite carefully, by making ubiquitous use of callables that are also pytrees. E.g. a neural network is both a callable in that it has a forward pass, and a pytree in that it records its parameters in its tree structure.
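
To make "callables that are also pytrees" concrete, here is a minimal sketch in the style of the Equinox README (illustrative, not canonical): the parameters live in the module's tree structure, so JAX's native jax.vmap, jax.grad, and jax.jit apply with no library-specific wrappers.

    import equinox as eqx
    import jax

    class Linear(eqx.Module):
        weight: jax.Array
        bias: jax.Array

        def __init__(self, in_size, out_size, key):
            wkey, bkey = jax.random.split(key)
            self.weight = jax.random.normal(wkey, (out_size, in_size))
            self.bias = jax.random.normal(bkey, (out_size,))

        def __call__(self, x):
            # Callable: the forward pass...
            return self.weight @ x + self.bias

    # ...and a pytree: its leaves are the weight and bias arrays.
    model = Linear(2, 3, jax.random.PRNGKey(0))
    xs = jax.random.normal(jax.random.PRNGKey(1), (10, 2))
    ys = jax.vmap(model)(xs)  # native jax.vmap; there is no equinox.vmap

And for the abstract instance variables mentioned in the "Writing Python like it's Rust" post, a hedged sketch assuming the documented eqx.AbstractVar annotation:

    import equinox as eqx

    class AbstractSolver(eqx.Module):
        # Every concrete subclass must supply this as an instance attribute;
        # a class that still has an AbstractVar cannot be instantiated.
        tolerance: eqx.AbstractVar[float]

    class MySolver(AbstractSolver):
        tolerance: float

    MySolver(tolerance=1e-3)  # fine; omitting `tolerance` would be an error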

What are some alternatives?

When comparing diffrax and equinox you can also consider the following projects:

deepxde - A library for scientific machine learning and physics-informed learning

flax - Flax is a neural network library for JAX that is designed for flexibility.

tiny-cuda-nn - Lightning fast C++/CUDA neural network framework

dm-haiku - JAX-based neural network library

torchtyping - Type annotations and dynamic checking for a tensor's shape, dtype, names, etc.

juliaup - Julia installer and version multiplexer

treex - A Pytree Module system for Deep Learning in JAX

extending-jax - Extending JAX with custom C++ and CUDA code

vectorflow

elegy - A High Level API for Deep Learning in JAX