frugally-deep VS equinox

Compare frugally-deep vs equinox and see what their differences are.

equinox

Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/ (by patrick-kidger)
                frugally-deep        equinox
Mentions        5                    31
Stars           1,045                1,819
Growth          -                    -
Activity        8.0                  9.2
Last Commit     13 days ago          15 days ago
Language        C++                  Python
License         MIT License          Apache License 2.0
Mentions - the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

frugally-deep

Posts with mentions or reviews of frugally-deep. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-03.

equinox

Posts with mentions or reviews of equinox. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-03.
  • Ask HN: What side projects landed you a job?
    62 projects | news.ycombinator.com | 3 Dec 2023
    I wrote a JAX-based neural network library (Equinox [1]) and numerical differential equation solving library (Diffrax [2]).

    At the time I was just exploring some new research ideas in numerics -- and frankly, procrastinating from writing up my PhD thesis!

    But then one of the teams at Google started using them, so they offered me a job to keep developing them for their needs. Plus I'd get to work in biotech, which was a big interest of mine. This was a clear dream job offer, so I accepted.

    Since then both have grown steadily in popularity (~2.6k GitHub stars) and now see pretty widespread use! I've since started writing several other JAX libraries and we now have a bit of an ecosystem going.

    [1] https://github.com/patrick-kidger/equinox

  • [P] Optimistix, nonlinear optimisation in JAX+Equinox!
    3 projects | /r/MachineLearning | 14 Oct 2023
    The elevator pitch is that Optimistix is really fast, especially to compile. It plays nicely with Optax for first-order gradient-based methods, and takes a lot of design inspiration from Equinox, representing the state of all the solvers as standard JAX PyTrees.
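
    A minimal sketch of that interplay, under the assumption that Optimistix exposes the optimistix.minimise entry point, the BFGS solver, and the OptaxMinimiser wrapper described in its documentation; the quadratic objective and the tolerances below are arbitrary illustrations rather than anything from the post:

        import jax.numpy as jnp
        import optax
        import optimistix as optx

        # Toy objective with the fn(y, args) -> scalar signature that
        # optimistix.minimise expects.
        def loss(y, args):
            target = jnp.array([1.0, -2.0])
            return jnp.sum((y - target) ** 2)

        y0 = jnp.zeros(2)

        # A quasi-Newton solver; its internal state is an ordinary JAX PyTree.
        solver = optx.BFGS(rtol=1e-6, atol=1e-6)
        sol = optx.minimise(loss, solver, y0)
        print(sol.value)  # approximately [1.0, -2.0]

        # The same interface, with a first-order Optax optimiser wrapped as a solver.
        optax_solver = optx.OptaxMinimiser(optax.sgd(learning_rate=0.1), rtol=1e-6, atol=1e-6)
        sol = optx.minimise(loss, optax_solver, y0)
        print(sol.value)
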
  • JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
    12 projects | news.ycombinator.com | 28 Sep 2023
    If you like PyTorch then you might like Equinox, by the way. (https://github.com/patrick-kidger/equinox ; 1.4k GitHub stars now!)
  • Equinox: Elegant easy-to-use neural networks in Jax
    1 project | news.ycombinator.com | 18 Sep 2023
  • Show HN: Equinox (1.3k stars), a JAX library for neural networks and sciML
    1 project | news.ycombinator.com | 5 Sep 2023
  • Pytrees
    2 projects | news.ycombinator.com | 22 May 2023
    You're thinking of `jax.closure_convert`. :)

    (Although technically that works by tracing and extracting all constants from the jaxpr, rather than introspecting the function's closure cells -- it sounds like your trick is the latter.)

    When you discuss dynamic allocation, I'm guessing you're mainly referring to not being able to backprop through `jax.lax.while_loop`. If so, you might find `equinox.internal.while_loop` interesting, which is an unbounded while loop that you can backprop through! The secret sauce is to use a treeverse-style checkpointing scheme.

    https://github.com/patrick-kidger/equinox/blob/f95a8ba13fb35...
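
    The equinox.internal.while_loop mentioned above is not part of Equinox's stable public API, so treat the following as a hedged sketch whose keyword arguments (kind="checkpointed", max_steps) are assumptions based on the linked source. It differentiates through a data-dependent loop that jax.lax.while_loop cannot handle in reverse mode:

        import jax
        import equinox.internal as eqxi

        def halve_until_small(x, threshold):
            # Keep halving x until it drops to the threshold or below.
            def cond(carry):
                return carry > threshold

            def body(carry):
                return 0.5 * carry

            # kind="checkpointed" selects the treeverse-style checkpointing
            # scheme, which is what makes reverse-mode autodiff possible here.
            return eqxi.while_loop(cond, body, x, max_steps=2**14, kind="checkpointed")

        # jax.lax.while_loop would fail under jax.grad; this version backprops.
        print(jax.grad(halve_until_small)(8.0, 1.0))  # 0.125, i.e. 0.5 ** 3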

  • Writing Python like it’s Rust
    4 projects | /r/rust | 20 May 2023
    I'm a big fan of using ABCs to declare interfaces -- so much so that I have an improved abc.ABCMeta that also handles abstract instance variables and abstract class variables: https://github.com/patrick-kidger/equinox/blob/main/equinox/_better_abstract.py
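
    As a hedged sketch of that pattern, using the names Equinox exposes publicly for it (eqx.AbstractVar for abstract instance variables; ordinary abc.abstractmethod works because eqx.Module's metaclass extends abc.ABCMeta). The vector-field classes are invented purely for illustration:

        import abc
        import equinox as eqx
        import jax
        import jax.numpy as jnp

        class AbstractVectorField(eqx.Module):
            # Abstract *instance* variable: every concrete subclass must supply
            # `dim`, either as a dataclass field or as a property.
            dim: eqx.AbstractVar[int]

            @abc.abstractmethod
            def __call__(self, t, y):
                ...

        class LinearField(AbstractVectorField):
            matrix: jax.Array

            @property
            def dim(self) -> int:
                return self.matrix.shape[0]

            def __call__(self, t, y):
                return self.matrix @ y

        field = LinearField(jnp.eye(2))
        print(field.dim, field(0.0, jnp.ones(2)))
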
  • [D] JAX vs PyTorch in 2023
    5 projects | /r/MachineLearning | 9 Mar 2023
    For daily research, I use Equinox (https://github.com/patrick-kidger/equinox) as a DL library in JAX.
  • [Machinelearning] [D] Current state of JAX vs Pytorch?
    1 project | /r/enfrancais | 24 Feb 2023
  • Training Deep Networks with Data Parallelism in Jax
    6 projects | news.ycombinator.com | 24 Feb 2023
    It sounds like you're concerned about how downstream libraries tend to wrap JAX transformations to handle their own thing? (E.g. `haiku.grad`.)

    If so, then allow me to make my usual advert here for Equinox:

    https://github.com/patrick-kidger/equinox

    This actually works with JAX's native transformations. (There's no `equinox.vmap` for example.)

    On higher-order functions more generally, Equinox offers a way to control these quite carefully, by making ubiquitous use of callables that are also pytrees. E.g. a neural network is both a callable in that it has a forward pass, and a pytree in that it records its parameters in its tree structure.
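
    A small sketch of that idea with eqx.nn.MLP and eqx.filter_grad (layer sizes arbitrary): the model is simultaneously a pytree, so jax.tree_util sees its parameters as leaves, and a callable, so JAX's native transformations apply to it directly.

        import jax
        import jax.numpy as jnp
        import equinox as eqx

        model = eqx.nn.MLP(in_size=2, out_size=1, width_size=32, depth=2,
                           key=jax.random.PRNGKey(0))

        # As a pytree: the weights and biases are just leaves of the model.
        n_params = sum(p.size for p in jax.tree_util.tree_leaves(model)
                       if eqx.is_array(p))
        print(n_params)

        # As a callable: batch it with JAX's native vmap, no equinox.vmap needed.
        xs = jnp.ones((8, 2))
        ys = jax.vmap(model)(xs)

        # Gradients with respect to the model itself; filter_grad simply skips
        # non-array leaves (e.g. activation functions) that jax.grad would reject.
        @eqx.filter_grad
        def loss(m, x, y):
            return jnp.mean((jax.vmap(m)(x) - y) ** 2)

        grads = loss(model, xs, jnp.zeros((8, 1)))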

What are some alternatives?

When comparing frugally-deep and equinox you can also consider the following projects:

Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration

flax - Flax is a neural network library for JAX that is designed for flexibility.

tensorflow - An Open Source Machine Learning Framework for Everyone

dm-haiku - JAX-based neural network library

tiny-cnn - header only, dependency-free deep learning framework in C++14

torchtyping - Type annotations and dynamic checking for a tensor's shape, dtype, names, etc.

Taskflow - A General-purpose Parallel and Heterogeneous Task Programming System

treex - A Pytree Module system for Deep Learning in JAX

Genann - simple neural network library in ANSI C

extending-jax - Extending JAX with custom C++ and CUDA code

nano

diffrax - Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/