autograd VS Enzyme

Compare autograd vs Enzyme and see what their differences are.

                autograd      Enzyme
Mentions        6             16
Stars           6,797         1,159
Stars growth    0.7%          1.5%
Latest commit   7 days ago    2 days ago
Activity        6.0           9.7
Language        Python        LLVM
License         MIT License   GNU General Public License v3.0 or later
The number of mentions indicates the total number of mentions that we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.

autograd

Posts with mentions or reviews of autograd. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-09-28.
  • JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
    12 projects | news.ycombinator.com | 28 Sep 2023
    Actually, that's never been a constraint for JAX autodiff. JAX grew out of the original Autograd (https://github.com/hips/autograd), so differentiating through Python control flow always worked. It's jax.jit and jax.vmap that place constraints on control flow, requiring structured control-flow combinators instead.
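
    A minimal sketch of that distinction (the functions f and g here are illustrative, not from the thread): jax.grad traces with concrete values, so ordinary Python branching differentiates fine, but under jax.jit the same branch must be expressed with a structured combinator such as jax.lax.cond.

    ```python
    import jax
    import jax.numpy as jnp

    def f(x):
        # Plain Python control flow: fine under jax.grad, which traces with
        # concrete values, just as the original Autograd did.
        if x > 0:
            return x ** 2
        return jnp.sin(x)

    print(jax.grad(f)(3.0))    # 6.0
    print(jax.grad(f)(-1.0))   # cos(-1.0) ~ 0.5403

    # Under jax.jit the branch must become a structured combinator instead:
    @jax.jit
    def g(x):
        return jax.lax.cond(x > 0, lambda v: v ** 2, jnp.sin, x)

    print(jax.grad(g)(3.0))    # also 6.0
    ```
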
  • Autodidax: Jax Core from Scratch (In Python)
    4 projects | news.ycombinator.com | 11 Feb 2023
    I'm sure there's a lot of good material around, but here are some links that are conceptually very close to the linked Autodidax.

    There's [Autodidact](https://github.com/mattjj/autodidact), a predecessor to Autodidax, which was a simplified implementation of [the original Autograd](https://github.com/hips/autograd). It focuses on reverse-mode autodiff, not building an open-ended transformation system like Autodidax. It's also pretty close to the content in [these lecture slides](https://www.cs.toronto.edu/~rgrosse/courses/csc321_2018/slid...) and [this talk](http://videolectures.net/deeplearning2017_johnson_automatic_...). But the autodiff in Autodidax is more sophisticated and reflects clearer thinking. In particular, Autodidax shows how to implement forward- and reverse-modes using only one set of linearization rules (like in [this paper](https://arxiv.org/abs/2204.10923)).

    Here's [an even smaller and more recent variant](https://gist.github.com/mattjj/52914908ac22d9ad57b76b685d19a...), a single ~100 line file for reverse-mode AD on top of NumPy, which was live-coded during a lecture. There's no explanatory material to go with it though.
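
    In the same spirit, though without reproducing the linked gist, here is a hedged sketch of scalar reverse-mode AD on plain Python values; the Var class and backward function are hypothetical names, not autograd's API.

    ```python
    class Var:
        """A value in the computation graph plus the links needed for backprop."""
        def __init__(self, value, parents=()):
            self.value = value
            self.parents = parents  # pairs of (parent Var, local partial derivative)
            self.grad = 0.0

        def __add__(self, other):
            return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

        def __mul__(self, other):
            return Var(self.value * other.value,
                       [(self, other.value), (other, self.value)])

    def backward(out):
        # Topologically order the graph so each node's grad is complete
        # before being propagated to its parents.
        order, seen = [], set()
        def visit(node):
            if id(node) not in seen:
                seen.add(id(node))
                for parent, _ in node.parents:
                    visit(parent)
                order.append(node)
        visit(out)
        out.grad = 1.0
        for node in reversed(order):
            for parent, local in node.parents:
                parent.grad += local * node.grad

    x = Var(3.0)
    y = x * x + x     # d/dx (x^2 + x) = 2x + 1 = 7 at x = 3
    backward(y)
    print(x.grad)     # 7.0
    ```
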

  • Numba: A High Performance Python Compiler
    11 projects | news.ycombinator.com | 27 Dec 2022
    XLA is "higher level" than what Numba produces.

    You may be able to get the equivalent of jax via numba+numpy+autograd[1], but I haven't tried it before.

    IMHO, jax is best thought of as a numerical computation library that happens to include autograd, vmapping, and pmapping, and that provides a high-level interface to XLA.

    I have built a numerical optimisation library with it, and although a few things became verbose, it was a rather pleasant experience: the natural vmapping made everything a breeze, and I didn't have to write the gradients for my testing functions, except for special cases involving exponents and logs that needed a bit of delicate care.

    [1] https://github.com/HIPS/autograd
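
    As a hedged illustration of the grad-plus-vmap workflow the comment describes (the Rosenbrock function is a stand-in; the post doesn't name its actual test functions):

    ```python
    import jax
    import jax.numpy as jnp

    # A standard optimisation test function with its minimum at (1, 1).
    def rosenbrock(p):
        x, y = p
        return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

    grad_f = jax.grad(rosenbrock)    # no hand-written gradient needed
    batched_grad = jax.vmap(grad_f)  # evaluate the gradient at many points at once

    points = jnp.array([[0.0, 0.0], [1.0, 1.0], [-1.0, 2.0]])
    print(batched_grad(points))      # the row for (1, 1) is [0., 0.]
    ```
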

  • Run Your Own DALL·E Mini (Craiyon) Server on EC2
    16 projects | dev.to | 26 Jul 2022
    Next, we want the code in the https://github.com/hrichardlee/dalle-playground repo, and we want to construct a pip environment from the backend/requirements.txt file in that repo. We were almost able to use the saharmor/dalle-playground repo as-is, but we had to make one change to add the jax[cuda] package to the requirements.txt file. In case you haven’t seen jax before, jax is a machine-learning library from Google, roughly equivalent to Tensorflow or PyTorch. It combines Autograd for automatic differentiation and XLA (accelerated linear algebra) for JIT-compiling numpy-like code for Google’s TPUs or Nvidia’s CUDA API for GPUs. The CUDA support requires explicitly selecting the [cuda] option when we install the package.
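
    Concretely, the one change described above is a single extra line in backend/requirements.txt (shown as an excerpt; the rest of the file isn't reproduced here):

    ```
    # backend/requirements.txt (excerpt): select the CUDA build of jax
    jax[cuda]
    ```
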
  • Trade-Offs in Automatic Differentiation: TensorFlow, PyTorch, Jax, and Julia
    7 projects | news.ycombinator.com | 25 Dec 2021
    > fun fact, the Jax folks at Google Brain did have a Python source code transform AD at one point but it was scrapped essentially because of these difficulties

    I assume you mean autograd?

    https://github.com/HIPS/autograd

  • JAX - COMPARING WITH THE BIG ONES
    2 projects | /r/CryptocurrencyICO | 6 Sep 2021
    These four points lead to an enormous differentiation in the ecosystem: Keras, for example, was originally conceived to focus almost completely on point (4), leaving the other tasks to a backend engine. Autograd, on the other hand, focused in 2015 on the first two points, allowing users to write code using only "classic" Python and NumPy constructs, and subsequently provided many options for point (2). Autograd's simplicity greatly influenced the development of the libraries that followed, but it was penalized by its clear lack of points (3) and (4), i.e., adequate techniques to speed up the code and sufficiently abstract modules for neural network development.
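
    That "classic Python and NumPy" style is easiest to see in autograd's canonical tanh example, adapted from the HIPS/autograd README:

    ```python
    import autograd.numpy as np   # thinly-wrapped NumPy
    from autograd import grad

    def tanh(x):
        # An ordinary Python function written with NumPy constructs.
        y = np.exp(-2.0 * x)
        return (1.0 - y) / (1.0 + y)

    grad_tanh = grad(tanh)          # derivative w.r.t. the first argument
    print(grad_tanh(1.0))           # ~ 0.419974
    print((tanh(1.0001) - tanh(0.9999)) / 0.0002)  # finite-difference check
    ```
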

Enzyme

Posts with mentions or reviews of Enzyme. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-12-06.
  • Show HN: Curve Fitting Bezier Curves in WASM with Enzyme AD
    1 project | news.ycombinator.com | 13 Oct 2023
    Automatic differentiation is done using https://enzyme.mit.edu/
  • Ask HN: What Happened to TensorFlow Swift
    1 project | news.ycombinator.com | 27 May 2023
    Lattner left Google and was the primary reason they chose Swift, so they lost interest.

    If you're asking from an ML perspective, I believe the original motivation was to incorporate automatic differentiation in the Swift compiler. I believe Enzyme is the spiritual successor.

    https://github.com/EnzymeAD/Enzyme

  • Show HN: Port of OpenAI's Whisper model in C/C++
    9 projects | news.ycombinator.com | 6 Dec 2022
    https://ispc.github.io/ispc.html

    For auto-differentiation when I need performance or memory efficiency, I currently use Tapenade ( http://tapenade.inria.fr:8080/tapenade/index.jsp ) and/or manually written gradients when I need to fuse some kernels, but Enzyme ( https://enzyme.mit.edu/ ) is also very promising.

    MPI for parallelization across machines.

  • Do you consider making a physics engine (for RL) worth it?
    3 projects | /r/rust | 8 Oct 2022
    For autodiff, we are currently working again on publishing a new Enzyme (https://enzyme.mit.edu) frontend for Rust that can also handle pure Rust types; the first version should be done in about a week.
  • What is a really cool thing you would want to write in Rust but don't have enough time, energy or bravery for?
    21 projects | /r/rust | 8 Jun 2022
    Have you taken a look at EnzymeAD? There is a group porting it to Rust.
  • The Julia language has a number of correctness flaws
    19 projects | news.ycombinator.com | 16 May 2022
    Enzyme dev here, so take everything I say as being a bit biased:

    While Enzyme is, by design, able to run very fast by operating within the compiler (see https://proceedings.neurips.cc/paper/2020/file/9332c513ef44b... for details), it aggressively prioritizes correctness. Of course that doesn't mean there aren't bugs (we're only human and it's a large codebase [https://github.com/EnzymeAD/Enzyme], especially if you're trying out newly-added features).

    Notably, this is where the current rough edges for Julia users are -- Enzyme will throw an error saying it couldn't prove correctness, rather than running (there is a flag for "making a best guess", but that's off by default). The exception to this is garbage collection, for which you can either run a static analysis, or stick to the "officially supported" subset of Julia that Enzyme specifies.

    Incidentally, this is also where being a cross-language tool is really nice -- namely we can see edge cases/bug reports from any LLVM-based language (C/C++, Fortran, Swift, Rust, Python, Julia, etc). So far the biggest code we've handled (and verified correctness for) was O(1million) lines of LLVM from some C++ template hell.

    I will also add that while I absolutely love (and will do everything I can to support) Enzyme being used throughout arbitrary Julia code: in addition to exposing a nice user-facing interface for custom rules in the Enzyme Julia bindings like Chris mentioned, some Julia-specific features (such as full garbage collection support) also need handling in Enzyme.jl, before Enzyme can be considered an "all Julia AD" framework. We are of course working on all of these things (and the more the merrier), but there's only a finite amount of time in the day. [^]

    [^] Incidentally, this is in contrast to say C++/Fortran/Swift/etc, where Enzyme has much closer to whole-language coverage than Julia -- this isn't anything against GC/Julia/etc, but we just have things on our todo list.

  • Jax vs. Julia (Vs PyTorch)
    4 projects | news.ycombinator.com | 4 May 2022
    Idk, Enzyme is pretty next gen, all the way down to LLVM code.

    https://github.com/EnzymeAD/Enzyme

  • What's everyone working on this week (7/2022)?
    15 projects | /r/rust | 14 Feb 2022
    I'm working on merging my build-tool for (oxide)-enzyme into Enzyme itself. Also looking into improving the documentation.
  • Wsmoses/Enzyme: High-performance automatic differentiation of LLVM
    1 project | news.ycombinator.com | 22 Jan 2022
  • Trade-Offs in Automatic Differentiation: TensorFlow, PyTorch, Jax, and Julia
    7 projects | news.ycombinator.com | 25 Dec 2021
    That seems to be one of the points of enzyme[1], which was mentioned in the article.

    [1] - https://enzyme.mit.edu/

    Being able, in effect, to do interprocedural cross-language analysis seems awesome.

What are some alternatives?

When comparing autograd and Enzyme you can also consider the following projects:

SwinIR - SwinIR: Image Restoration Using Swin Transformer (official repository)

Zygote.jl - 21st century AD

jaxonnxruntime - A user-friendly tool chain that enables the seamless execution of ONNX models using JAX as the backend.

Flux.jl - Relax! Flux is the ML library that doesn't make you tensor

autodidact - A pedagogical implementation of Autograd

Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration

fbpic - Spectral, quasi-3D Particle-In-Cell code, for CPU and GPU

Lux.jl - Explicitly Parameterized Neural Networks in Julia

pure_numba_alias_sampling - Pure numba version of the Alias sampling algorithm from L. Devroye's "Non-Uniform Random Variate Generation"

linfa - A Rust machine learning framework.

qha - A Python package for calculating thermodynamic properties under quasi-harmonic approximation, using data from ab-initio calculations

faust - Functional programming language for signal processing and sound synthesis