The Elements of Differentiable Programming

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  1. ForwardDiff.jl

    Forward Mode Automatic Differentiation for Julia

    You seem somewhat obsessed with the idea that reverse-mode autodiff is not the same technique as forward-mode autodiff. It makes you... angry? Seems like such a trivial thing to act like a complete fool over.

    What's up with that?

    Anyway, here's a forward differentiation package with a file that might interest you

    https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/src/...
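    For readers unfamiliar with the technique being argued about: forward-mode AD via dual numbers can be sketched in a few lines of plain Python. This is a minimal illustration of the idea, not how ForwardDiff.jl is actually implemented:

```python
# Minimal dual-number forward-mode AD sketch (illustrative only).
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value  # primal part
        self.deriv = deriv  # tangent part: coefficient of eps, where eps^2 = 0

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # The product rule falls out of (a + b*eps)(c + d*eps) with eps^2 = 0.
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate f'(x) by seeding the tangent with 1."""
    return f(Dual(x, 1.0)).deriv


# d/dx (x^2 + 3x) at x = 2 is 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # -> 7.0
```

    Overloading arithmetic on the (primal, tangent) pair is the whole trick; a real library adds many more operators and rules, but nothing conceptually different.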

  3. ceres-solver

    A large scale non-linear optimization library

    I can't reply to the guy saying Julia is the only one, but there are others.

    Ceres uses dual numbers

    https://github.com/ceres-solver/ceres-solver/blob/master/inc...

    This library from Google is used everywhere in robotics, so it's hardly some backwater little side project.

    So does the C++ autodiff library.

  4. autodiff

    automatic differentiation made easier for C++

  5. PyTorch

    Tensors and Dynamic neural networks in Python with strong GPU acceleration

    Sure, right here: https://github.com/pytorch/pytorch/blob/main/torch/autograd/...

    Here's the documentation: https://pytorch.org/tutorials/intermediate/forward_ad_usage....

    > When an input, which we call “primal”, is associated with a “direction” tensor, which we call “tangent”, the resultant new tensor object is called a “dual tensor” for its connection to dual numbers[0].
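    The quoted terminology maps directly onto PyTorch's `torch.autograd.forward_ad` module (available since around PyTorch 1.11). A small sketch of the usage described in the linked tutorial:

```python
import torch
import torch.autograd.forward_ad as fwAD

primal = torch.tensor(3.0)
tangent = torch.tensor(1.0)  # the "direction" to differentiate along

with fwAD.dual_level():
    # Pair the primal with its tangent to form a dual tensor.
    x = fwAD.make_dual(primal, tangent)
    y = x * x  # ops propagate tangents alongside primals
    y_primal, y_tangent = fwAD.unpack_dual(y)

# y_primal is 9.0; y_tangent is 6.0, i.e. d/dx x^2 at x = 3
```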

  6. jax

    Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

    The dual numbers exist just as surely as the real numbers and have been in use for well over 100 years.

    https://en.m.wikipedia.org/wiki/Dual_number

    PyTorch has had them for many years.

    https://pytorch.org/docs/stable/generated/torch.autograd.for...

    JAX implements them and uses them exactly as stated in this thread.

    https://github.com/google/jax/discussions/10157#discussionco...

    As you so eloquently stated, "you shouldn't be proclaiming things you don't actually know on a public forum," and doubly so when your claimed "corrections" are so demonstrably and totally incorrect.
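    For reference, JAX exposes forward mode as `jax.jvp` (Jacobian-vector product), which is exactly the primal/tangent pairing discussed in this entry. A minimal sketch:

```python
import jax

def f(x):
    return x ** 3

# jvp evaluates f(x) and its directional derivative in one forward pass.
primal_out, tangent_out = jax.jvp(f, (2.0,), (1.0,))
# primal_out = 8.0, tangent_out = 12.0 (d/dx x^3 at x = 2)
```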

NOTE: The number of mentions on this list reflects mentions on common posts plus user-suggested alternatives, so a higher number indicates a more popular project.

Suggest a related project

Related posts

  • The Julia language has a number of correctness flaws

    19 projects | news.ycombinator.com | 16 May 2022
  • AXLearn: Apple's Deep Learning library built on top of Jax

    1 project | news.ycombinator.com | 18 Feb 2025
  • How to Install & Run VideoLLaMA3-7B Locally

    3 projects | dev.to | 13 Feb 2025
  • An Introduction to Neural Ordinary Differential Equations [pdf]

    2 projects | news.ycombinator.com | 11 Jan 2025
  • PyTorch in the Browser (JavaScript)

    1 project | news.ycombinator.com | 31 Oct 2024
