tangent

Source-to-Source Debuggable Derivatives in Pure Python (by Google)
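
Tangent's approach is to transform Python source into new, readable Python source that computes derivatives. As a rough illustration of the idea (this is a hand-rolled sketch using the standard `ast` module, not tangent's actual machinery or output), a tiny source-to-source differentiator for expressions built from `+` and `*` might look like:

```python
import ast

def diff(node, var):
    """Return an AST computing d(node)/d(var) via the sum and product rules."""
    if isinstance(node, ast.Name):
        return ast.Constant(1.0 if node.id == var else 0.0)
    if isinstance(node, ast.Constant):
        return ast.Constant(0.0)
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
        return ast.BinOp(diff(node.left, var), ast.Add(), diff(node.right, var))
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult):
        # Product rule: (u*v)' = u'*v + u*v'
        return ast.BinOp(
            ast.BinOp(diff(node.left, var), ast.Mult(), node.right),
            ast.Add(),
            ast.BinOp(node.left, ast.Mult(), diff(node.right, var)),
        )
    raise NotImplementedError(ast.dump(node))

def grad_source(expr, var):
    """Differentiate an expression string and return new Python source."""
    tree = ast.parse(expr, mode="eval")
    dtree = ast.Expression(diff(tree.body, var))
    return ast.unparse(ast.fix_missing_locations(dtree))

src = grad_source("x * x + 3.0 * x", "x")
print(src)                    # readable derivative source, e.g. 1.0 * x + x * 1.0 + ...
print(eval(src, {"x": 2.0}))  # 7.0
```

The output is ordinary Python source you can print, inspect, and step through in a debugger, which is the "debuggable derivatives" property tangent advertises.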

Tangent Alternatives

Similar projects and alternatives to tangent based on common topics and language

  • julia

    350 tangent VS julia

    The Julia Programming Language

  • dex-lang

    25 tangent VS dex-lang

    Research language for array processing in the Haskell/ML family

  • Enzyme

    High-performance automatic differentiation of LLVM and MLIR. (by EnzymeAD)

  • autograd

    6 tangent VS autograd

    Efficiently computes derivatives of numpy code.

  • kotlingrad

    🧩 Shape-Safe Symbolic Differentiation with Algebraic Data Types

  • SmallPebble

    Minimal deep learning library written from scratch in Python, using NumPy/CuPy.

  • pennylane

    PennyLane is a cross-platform Python library for differentiable programming of quantum computers. Train a quantum computer the same way as a neural network.

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a better tangent alternative or higher similarity.

tangent reviews and mentions

Posts with mentions or reviews of tangent. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-12-25.
  • [D] How AD is implemented in JAX/Tensorflow/Pytorch?
    1 project | /r/MachineLearning | 25 Dec 2021
    Thank you so much for the detailed explanation! This reminds me of tangent, an abandoned (?) source-code-transformation AD built by Google a couple of years ago. https://github.com/google/tangent
  • Trade-Offs in Automatic Differentiation: TensorFlow, PyTorch, Jax, and Julia
    7 projects | news.ycombinator.com | 25 Dec 2021
    No, autograd acts similarly to PyTorch in that it builds a tape that it reverses, while PyTorch just comes with more optimized kernels (and kernels that act on GPUs). The AD that I was referencing was tangent (https://github.com/google/tangent). It was an interesting project, but it's hard to see who the audience is. Generating Python source code makes things harder to analyze, and you cannot JIT compile the generated code unless you could JIT compile Python. So you might as well first trace to a JIT-compilable sublanguage and do the actions there, which is precisely what Jax does. In theory tangent is a bit more general, and maybe you could mix it with Numba, but then it's hard to justify. If it's more general, then it's not for the standard ML community, for the same reason as the Julia tools; but then it had better do better than the Julia tools in the specific niche they are targeting. Jax just makes much more sense for the people who were building it; it chose its niche very well.
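    The tape-based style the commenter attributes to autograd and PyTorch can be sketched in a few lines: each operation records a backward rule on a tape at runtime, and reverse-mode AD replays the tape backwards. This is a minimal illustration (names like `Var` and `grad` are my own, not any library's API), in contrast with tangent's ahead-of-time source generation:

    ```python
    class Var:
        """A scalar that records every operation on a shared tape."""

        def __init__(self, value, tape=None):
            self.value = value
            self.grad = 0.0
            self.tape = tape if tape is not None else []

        def __mul__(self, other):
            out = Var(self.value * other.value, self.tape)

            def backward():
                # Product rule, accumulated into the operands' gradients.
                self.grad += out.grad * other.value
                other.grad += out.grad * self.value

            self.tape.append(backward)
            return out

        def __add__(self, other):
            out = Var(self.value + other.value, self.tape)

            def backward():
                self.grad += out.grad
                other.grad += out.grad

            self.tape.append(backward)
            return out

    def grad(f, x):
        """Evaluate f at x, then replay the tape in reverse."""
        v = Var(x)
        out = f(v)
        out.grad = 1.0
        for backward in reversed(out.tape):
            backward()
        return v.grad

    print(grad(lambda x: x * x + x, 3.0))  # d/dx (x^2 + x) at x=3 -> 7.0
    ```

    The tape only exists at runtime, so there is no generated source to read or debug; that trade-off is exactly what distinguishes this style from tangent's approach.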

Stats

Basic tangent repo stats
  • Mentions: 2
  • Stars: 2,280
  • Activity: 10.0
  • Last commit: over 1 year ago

google/tangent is an open source project licensed under Apache License 2.0 which is an OSI approved license.

The primary programming language of tangent is Python.
