[D] Ideal deep learning library

This page summarizes the projects mentioned and recommended in the original post on reddit.com/r/MachineLearning

  • equinox

    Callable PyTrees and filtered transforms => neural networks in JAX. https://docs.kidger.site/equinox/

    On the assumption that you're doing something neural-network related: have a look at the examples section of one of JAX's deep learning libraries (e.g. the Equinox example that trains an RNN on a toy classification problem).
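Equinox's core idea, representing a model as a pytree whose leaves are *filtered* into trainable arrays and static fields, can be sketched in plain Python. This is an illustrative toy (a flat dict, forward finite differences, and made-up names like `partition` and `filtered_grad`), not Equinox's actual API:

```python
import math

def partition(model, is_param=lambda v: isinstance(v, float)):
    """Split a model (here: a flat dict) into trainable and static leaves."""
    params = {k: v for k, v in model.items() if is_param(v)}
    static = {k: v for k, v in model.items() if not is_param(v)}
    return params, static

def combine(params, static):
    return {**params, **static}

def filtered_grad(loss_fn, eps=1e-6):
    """Differentiate a loss w.r.t. only the float leaves of the model."""
    def grad_fn(model, *args):
        params, static = partition(model)
        base = loss_fn(combine(params, static), *args)
        grads = {}
        for k in params:
            bumped = dict(params)
            bumped[k] = params[k] + eps
            # Forward finite difference along this one parameter.
            grads[k] = (loss_fn(combine(bumped, static), *args) - base) / eps
        return grads
    return grad_fn

# A "model" mixing trainable weight/bias with a static activation function.
model = {"w": 2.0, "b": 0.5, "act": math.tanh}

def loss(m, x, y):
    pred = m["act"](m["w"] * x + m["b"])
    return (pred - y) ** 2

grads = filtered_grad(loss)(model, 1.0, 0.0)  # gradients for "w" and "b" only
```

The point is the filtering step: the activation function is carried along untouched, while the transform only touches numeric leaves. Equinox does the same with JAX pytrees and real autodiff.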

  • functorch

    functorch is JAX-like composable function transforms for PyTorch.

    For what it's worth, it's not as though PyTorch's design prevents function transformations from being implemented. See functorch for an example of grad/vmap function transforms: https://github.com/pytorch/functorch
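The "composable function transforms" idea is easy to see in miniature: each transform takes a function and returns a new function, so they stack freely. Below is a toy pure-Python version (finite-difference `grad`, list-based `vmap`); nothing here is functorch's actual API, it only illustrates the composition pattern:

```python
def grad(f, eps=1e-6):
    """Return a function computing df/dx by central differences."""
    def df(x):
        return (f(x + eps) - f(x - eps)) / (2 * eps)
    return df

def vmap(f):
    """Return a function mapping f over a batch (a list) of inputs."""
    def fv(xs):
        return [f(x) for x in xs]
    return fv

square = lambda x: x * x

# Transforms compose: vmap(grad(f)) differentiates pointwise over a batch.
batched_grad = vmap(grad(square))
print(batched_grad([1.0, 2.0, 3.0]))  # approximately [2.0, 4.0, 6.0]
```

In functorch (and JAX) the transforms operate on traced computations rather than by finite differences, but the composability is the same: `vmap(grad(f))` and `grad(vmap(f))` are both just function calls.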

  • torchsde

    Differentiable SDE solvers with GPU support and efficient sensitivity analysis.

    So not just that paper: our follow-up papers on the same topic, "Neural SDEs as Infinite-Dimensional GANs" and "Efficient and Accurate Gradients for Neural SDEs", are in fact implemented in PyTorch, specifically in the torchsde library. (Disclaimer: I am one of its developers.)
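For intuition about what an SDE solver computes: it integrates dX = f(X) dt + g(X) dW, where dW is a Gaussian Brownian increment. The simplest scheme is Euler-Maruyama, sketched here in self-contained plain Python (this is not torchsde's API; torchsde additionally provides batched GPU tensors and memory-efficient adjoint gradients):

```python
import math
import random

def euler_maruyama(f, g, x0, t0, t1, steps, rng):
    """Integrate dX = f(X) dt + g(X) dW from t0 to t1 with fixed steps."""
    dt = (t1 - t0) / steps
    x = x0
    for _ in range(steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        x = x + f(x) * dt + g(x) * dw
    return x

# Ornstein-Uhlenbeck process: mean-reverting drift, constant diffusion.
theta, sigma = 1.0, 0.3
f = lambda x: -theta * x
g = lambda x: sigma

rng = random.Random(0)
samples = [euler_maruyama(f, g, 2.0, 0.0, 5.0, 500, rng) for _ in range(200)]
mean = sum(samples) / len(samples)  # mean-reversion pulls this toward 0
```

The "efficient sensitivity analysis" in the library's tagline refers to differentiating through this kind of simulation (via the stochastic adjoint method) rather than storing every intermediate step.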

  • tsalib

    Tensor Shape Annotation Library (numpy, tensorflow, pytorch, ...)

    The one thing I really *really* wish got more attention is named tensors and the tensor type system. Tensor misalignment errors are a constant source of silently failing bugs. While third-party libraries have attempted to fill this gap, it really needs better native support. In particular, it seems like bad form for programmers to have to remember the specific alignment and broadcasting rules, and then apply them to an often poorly documented order of tensor indices. I'd really like to see something like tsalib's warp operator made part of the main library and generalized to arbitrary function application, like a named-tensor version of fold, but preferably using notation closer to that of torchtyping.
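The kind of check being asked for can be prototyped as a decorator that binds named axes across arguments and fails loudly on mismatch. A runtime-only sketch in plain Python follows; the names (`shaped`, the tuple-of-axis-names spec) are made up for illustration, and both tsalib and torchtyping have their own, richer notation:

```python
from functools import wraps

def shaped(**specs):
    """Check each keyword argument's .shape against a tuple of axis names,
    binding each name (e.g. 'batch') consistently across all arguments."""
    def deco(fn):
        @wraps(fn)
        def wrapper(**tensors):
            bound = {}
            for name, spec in specs.items():
                shape = tensors[name].shape
                if len(shape) != len(spec):
                    raise TypeError(
                        f"{name}: expected rank {len(spec)}, got {len(shape)}")
                for axis, size in zip(spec, shape):
                    if bound.setdefault(axis, size) != size:
                        raise TypeError(
                            f"{name}: axis '{axis}' is {size}, "
                            f"but was already bound to {bound[axis]}")
            return fn(**tensors)
        return wrapper
    return deco

class Tensor:  # stand-in for any array type exposing a .shape attribute
    def __init__(self, *shape):
        self.shape = shape

@shaped(x=("batch", "time", "feat"), mask=("batch", "time"))
def attend(x, mask):
    return "ok"

attend(x=Tensor(4, 10, 8), mask=Tensor(4, 10))   # passes
# attend(x=Tensor(4, 10, 8), mask=Tensor(4, 9))  # raises: 'time' mismatch
```

A static version of this (catching the mismatch before the program runs) is exactly what the comment argues needs first-class support in the framework, since runtime checks only fire on the shapes you happen to test.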

  • torchtyping

    Type annotations and dynamic checking for a tensor's shape, dtype, names, etc.

  • dex-lang

    Research language for array processing in the Haskell/ML family

    Probably the most well-developed options I know for this at the moment are Dex and Hasktorch.

  • hasktorch

    Tensors and neural networks in Haskell

  • Roslyn

    The Roslyn .NET compiler provides C# and Visual Basic languages with rich code analysis APIs.

    Thanks for the references, interesting projects (including Equinox). I know that C# is not THE language for ML research, and it also lacks variadic generics (and const generics), but they recently introduced something called Source Generators. You can generate C# code based on existing C# code (syntax trees and so on), and it hooks into the static analysis phase. It is integrated with IDEs (JetBrains and Visual Studio), and you can define your own warnings and error messages, so it feels pretty native. I'm not sure how it compares to Rust's macros or whether there are roadblocks along the way, but it may be an option for ensuring compile-time shape safety for nd-arrays.

