Julia Differential Equations

Open-source Julia projects categorized as differential equations

Top 10 Julia Differential Equation Projects

  • DifferentialEquations.jl

    Multi-language suite for high-performance solvers of differential equations and scientific machine learning (SciML) components. Ordinary differential equations (ODEs), stochastic differential equations (SDEs), delay differential equations (DDEs), differential-algebraic equations (DAEs), and more in Julia.
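
    As a quick orientation to the common interface, here is a minimal sketch of defining and solving an ODE with DifferentialEquations.jl; the logistic equation, parameter values, and the Tsit5 solver choice are illustrative, not taken from the projects above.

      using DifferentialEquations

      # Logistic growth: du/dt = p*u*(1 - u)
      f(u, p, t) = p * u * (1 - u)
      u0 = 0.1                       # initial condition
      tspan = (0.0, 10.0)            # integration interval
      p = 1.5                        # growth-rate parameter

      prob = ODEProblem(f, u0, tspan, p)
      sol = solve(prob, Tsit5())     # Tsit5 is a common non-stiff default
      sol(5.0)                       # continuous (dense) output at t = 5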

  • NeuralPDE.jl

    Physics-Informed Neural Networks (PINN) Solvers of (Partial) Differential Equations for Scientific Machine Learning (SciML) accelerated simulation

    Project mention: Automatically install huge number of dependency? | /r/Julia | 2023-05-31

    The documentation has a manifest associated with it: https://docs.sciml.ai/NeuralPDE/dev/#Reproducibility. Instantiating the manifest will give you all of the exact versions used for the documentation build (https://github.com/SciML/NeuralPDE.jl/blob/gh-pages/v5.7.0/assets/Manifest.toml). You just ]instantiate folder_of_manifest. Or you can use the Project.toml.
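
    A minimal sketch of that workflow, assuming the documentation's Project.toml and Manifest.toml have been downloaded into a local folder (the folder name below is hypothetical):

      using Pkg

      # Activate the environment defined by the downloaded Project.toml/Manifest.toml,
      # then install the exact pinned versions used for the documentation build.
      Pkg.activate("neuralpde_docs_env")
      Pkg.instantiate()

    Equivalently, from the Pkg REPL: ]activate neuralpde_docs_env followed by instantiate.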

  • DiffEqFlux.jl

    Pre-built implicit layer architectures with O(1) backprop, GPUs, and stiff+non-stiff DE solvers, demonstrating scientific machine learning (SciML) and physics-informed machine learning methods
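
    For orientation, a minimal neural ODE sketch with DiffEqFlux.jl, assuming the current Lux-based API; the layer sizes, time span, and initial condition are arbitrary.

      using DiffEqFlux, OrdinaryDiffEq, Lux, Random

      rng = Random.default_rng()

      # A small network defining the right-hand side du/dt = NN(u)
      dudt = Lux.Chain(Lux.Dense(2 => 16, tanh), Lux.Dense(16 => 2))
      node = NeuralODE(dudt, (0.0f0, 1.0f0), Tsit5(); saveat = 0.1f0)

      ps, st = Lux.setup(rng, node)   # network parameters and layer state
      u0 = Float32[1.0, 0.0]
      sol, _ = node(u0, ps, st)       # solve the ODE defined by the network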

  • OrdinaryDiffEq.jl

    High performance ordinary differential equation (ODE) and differential-algebraic equation (DAE) solvers, including neural ordinary differential equations (neural ODEs) and scientific machine learning (SciML)

    Project mention: Modern Numerical Solving methods | /r/DifferentialEquations | 2023-07-06

    There has been a lot of research on Runge-Kutta methods in the last couple of decades, which has resulted in all kinds of specialized Runge-Kutta methods: high-order ones, RK methods for stiff problems, embedded RK methods that allow adaptive step size control, RK-Nystrom methods for second-order problems, symplectic RK methods that preserve energy (e.g. a Hamiltonian), and so on. If you are interested in the numerics and the use cases, I highly recommend checking out the Julia library OrdinaryDiffEq (https://github.com/SciML/OrdinaryDiffEq.jl). If you look into the documentation you will find a LOT of implemented RK methods for all kinds of use cases.
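
    To make that concrete, a small sketch of switching between a few of those solver families in OrdinaryDiffEq.jl; the test equation is arbitrary.

      using OrdinaryDiffEq

      f(u, p, t) = -u                       # simple linear test problem
      prob = ODEProblem(f, 1.0, (0.0, 1.0))

      solve(prob, Tsit5())                  # general-purpose explicit RK, adaptive steps
      solve(prob, Vern9())                  # high-order explicit RK for tight tolerances
      solve(prob, KenCarp4())               # implicit (ESDIRK) RK suited to stiff problems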

  • SciMLSensitivity.jl

    A component of the DiffEq ecosystem for enabling sensitivity analysis for scientific machine learning (SciML). Optimize-then-discretize, discretize-then-optimize, adjoint methods, and more for ODEs, SDEs, DDEs, DAEs, etc.
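
    A hedged sketch of typical usage: differentiating an ODE solution with respect to its parameters via Zygote and an adjoint sensitivity algorithm. The loss function and parameter values are made up for illustration.

      using OrdinaryDiffEq, SciMLSensitivity, Zygote

      f(u, p, t) = p[1] .* u
      prob = ODEProblem(f, [1.0], (0.0, 1.0), [1.3])

      function loss(p)
          sol = solve(prob, Tsit5(); p = p, saveat = 0.1,
                      sensealg = InterpolatingAdjoint())   # one of several adjoint choices
          sum(abs2, Array(sol))
      end

      grad, = Zygote.gradient(loss, [1.3])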

  • DiffEqBase.jl

    The lightweight Base library for shared types and functionality for defining differential equation and scientific machine learning (SciML) problems

  • ComponentArrays.jl

    Arrays with arbitrarily nested named components.
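
    For example, a nested ComponentArray still indexes and broadcasts like a flat vector, which is what makes it convenient as a state or parameter container for the DiffEq solvers; the component names and values below are arbitrary.

      using ComponentArrays

      x = ComponentArray(a = 1.0, b = [2.0, 3.0], c = (d = 4.0, e = [5.0, 6.0]))

      x.a          # 1.0          -- access by name
      x.c.e        # [5.0, 6.0]   -- nested components
      length(x)    # 6            -- still a flat 6-element vector underneath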

  • DiffEqGPU.jl

    GPU-acceleration routines for DifferentialEquations.jl and the broader SciML scientific machine learning ecosystem

    Project mention: 2023 was the year that GPUs stood still | news.ycombinator.com | 2023-12-29

    Indeed, and this year we created a system for compiling ODE code not just to optimized CUDA kernels but also to OneAPI kernels, AMD GPU kernels, and Metal. The peer-reviewed version is here (https://www.sciencedirect.com/science/article/abs/pii/S00457...), the open-access version is here (https://arxiv.org/abs/2304.06835), and the open-source code is at https://github.com/SciML/DiffEqGPU.jl. The key point the paper describes is that in this case kernel generation is about 20x-100x faster than PyTorch and Jax (see the Jax compilation done multiple ways in this notebook: https://colab.research.google.com/drive/1d7G-O5JX31lHbg7jTzz...; there is extra overhead from calling Julia from Python, but it still shows a 10x).

    The point really is that while deep learning libraries are amazing, at the end of the day they are DSLs and pull towards one specific way of computing and parallelizing. It turns out that way of parallelizing is good for deep learning, but not for everything you may want to accelerate. Sometimes (i.e., in cases that aren't dominated by large linear algebra) building problem-specific kernels is a major win, and it's over-extrapolating to see ML frameworks do well on GPUs and conclude that's all that's required. There are many ways to parallelize code; ML libraries hardcode one very specific way, which is good for what they are used for, but not for every problem that can arise.
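
    A hedged sketch of that ensemble kernel-generation workflow, here with the CUDA backend (other backends are selected by passing a different backend object); the Lorenz system, parameter perturbation, and trajectory count are illustrative.

      using OrdinaryDiffEq, DiffEqGPU, CUDA, StaticArrays

      # Out-of-place, StaticArray-based RHS so each trajectory compiles into a GPU kernel
      function lorenz(u, p, t)
          σ, ρ, β = p
          SVector(σ * (u[2] - u[1]),
                  u[1] * (ρ - u[3]) - u[2],
                  u[1] * u[2] - β * u[3])
      end

      u0 = @SVector [1.0f0, 0.0f0, 0.0f0]
      p  = @SVector [10.0f0, 28.0f0, 8.0f0 / 3.0f0]
      prob = ODEProblem{false}(lorenz, u0, (0.0f0, 10.0f0), p)

      # Solve many trajectories at once, each with slightly perturbed parameters
      prob_func = (prob, i, repeat) -> remake(prob, p = p .* (0.9f0 + 0.2f0 * rand(Float32)))
      ensemble  = EnsembleProblem(prob, prob_func = prob_func, safetycopy = false)

      sol = solve(ensemble, GPUTsit5(), EnsembleGPUKernel(CUDA.CUDABackend());
                  trajectories = 10_000, saveat = 0.1f0)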

  • StochasticDiffEq.jl

    Solvers for stochastic differential equations which connect with the scientific machine learning (SciML) ecosystem
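
    A minimal sketch: geometric Brownian motion as a scalar SDE with separate drift and diffusion functions; the coefficients are arbitrary.

      using StochasticDiffEq

      drift(u, p, t)     = p[1] * u        # deterministic part
      diffusion(u, p, t) = p[2] * u        # noise amplitude

      prob = SDEProblem(drift, diffusion, 1.0, (0.0, 1.0), [0.05, 0.2])
      sol  = solve(prob, SOSRI())          # adaptive method for diagonal/scalar noise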

  • BoundaryValueDiffEq.jl

    Boundary value problem (BVP) solvers for scientific machine learning (SciML)
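
    A hedged sketch of a two-point BVP, loosely following the package's pendulum-style tutorial; the boundary values, interval, and grid spacing are arbitrary.

      using BoundaryValueDiffEq

      # Pendulum: θ'' = -(g/L) sin(θ), written as a first-order system
      function pendulum!(du, u, p, t)
          du[1] = u[2]
          du[2] = -9.81 * sin(u[1])
      end

      # Residuals of the boundary conditions at both ends of the interval
      function bc!(residual, u, p, t)
          residual[1] = u[1][1] - π / 4    # θ(0) = π/4
          residual[2] = u[end][1]          # θ(T) = 0
      end

      prob = BVProblem(pendulum!, bc!, [π / 4, 0.0], (0.0, 1.5))
      sol  = solve(prob, MIRK4(); dt = 0.05)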

NOTE: The open source projects on this list are ordered by number of GitHub stars. The number of mentions indicates repo mentions in the last 12 months or since we started tracking (Dec 2020). The latest post mention was on 2023-12-29.

Index

What are some of the best open-source differential equation projects in Julia? This list will help you:

Rank  Project                    Stars
   1  DifferentialEquations.jl   2,729
   2  NeuralPDE.jl                 891
   3  DiffEqFlux.jl                828
   4  OrdinaryDiffEq.jl            497
   5  SciMLSensitivity.jl          305
   6  DiffEqBase.jl                291
   7  ComponentArrays.jl           272
   8  DiffEqGPU.jl                 268
   9  StochasticDiffEq.jl          233
  10  BoundaryValueDiffEq.jl        39