NeuralPDE.jl vs ReservoirComputing.jl

Compare NeuralPDE.jl and ReservoirComputing.jl and see what their differences are.

NeuralPDE.jl vs ReservoirComputing.jl:
  • Mentions: 10 vs 1
  • Stars: 903 vs 198
  • Growth: 2.8% vs 1.0%
  • Activity: 9.7 vs 8.5
  • Last commit: 3 days ago vs 21 days ago
  • Language: Julia vs Julia
  • License: GNU General Public License v3.0 or later vs MIT License
Mentions indicates the total number of mentions we have tracked plus the number of user-suggested alternatives.
Stars is the number of stars a project has on GitHub. Growth is month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

NeuralPDE.jl

Posts with mentions or reviews of NeuralPDE.jl. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-05-26.
  • Automatically install huge number of dependency?
    1 project | /r/Julia | 31 May 2023
    The documentation has a manifest associated with it: https://docs.sciml.ai/NeuralPDE/dev/#Reproducibility. Instantiating the manifest will give you all of the exact versions used for the documentation build (https://github.com/SciML/NeuralPDE.jl/blob/gh-pages/v5.7.0/assets/Manifest.toml). You just ]instantiate folder_of_manifest. Or you can use the Project.toml.
  • from Wolfram Mathematica to Julia
    2 projects | /r/Julia | 26 May 2022
    PDE solving libraries are MethodOfLines.jl and NeuralPDE.jl. NeuralPDE is very general but not very fast (it's a limitation of the method, PINNs are just slow). MethodOfLines is still somewhat under development but generates quite fast code.
  • AI and Scientific Computing in Kubernetes with the Julia language, K8sClusterManagers.jl
    11 projects | dev.to | 12 Mar 2022
    GitHub - SciML/NeuralPDE.jl: Physics-Informed Neural Networks (PINN) and Deep BSDE Solvers of Differential Equations for Scientific Machine Learning (SciML) accelerated simulation
  • [D] ICLR 2022 RESULTS ARE OUT
    1 project | /r/MachineLearning | 22 Jan 2022
    That doesn't mean there's no use case for PINNs; we wrote a giant review-ish kind of thing on NeuralPDE.jl to describe where PINNs might be useful. It's just... not the best for publishing. It's things like, (a) where you have not already optimized a classical method, (b) need something that's easy to generate solvers for different cases without too much worry about stability, (c) high dimensional PDEs, and (d) surrogates over parameters. (c) and (d) are the two "real" use cases you can actually publish about, but they aren't quite good for (c) (see mesh-free methods from the old radial basis function literature in comparison) or (d) (there are much faster surrogate techniques). So we are continuing to work on them for (a) and (b) as an interesting option as part of a software suite, but that's not the kind of thing that's really publishable, so I don't think we plan to ever submit that article anywhere.
  • [N] Open Colloquium by Prof. Max Welling: "Is the next deep learning disruption in the physical sciences?"
    1 project | /r/MachineLearning | 21 Oct 2021
  • [D] What are some ideas that are hyped up in machine learning research but don't actually get used in industry (and vice versa)?
    1 project | /r/MachineLearning | 16 Oct 2021
    Did this change at all with the advent of Physics Informed Neural Networks? The Julia language has some really impressive tools for that use case. https://github.com/SciML/NeuralPDE.jl
  • [Research] Input Arbitrary PDE -> Output Approximate Solution
    4 projects | /r/MachineLearning | 10 Jul 2021
    PDEs are difficult because you don't have a simple numerical definition over all PDEs because they can be defined by arbitrarily many functions. u' = Laplace u + f? Define f. u' = g(u) * Laplace u + f? Define f and g. Etc. To cover the space of PDEs you have to go symbolic at some point, and make the discretization methods dependent on the symbolic form. This is precisely what the ModelingToolkit.jl ecosystem is doing. One instantiation of a discretizer on this symbolic form is NeuralPDE.jl which takes a symbolic PDESystem and generates an OptimizationProblem for a neural network which represents the solution via a Physics-Informed Neural Network (PINN).
  • [D] Has anyone worked with Physics Informed Neural Networks (PINNs)?
    3 projects | /r/MachineLearning | 21 May 2021
    NeuralPDE.jl fully automates the approach (and extensions of it, which are required to make it solve practical problems) from symbolic descriptions of PDEs, so that might be a good starting point to both learn the practical applications and get something running in a few minutes. As part of MIT 18.337 Parallel Computing and Scientific Machine Learning I gave an early lecture on physics-informed neural networks (with a two part video) describing the approach, how it works and what its challenges are. You might find those resources enlightening.
  • Doing Symbolic Math with SymPy
    8 projects | news.ycombinator.com | 8 Jan 2021
    What is great about ModelingToolkit.jl is how it's used in practical ways by other packages. E.g. NeuralPDE.jl.

    Compared to SymPy, I feel that it is less of a "how do I integrate this function" package and more about "how can I build this DSL" framework.

    https://github.com/SciML/NeuralPDE.jl
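
The reproducibility tip in the first post above (]instantiate on the documentation's manifest folder) can also be run from a script. A minimal sketch, assuming the documentation's Manifest.toml has been downloaded into a local folder (the path here is a placeholder):

```julia
using Pkg

# Activate the environment whose Manifest.toml pins the exact package
# versions used for the NeuralPDE.jl documentation build.
Pkg.activate("path/to/manifest_folder")

# Install precisely those pinned versions.
Pkg.instantiate()
```

This is equivalent to entering Pkg mode with ] in the REPL and running activate followed by instantiate.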
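
The symbolic workflow described in these posts (write a symbolic PDESystem, then discretize it into an OptimizationProblem via a PINN) looks roughly like the following sketch, based on the NeuralPDE.jl documentation. Treat the specific problem, network size, training strategy, and optimizer here as illustrative assumptions:

```julia
using NeuralPDE, Lux, Optimization, OptimizationOptimisers
import ModelingToolkit: Interval

@parameters x
@variables u(..)
Dxx = Differential(x)^2

# Toy 1D problem on [0, π]: u''(x) = -sin(x), u(0) = u(π) = 0
eq  = Dxx(u(x)) ~ -sin(x)
bcs = [u(0.0) ~ 0.0, u(pi) ~ 0.0]
domains = [x ∈ Interval(0.0, pi)]
@named pde_system = PDESystem(eq, bcs, domains, [x], [u(x)])

# A small neural network represents the solution u(x).
chain = Lux.Chain(Lux.Dense(1, 16, tanh), Lux.Dense(16, 1))
discretization = PhysicsInformedNN(chain, QuadratureTraining())

# The symbolic PDESystem becomes an OptimizationProblem over the
# network parameters, as described in the posts above.
prob = discretize(pde_system, discretization)
res  = Optimization.solve(prob, Adam(0.01); maxiters = 1000)
```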

ReservoirComputing.jl

Posts with mentions or reviews of ReservoirComputing.jl. We have used some of these posts to build our list of alternatives and similar projects.
  • Scientists develop the next generation of reservoir computing
    1 project | news.ycombinator.com | 22 Sep 2021
    Not just similar, the same. If you look through the documentation you'll see that https://github.com/SciML/ReservoirComputing.jl is a collection of reservoir architectures with high performance implementations, and some of our recent research has been pulling reservoir computing to the continuous domain for stiff ODEs (think of it almost like a neural ODE that you do not need to train via gradient descent): https://arxiv.org/abs/2010.04004 . We are definitely digging through this paper with some fascination and will incorporate a lot of its advancements into the software.
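
The reservoir architectures mentioned in this post include echo state networks, where only a linear readout is trained. A minimal usage sketch, assuming ReservoirComputing.jl's ESN interface (constructor names and keyword arguments may differ between package versions, and the data here is made up):

```julia
using ReservoirComputing

# Hypothetical scalar time series, laid out as a 1×N matrix.
data   = sin.(0.1 .* (1:500))'
input  = data[:, 1:end-1]
target = data[:, 2:end]

# Build an echo state network: the random sparse reservoir itself is
# fixed, so no gradient descent is needed to train it.
esn = ESN(input; reservoir = RandSparseReservoir(100))

# Fit only the linear readout by ridge regression, then predict
# one step ahead on the training inputs.
readout    = train(esn, target, StandardRidge(1e-6))
prediction = esn(Predictive(input), readout)
```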

What are some alternatives?

When comparing NeuralPDE.jl and ReservoirComputing.jl you can also consider the following projects:

deepxde - A library for scientific machine learning and physics-informed learning

DifferentialEquations.jl - Multi-language suite for high-performance solvers of differential equations and scientific machine learning (SciML) components. Ordinary differential equations (ODEs), stochastic differential equations (SDEs), delay differential equations (DDEs), differential-algebraic equations (DAEs), and more in Julia.

SymPy - A computer algebra system written in pure Python

Catalyst.jl - Chemical reaction network and systems biology interface for scientific machine learning (SciML). High performance, GPU-parallelized, and O(1) solvers in open source software.

ModelingToolkit.jl - An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system for integrated symbolics for physics-informed machine learning and automated transformations of differential equations

AMDGPU.jl - AMD GPU (ROCm) programming in Julia

DiffEqOperators.jl - Linear operators for discretizations of differential equations and scientific machine learning (SciML)

18337 - 18.337 - Parallel Computing and Scientific Machine Learning

Gridap.jl - Grid-based approximation of partial differential equations in Julia

auto-07p - AUTO is publicly available software for continuation and bifurcation problems in ordinary differential equations, originally written in 1980 and widely used in the dynamical systems community.

Pyston - A faster and highly-compatible implementation of the Python programming language.