NeuralPDE.jl
Physics-Informed Neural Networks (PINN) Solvers of (Partial) Differential Equations for Scientific Machine Learning (SciML) accelerated simulation
NeuralPDE.jl fully automates the approach (and the extensions of it required to make it solve practical problems) from symbolic descriptions of PDEs, so that might be a good starting point to both learn the practical applications and get something running in a few minutes. As part of MIT 18.337 Parallel Computing and Scientific Machine Learning, I gave an early lecture on physics-informed neural networks (with a two-part video) describing the approach, how it works, and what its challenges are. You might find those resources enlightening.
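To give a concrete feel for the "symbolic description in, trained PINN out" workflow, here is a minimal sketch in the style of the NeuralPDE.jl documentation's 2D Poisson example. The exact equation, network sizes, and iteration count here are illustrative choices, not part of the original text, and package APIs evolve, so check the current NeuralPDE.jl docs before running:

```julia
using NeuralPDE, Lux, Optimization, OptimizationOptimJL
import ModelingToolkit: Interval

# Symbolic description of the PDE: a 2D Poisson equation on the unit square
@parameters x y
@variables u(..)
Dxx = Differential(x)^2
Dyy = Differential(y)^2

eq = Dxx(u(x, y)) + Dyy(u(x, y)) ~ -sin(pi * x) * sin(pi * y)

# Zero Dirichlet boundary conditions and the spatial domain
bcs = [u(0, y) ~ 0.0, u(1, y) ~ 0.0,
       u(x, 0) ~ 0.0, u(x, 1) ~ 0.0]
domains = [x ∈ Interval(0.0, 1.0),
           y ∈ Interval(0.0, 1.0)]

# A small neural network to represent the solution u(x, y)
chain = Lux.Chain(Dense(2, 16, Lux.σ), Dense(16, 16, Lux.σ), Dense(16, 1))

# NeuralPDE automates turning the symbolic PDESystem into a PINN training problem
discretization = PhysicsInformedNN(chain, QuadratureTraining())
@named pde_system = PDESystem(eq, bcs, domains, [x, y], [u(x, y)])
prob = discretize(pde_system, discretization)

# Train the network; more iterations generally give a better solution
res = Optimization.solve(prob, BFGS(); maxiters = 500)
```

The point of the sketch is the division of labor: you write the PDE, boundary conditions, and domain symbolically, and `discretize` automatically constructs the physics-informed loss and training problem from them.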
PINNs are also a notoriously computationally intensive approach: it's pretty easy to show that the differentiable-solver approach of DiffEqFlux.jl achieves about a 10,000x speedup over a PINN package on parameter estimation of the Lorenz equations. And while the method scales well to higher PDE dimensions, it doesn't scale very well to larger systems of PDEs. You'll want to budget a good chunk of training time, and increase that by a few orders of magnitude if your dynamics are stiff. Without knowing your exact problem it's hard to give even a rough idea of how practical it would be, but if I tasked a beginning graduate student with trying this out on some of the biological PDEs I work with, I would give them about 4-6 months to get something decent together.
Related posts
- Composability in Julia: Implementing Deep Equilibrium Models via Neural ODEs
- SciML Textbook
- Why Fortran is a scientific powerhouse
- How useful are Runge-Kutta methods of order 9 and higher within double-precision floating point arithmetic?
- Interpolant Coefficients for the BS5 Runge-Kutta method