NeuralPDE.jl vs 18337

| | NeuralPDE.jl | 18337 |
|---|---|---|
| Mentions | 10 | 14 |
| Stars | 903 | 189 |
| Growth | 2.6% | 7.4% |
| Activity | 9.7 | 5.7 |
| Latest commit | 3 days ago | about 1 year ago |
| Language | Julia | Jupyter Notebook |
| License | GNU General Public License v3.0 or later | - |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
NeuralPDE.jl
- Automatically install a huge number of dependencies?
The documentation has a manifest associated with it: https://docs.sciml.ai/NeuralPDE/dev/#Reproducibility. Instantiating the manifest will give you all of the exact versions used for the documentation build (https://github.com/SciML/NeuralPDE.jl/blob/gh-pages/v5.7.0/assets/Manifest.toml). You just `]activate` the folder containing the manifest and then `]instantiate`. Or you can use the Project.toml.
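For example, a minimal sketch from the Julia REPL, assuming the documentation's Project.toml and Manifest.toml have been downloaded into a local folder named `docs_env` (a hypothetical name):

```julia
# Activate the environment defined by the downloaded Project.toml/Manifest.toml,
# then install the exact pinned versions recorded in the manifest.
using Pkg
Pkg.activate("docs_env")  # hypothetical folder holding the Manifest.toml
Pkg.instantiate()
```

`Pkg.activate` plus `Pkg.instantiate` is the functional equivalent of the `]activate`/`]instantiate` REPL-mode commands mentioned above.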
- From Wolfram Mathematica to Julia
The PDE-solving libraries are MethodOfLines.jl and NeuralPDE.jl. NeuralPDE is very general but not very fast (that's a limitation of the method; PINNs are just slow). MethodOfLines is still somewhat under development but generates quite fast code.
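To make the contrast concrete, here is a minimal sketch of solving the 1D heat equation with MethodOfLines.jl, following its documented finite-difference workflow (exact API details vary across versions; the grid spacing and solver choice are illustrative):

```julia
using ModelingToolkit, MethodOfLines, OrdinaryDiffEq, DomainSets

@parameters t x
@variables u(..)
Dt = Differential(t)
Dxx = Differential(x)^2

# 1D heat equation with a sine initial profile and zero Dirichlet boundaries
eq = Dt(u(t, x)) ~ Dxx(u(t, x))
bcs = [u(0, x) ~ sin(pi * x),
       u(t, 0) ~ 0.0,
       u(t, 1) ~ 0.0]
domains = [t ∈ Interval(0.0, 1.0), x ∈ Interval(0.0, 1.0)]
@named pdesys = PDESystem(eq, bcs, domains, [t, x], [u(t, x)])

# Finite-difference discretization in x; t is left continuous for the ODE solver
discretization = MOLFiniteDifference([x => 0.01], t)
prob = discretize(pdesys, discretization)
sol = solve(prob, Tsit5(), saveat = 0.1)
```

The same symbolic `PDESystem` could instead be handed to NeuralPDE.jl's PINN discretizer, which is what makes the two libraries easy to swap between.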
- AI and Scientific Computing in Kubernetes with the Julia Language, K8sClusterManagers.jl
GitHub - SciML/NeuralPDE.jl: Physics-Informed Neural Networks (PINN) and Deep BSDE Solvers of Differential Equations for Scientific Machine Learning (SciML) accelerated simulation
- [D] ICLR 2022 RESULTS ARE OUT
That doesn't mean there's no use case for PINNs; we wrote a giant review-ish kind of thing on NeuralPDE.jl to describe where PINNs might be useful. It's just... not the best for publishing. It's things like: (a) where you have not already optimized a classical method, (b) where you need something that can easily generate solvers for different cases without too much worry about stability, (c) high-dimensional PDEs, and (d) surrogates over parameters. (c) and (d) are the two "real" use cases you can actually publish about, but PINNs aren't quite good enough for (c) (see the mesh-free methods from the old radial basis function literature in comparison) or (d) (there are much faster surrogate techniques). So we are continuing to work on them for (a) and (b) as an interesting option within a software suite, but that's not the kind of thing that's really publishable, so I don't think we plan to ever submit that article anywhere.
- [N] Open Colloquium by Prof. Max Welling: "Is the next deep learning disruption in the physical sciences?"
- [D] What are some ideas that are hyped up in machine learning research but don't actually get used in industry (and vice versa)?
Did this change at all with the advent of Physics Informed Neural Networks? The Julia language has some really impressive tools for that use case. https://github.com/SciML/NeuralPDE.jl
- [Research] Input Arbitrary PDE -> Output Approximate Solution
PDEs are difficult because you don't have a single numerical definition over all PDEs; they can be defined by arbitrarily many functions. u' = Δu + f? Define f. u' = g(u)·Δu + f? Define f and g. Etc. To cover the space of PDEs you have to go symbolic at some point and make the discretization methods dependent on the symbolic form. This is precisely what the ModelingToolkit.jl ecosystem is doing. One instantiation of a discretizer on this symbolic form is NeuralPDE.jl, which takes a symbolic PDESystem and generates an OptimizationProblem for a neural network that represents the solution via a Physics-Informed Neural Network (PINN).
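To illustrate that pipeline, here is a hedged sketch along the lines of the Poisson tutorial in the NeuralPDE.jl documentation (network size, training strategy, and optimizer settings are illustrative, and the exact API has shifted across versions):

```julia
using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimisers
import ModelingToolkit: Interval

@parameters x y
@variables u(..)
Dxx = Differential(x)^2
Dyy = Differential(y)^2

# 2D Poisson equation on the unit square with zero Dirichlet boundaries
eq = Dxx(u(x, y)) + Dyy(u(x, y)) ~ -sin(pi * x) * sin(pi * y)
bcs = [u(0, y) ~ 0.0, u(1, y) ~ 0.0, u(x, 0) ~ 0.0, u(x, 1) ~ 0.0]
domains = [x ∈ Interval(0.0, 1.0), y ∈ Interval(0.0, 1.0)]
@named pde_system = PDESystem(eq, bcs, domains, [x, y], [u(x, y)])

# A small neural network stands in for the solution u(x, y)
chain = Lux.Chain(Lux.Dense(2, 16, tanh), Lux.Dense(16, 16, tanh), Lux.Dense(16, 1))

# The PINN discretizer turns the symbolic PDESystem into an OptimizationProblem
discretization = PhysicsInformedNN(chain, QuadratureTraining())
prob = discretize(pde_system, discretization)
res = Optimization.solve(prob, OptimizationOptimisers.Adam(0.01); maxiters = 1000)
```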
- [D] Has anyone worked with Physics Informed Neural Networks (PINNs)?
NeuralPDE.jl fully automates the approach (and extensions of it, which are required to make it solve practical problems) from symbolic descriptions of PDEs, so that might be a good starting point to both learn the practical applications and get something running in a few minutes. As part of MIT 18.337 Parallel Computing and Scientific Machine Learning, I gave an early lecture on physics-informed neural networks (with a two-part video) describing the approach, how it works, and what its challenges are. You might find those resources enlightening.
- Doing Symbolic Math with SymPy
What is great about ModelingToolkit.jl is how it's used in practical ways by other packages, e.g. NeuralPDE.jl.
Compared to SymPy, I feel that it is less of a "how do I integrate this function" package and more of a "how can I build this DSL" framework.
https://github.com/SciML/NeuralPDE.jl
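As a taste of that DSL-building flavor, here is a minimal sketch following the classic Lorenz tutorial from the ModelingToolkit.jl documentation (constructor signatures have shifted between versions, so treat the details as illustrative):

```julia
using ModelingToolkit, OrdinaryDiffEq

@parameters σ ρ β
@variables t x(t) y(t) z(t)
D = Differential(t)

# The model is specified symbolically; ModelingToolkit compiles it to fast solver code
eqs = [D(x) ~ σ * (y - x),
       D(y) ~ x * (ρ - z) - y,
       D(z) ~ x * y - β * z]

@named lorenz = ODESystem(eqs, t)
sys = structural_simplify(lorenz)

prob = ODEProblem(sys, [x => 1.0, y => 0.0, z => 0.0],
                  (0.0, 100.0), [σ => 10.0, ρ => 28.0, β => 8 / 3])
sol = solve(prob, Tsit5())
```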
18337
- Hello, I wanted to know the best way to get started with Julia and artificial intelligence. I looked around at a lot of different languages and saw that Julia was good for data science and artificial intelligence, but I would like to know good ways to just dive in. Thank you.
- SciML/SciMLBook: Parallel Computing and Scientific Machine Learning (SciML): Methods and Applications (MIT 18.337J/6.338J)
This was previously the https://github.com/mitmath/18337 course website, but now in a new iteration of the course it is being reset. To avoid issues like this in the future, we have moved the "book" out to its own repository, https://github.com/SciML/SciMLBook, where it can continue to grow and be hosted separately from the structure of a course. This means it can be something other courses can depend on as well. I am looking for web developers who can help build a nicer webpage for this book, and also for the SciMLBenchmarks.
- Why Fortran is easy to learn
I would say Fortran is a pretty great language for teaching beginners in numerical analysis courses. The only issue I have with it is that, similar to using C+MPI (which is what I first learned with, well, after a bit of Java), students don't tend to learn how to go "higher level". You teach them how to write a three-loop matrix-matrix multiplication, but the next thing you should teach is how to use higher-level BLAS tools and why they will outperform the three-loop form. But Fortran then becomes very cumbersome (`dgemm` etc.), so students continue to write simple loops and simple algorithms where they shouldn't. A first numerical analysis course should teach simple algorithms AND why the simple algorithms are not good, but a lot of instructors and tools fail to emphasize the second part of that statement.
On the other hand, the performance plus the high-level nature of Julia makes it a rather excellent tool for this. In the MIT graduate course 18.337 Parallel Computing and Scientific Machine Learning (https://github.com/mitmath/18337) we do precisely that, starting with direct optimization of loops, then moving to linear algebra, ODE solving, and implementing automatic differentiation. I don't think anyone would want to give a homework assignment to implement AD in Fortran, but in Julia you can do that shortly after looking at loop performance and SIMD, and that's really something special. Steven Johnson's 18.335 graduate course in Numerical Analysis (https://github.com/mitmath/18335) showcases some similar niceties. I really like the demonstration that starts from scratch with the three loops and shows how SIMD and cache-oblivious algorithms build towards BLAS performance, and why most users should ultimately not be writing such loops (https://nbviewer.org/github/mitmath/18335/blob/master/notes/...) and should instead use the built-in `mul!` in most scenarios (a sketch of that contrast follows below). There are very few languages where such "start to finish" demonstrations can be showcased in such a clear fashion.
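For instance, a minimal sketch of that loops-versus-BLAS contrast in Julia (not taken from the linked notebooks; the matrix size and loop order are illustrative):

```julia
using LinearAlgebra, BenchmarkTools

# Naive three-loop matrix-matrix multiply, with the (j, k, i) loop order
# that walks both matrices column-major for decent cache behavior.
function matmul3!(C, A, B)
    fill!(C, zero(eltype(C)))
    @inbounds for j in axes(B, 2), k in axes(A, 2), i in axes(A, 1)
        C[i, j] += A[i, k] * B[k, j]
    end
    return C
end

n = 256
A, B, C = randn(n, n), randn(n, n), zeros(n, n)

@btime matmul3!($C, $A, $B)  # hand-written loops
@btime mul!($C, $A, $B)      # built-in BLAS-backed multiply, typically far faster
```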
- What are some interesting papers to read?
And why not take a course while you're at it.
- Composability in Julia: Implementing Deep Equilibrium Models via Neural ODEs
- [2109.12449] AbstractDifferentiation.jl: Backend-Agnostic Differentiable Programming in Julia
- Is that true?
Here's a good one. It's in Julia but it should do the trick. The main instructor is the most prolific Julia dev in the world.
- [D] Has anyone worked with Physics Informed Neural Networks (PINNs)?
NeuralPDE.jl fully automates the approach (and extensions of it, which are required to make it solve practical problems) from symbolic descriptions of PDEs, so that might be a good starting point to both learn the practical applications and get something running in a few minutes. As part of MIT 18.337 Parallel Computing and Scientific Machine Learning, I gave an early lecture on physics-informed neural networks (with a two-part video) describing the approach, how it works, and what its challenges are. You might find those resources enlightening.
- [P] Machine Learning in Physics?
It's a very thriving field. If you are interested in methods research and want to learn some of the techniques behind it, I would recommend taking a dive into the lecture notes from the graduate course I taught at MIT, 18.337 Parallel Computing and Scientific Machine Learning, which is specifically designed to onboard new students into this research program.
- MIT 18.337J: Parallel Computing and Scientific Machine Learning
What are some alternatives?
deepxde - A library for scientific machine learning and physics-informed learning
DataDrivenDiffEq.jl - Data driven modeling and automated discovery of dynamical systems for the SciML Scientific Machine Learning organization
SymPy - A computer algebra system written in pure Python
Vulpix - Fast, unopinionated, minimalist web framework for .NET core inspired by express.js
ModelingToolkit.jl - An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system for integrated symbolics for physics-informed machine learning and automated transformations of differential equations
SciMLTutorials.jl - Tutorials for doing scientific machine learning (SciML) and high-performance differential equation solving with open source software.
ReservoirComputing.jl - Reservoir computing utilities for scientific machine learning (SciML)
GPUCompiler.jl - Reusable compiler infrastructure for Julia GPU backends.
AMDGPU.jl - AMD GPU (ROCm) programming in Julia
MPI.jl - MPI wrappers for Julia
Gridap.jl - Grid-based approximation of partial differential equations in Julia
BenchmarkTools.jl - A benchmarking framework for the Julia language