SciMLBenchmarks.jl Alternatives
Similar projects and alternatives to SciMLBenchmarks.jl
ffi-overhead
Comparing the C FFI (foreign function interface) overhead on various programming languages.
DifferentialEquations.jl
Multi-language suite for high-performance solvers of differential equations and scientific machine learning (SciML) components. Ordinary differential equations (ODEs), stochastic differential equations (SDEs), delay differential equations (DDEs), differential-algebraic equations (DAEs), and more in Julia.
ModelingToolkit.jl
An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system for integrated symbolics for physics-informed machine learning and automated transformations of differential equations
Metatheory.jl
Makes Julia reason with equations. General purpose metaprogramming, symbolic computation and algebraic equational reasoning library for the Julia programming language: E-Graphs & equality saturation, term rewriting and more.
SciMLTutorials.jl
Tutorials for doing scientific machine learning (SciML) and high-performance differential equation solving with open source software.
SciMLBenchmarks.jl discussion
SciMLBenchmarks.jl reviews and mentions
- Can Fortran survive another 15 years?
- Why Fortran is a scientific powerhouse
Project.toml or Manifest.toml? Every package has a Project.toml, which specifies version bounds (https://github.com/SciML/OrdinaryDiffEq.jl/blob/master/Proje...). Every fully reproducible project has a Manifest.toml, which describes the complete package state (https://github.com/SciML/SciMLBenchmarks.jl/blob/master/benc...).
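For context, a minimal sketch of the difference in practice (the benchmark folder name below is illustrative): a Project.toml only constrains versions through its [compat] bounds, whereas instantiating a project that ships a Manifest.toml reproduces the exact pinned package state.

```julia
using Pkg

# Activate a project that ships both Project.toml and Manifest.toml.
# The folder name is hypothetical; SciMLBenchmarks keeps one such pair
# per benchmark directory.
Pkg.activate("benchmarks/StiffODE")

# Install exactly the versions recorded in Manifest.toml -- full
# reproducibility, not just "anything satisfying the [compat] bounds".
Pkg.instantiate()

Pkg.status()   # inspect the resolved package set
```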
- Why Fortran is easy to learn
> But in the end, it's FORTRAN all the way down. Even in Julia.
That's not true. None of the Julia differential equation solver stack is calling into Fortran anymore. We have our own BLAS tools that outperform OpenBLAS and MKL in the instances we use them for (mostly LU factorization), and those are all written in pure Julia. See https://github.com/YingboMa/RecursiveFactorization.jl, https://github.com/JuliaSIMD/TriangularSolve.jl, and https://github.com/JuliaLinearAlgebra/Octavian.jl. And that is just one part of the DiffEq performance story. The performance of all of this is, of course, validated on https://github.com/SciML/SciMLBenchmarks.jl
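A minimal sketch of what that comparison looks like, assuming RecursiveFactorization.jl's `lu!` mirrors the `LinearAlgebra.lu!` interface (as its README describes); timings depend on matrix size and hardware:

```julia
using LinearAlgebra, RecursiveFactorization, BenchmarkTools

n = 128
A = rand(n, n)

# LAPACK-backed LU (OpenBLAS or MKL, depending on the setup)
@btime LinearAlgebra.lu!(B) setup = (B = copy($A)) evals = 1;

# Pure-Julia recursive LU from RecursiveFactorization.jl
@btime RecursiveFactorization.lu!(B) setup = (B = copy($A)) evals = 1;
```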
- Twitter Thread: Symbolic Computing for Compiler Optimizations in Julia
Anything that continues to improve the SciMLBenchmarks of differential equation solvers, inverse problems, scientific machine learning, and equation discovery, really. But there are a lot of other applications in mind, like generating compiler passes that improve floating-point roundoff (like Herbie), a pure-Julia simple implementation of XLA-style transformations for BLAS fusion, and a few others that are a bit more out there and will require a paper to describe the connection.
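A heavily simplified sketch of the "symbolic rules become an optimization pass" idea, using Metatheory.jl's e-graph API (the rule set and expression are illustrative, and the API has shifted between versions):

```julia
using Metatheory

# A tiny algebraic theory; each rule is a candidate "optimization".
t = @theory a b c begin
    a * 1 --> a
    a + 0 --> a
    a * (b + c) == a * b + a * c
end

# Saturate an e-graph with the rules, then extract the cheapest equivalent term.
g = EGraph(:(x * 1 + (y + 0) * 1))
saturate!(g, t)
extract!(g, astsize)   # expected to simplify toward :(x + y)
```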
- In 2022, the difference between symbolic computing and compiler optimizations will be erased in #julialang. Anyone who can come up with a set of symbolic mathematical rules will automatically receive an optimized compiler pass to build better code.
Show me a single DAE solver in Haskell that has even come close to the performance we get in the Julia SciMLBenchmarks. Here's just one example. For Haskell packages, all I see are wrappers to GSL and Sundials, both of which are slow in comparison. So this is an 8.5x speedup over something that was already faster than what you could find in Haskell. Show me something with decent speed in DAEs or it's useless.
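For concreteness, a minimal sketch of a stiff DAE solved natively in Julia: the classic Robertson problem in mass-matrix form with a Rosenbrock method. This is only illustrative; the actual benchmark setups in SciMLBenchmarks.jl are more involved.

```julia
using DifferentialEquations

function rober!(du, u, p, t)
    y1, y2, y3 = u
    k1, k2, k3 = p
    du[1] = -k1 * y1 + k3 * y2 * y3
    du[2] =  k1 * y1 - k3 * y2 * y3 - k2 * y2^2
    du[3] =  y1 + y2 + y3 - 1.0          # algebraic conservation constraint
end

M = [1.0 0.0 0.0;
     0.0 1.0 0.0;
     0.0 0.0 0.0]                        # singular mass matrix => DAE

f = ODEFunction(rober!, mass_matrix = M)
prob = ODEProblem(f, [1.0, 0.0, 0.0], (0.0, 1e5), (0.04, 3e7, 1e4))
sol = solve(prob, Rodas5(), reltol = 1e-8, abstol = 1e-8)
```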
- Tutorials for Learning Runge-Kutta Methods with Julia?
That's both a joke and a truth. The DifferentialEquations.jl source code, along with the SciMLBenchmarks and all of the associated documentation, is by far the most complete resource on all of this stuff at this point, for a reason. I've always treated it as "a lab notebook for the community", which is why the 8,000 lines of tableau code, the thousands of convergence tests, etc. are there. Papers sometimes have typos, benchmarks change over time, etc. But well-tested code tells you whether something actually converges and what the true performance is today.
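As a toy illustration of what "tableau code" means, here is a minimal explicit Runge-Kutta step driven by a Butcher tableau (the real solvers add adaptivity, error estimators, caching, and far more tableaus):

```julia
# One explicit Runge-Kutta step from a Butcher tableau (A, b, c).
function rk_step(f, u, t, dt, A, b, c)
    s = length(b)
    k = Vector{typeof(u)}(undef, s)
    for i in 1:s
        stage = u + dt * sum((A[i, j] * k[j] for j in 1:i-1); init = zero(u))
        k[i] = f(stage, t + c[i] * dt)
    end
    return u + dt * sum(b[i] * k[i] for i in 1:s)
end

# Classic RK4 tableau
A = [0    0    0  0;
     1/2  0    0  0;
     0    1/2  0  0;
     0    0    1  0]
b = [1/6, 1/3, 1/3, 1/6]
c = [0.0, 1/2, 1/2, 1.0]

# Integrate u' = -u over [0, 1] with dt = 0.1 and check the accuracy.
f(u, t) = -u
uT = foldl((u, t) -> rk_step(f, u, t, 0.1, A, b, c), 0.0:0.1:0.9; init = 1.0)
abs(uT - exp(-1.0)) < 1e-5   # true: global error is O(dt^4)
```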
- [D] How important is Numerical Analysis for machine learning?
Star-P was sold off to Microsoft, IIRC. Some of the people who had interned there then joined Alan's lab. They created the Julia programming language, where parallelism and performance are now built directly into the language. I created the differential equation solver libraries for the language, which then used all of these properties to benchmark very well, and that's how I subsequently started working with Alan. Then we took this to build systems that combine machine learning and numerical solvers to accelerate and automatically discover physical systems, and the resulting SciML organization and the scientific machine learning research, along with compiler-level automatic differentiation and parallelism, is where all of that is today with the Julia Lab.
- Julia 1.7 has been released
- Is Julia suitable for computational physics?
Most of the SciML organization is dedicated to research- and production-level scientific computing for domains like physical systems, chemical reactions, and systems biology (and more, of course). The differential equation benchmarks are quite good in comparison to a lot of C++ and Fortran libraries; there are modern neural PDE solvers, pervasive automatic differentiation, automated GPU and distributed parallelism, SDE solvers, DDE solvers, DAE solvers, ModelingToolkit.jl for Modelica-like symbolic transformations of higher-index DAEs, Bayesian differential equations, etc. All of that then ties into big PDE solving. You get the picture.
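A small sketch of the symbolic-numeric workflow being described, using the ModelingToolkit.jl API of roughly that era (the interface has evolved since, so treat this as illustrative): define the equations symbolically, simplify them, then hand the result to a native solver.

```julia
using ModelingToolkit, OrdinaryDiffEq

@parameters σ ρ β
@variables t x(t) y(t) z(t)
D = Differential(t)

eqs = [D(x) ~ σ * (y - x),
       D(y) ~ x * (ρ - z) - y,
       D(z) ~ x * y - β * z]

@named lorenz = ODESystem(eqs, t)
sys = structural_simplify(lorenz)   # symbolic simplification step

prob = ODEProblem(sys,
                  [x => 1.0, y => 0.0, z => 0.0],      # initial conditions
                  (0.0, 100.0),                         # time span
                  [σ => 10.0, ρ => 28.0, β => 8 / 3])   # parameters
sol = solve(prob, Tsit5())
```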
Stats
SciML/SciMLBenchmarks.jl is an open source project licensed under the MIT License, which is an OSI-approved license.
The primary programming language of SciMLBenchmarks.jl is MATLAB.
Popular Comparisons
- SciMLBenchmarks.jl VS DifferentialEquations.jl
- SciMLBenchmarks.jl VS SciMLTutorials.jl
- SciMLBenchmarks.jl VS RecursiveFactorization.jl
- SciMLBenchmarks.jl VS Metatheory.jl
- SciMLBenchmarks.jl VS Diffractor.jl
- SciMLBenchmarks.jl VS ApproxFun.jl
- SciMLBenchmarks.jl VS julia
- SciMLBenchmarks.jl VS MPI.jl
- SciMLBenchmarks.jl VS BoundaryValueDiffEq.jl
- SciMLBenchmarks.jl VS Octavian.jl