SciMLBenchmarks.jl

Scientific machine learning (SciML) benchmarks covering AI for science and (differential) equation solvers across Julia, Python (PyTorch, JAX), MATLAB, and R. (By SciML.)


SciMLBenchmarks.jl reviews and mentions

Posts that mention or review SciMLBenchmarks.jl. The most recent was on 2023-05-01.
  • Can Fortran survive another 15 years?
    7 projects | news.ycombinator.com | 1 May 2023
  • Why Fortran is a scientific powerhouse
    2 projects | news.ycombinator.com | 11 Jan 2023
    Project.toml or Manifest.toml? Every package has a Project.toml, which specifies its dependency bounds (https://github.com/SciML/OrdinaryDiffEq.jl/blob/master/Proje...). Every fully reproducible project has a Manifest that describes the complete package state (https://github.com/SciML/SciMLBenchmarks.jl/blob/master/benc...).
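
    For readers outside the Julia ecosystem, here is a minimal sketch of how those two files drive reproducibility (standard Pkg API; the project path is a placeholder):

```julia
# Minimal sketch of Julia's environment reproducibility (standard Pkg API;
# the project path is a hypothetical placeholder).
using Pkg

Pkg.activate("path/to/project")  # directory holding Project.toml (and maybe Manifest.toml)
Pkg.instantiate()  # with a Manifest.toml: install the exact recorded package versions;
                   # without one: resolve fresh versions within the [compat] bounds
Pkg.status()       # inspect the resolved environment
```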
  • Why Fortran is easy to learn
    19 projects | news.ycombinator.com | 7 Jan 2022
    > But in the end, it's FORTRAN all the way down. Even in Julia.

That's not true. None of the Julia differential equation solver stack calls into Fortran anymore. We have our own BLAS tools that outperform OpenBLAS and MKL in the instances we use them for (mostly LU factorization), and those are all written in pure Julia. See https://github.com/YingboMa/RecursiveFactorization.jl, https://github.com/JuliaSIMD/TriangularSolve.jl, and https://github.com/JuliaLinearAlgebra/Octavian.jl. And that is just one part of the DiffEq performance story. The performance of all of this is, of course, validated on https://github.com/SciML/SciMLBenchmarks.jl
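For a sense of what that pure-Julia LU looks like in practice, here is a minimal sketch, assuming RecursiveFactorization.jl's `lu!` mirrors the `LinearAlgebra.lu!` interface (the matrix sizes are arbitrary):

```julia
# Minimal sketch: pure-Julia LU factorization via RecursiveFactorization.jl
# (assumes its lu! mirrors LinearAlgebra.lu!; sizes are arbitrary).
using LinearAlgebra, RecursiveFactorization

A = rand(200, 200)
b = rand(200)

F = RecursiveFactorization.lu!(copy(A))  # factor in place on a copy, no Fortran BLAS/LAPACK call
x = F \ b                                # reuse the factorization to solve A x = b
@assert A * x ≈ b
```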

  • Twitter Thread: Symbolic Computing for Compiler Optimizations in Julia
    3 projects | /r/Julia | 3 Jan 2022
    Anything that continues to improve the SciMLBenchmarks of differential equation solvers, inverse problems, scientific machine learning, and equation discovery really. But there's a lot of other applications in mind, like generating compiler passes that improve floating point roundoff (like Herbie), a pure-Julia simple implementation of XLA-transformations for BLAS fusion, and a few others that are a bit more out there and will require a paper to describe the connection.
  • In 2022, the difference between symbolic computing and compiler optimizations will be erased in #julialang. Anyone who can come up with a set of symbolic mathematical rules will automatically receive an optimized compiler pass to build better code
    3 projects | /r/programmingcirclejerk | 2 Jan 2022
    Show me a single DAE solver in Haskell that has even come close to the performance we get in the Julia SciMLBenchmarks. Here's just one example. For Haskell packages, all I see are wrappers to GSL and Sundials, both of which are slow in comparison. So this is an 8.5x speedup over something that was already faster than what you could find in Haskell. Show me something with decent speed in DAEs or it's useless.
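
    For context, here is a minimal sketch of the kind of stiff DAE these benchmarks time: the classic Robertson problem in mass-matrix form, solved with OrdinaryDiffEq.jl (tolerances and timespan are typical illustrative choices, not the benchmark's exact settings):

```julia
# The Robertson problem as a mass-matrix DAE, solved with OrdinaryDiffEq.jl.
using OrdinaryDiffEq

function rober!(du, u, p, t)
    y1, y2, y3 = u
    du[1] = -0.04y1 + 1e4 * y2 * y3
    du[2] = 0.04y1 - 1e4 * y2 * y3 - 3e7 * y2^2
    du[3] = y1 + y2 + y3 - 1.0                    # algebraic conservation constraint
end

M = [1.0 0.0 0.0; 0.0 1.0 0.0; 0.0 0.0 0.0]       # singular mass matrix => DAE
f = ODEFunction(rober!, mass_matrix = M)
prob = ODEProblem(f, [1.0, 0.0, 0.0], (0.0, 1e5))
sol = solve(prob, Rodas5(), reltol = 1e-8, abstol = 1e-8)
```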
  • Tutorials for Learning Runge-Kutta Methods with Julia?
    5 projects | /r/Julia | 27 Dec 2021
    That's both a joke and a truth. The DifferentialEquations.jl source code, along with the SciMLBenchmarks and all of the associated documentation, is by far the most complete resource on all of this stuff at this point, for a reason. I've always treated it as "a lab notebook for the community," which is why those 8,000 lines of tableau code, the thousands of convergence tests, etc. are there. Papers have typos sometimes, benchmarks change over time, etc. But well-tested code tells you whether something actually converges and what the true performance is today.
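
    To give a taste of what that tableau code encodes, here is a toy sketch (illustrative only, not DifferentialEquations.jl's actual internals) of one explicit Runge-Kutta step driven by a Butcher tableau:

```julia
# Toy sketch: one explicit Runge-Kutta step from a Butcher tableau (A, b, c).
# Illustrative only -- not DifferentialEquations.jl's internals.
function rk_step(f, u, t, dt, A, b, c)
    s = length(b)
    k = Vector{typeof(u)}(undef, s)
    for i in 1:s
        acc = zero(u)                    # accumulate the stage combination
        for j in 1:i-1
            acc += A[i, j] * k[j]
        end
        k[i] = f(u + dt * acc, t + c[i] * dt)
    end
    unext = u
    for i in 1:s
        unext += dt * b[i] * k[i]        # weighted sum of stages
    end
    return unext
end

# Classic RK4 tableau; one step of u' = -u from u(0) = 1
A = [0 0 0 0; 1//2 0 0 0; 0 1//2 0 0; 0 0 1 0]
b = [1//6, 1//3, 1//3, 1//6]
c = [0, 1//2, 1//2, 1]
rk_step((u, t) -> -u, 1.0, 0.0, 0.1, A, b, c)    # ≈ exp(-0.1) ≈ 0.9048
```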
  • [D] How important is Numerical Analysis for machine learning?
    2 projects | /r/MachineLearning | 23 Dec 2021
    Star-P was sold off to Microsoft, IIRC. Some of the people who had interned there then joined Alan's lab, where they created the Julia programming language, in which parallelism and performance are built directly into the language. I created the differential equation solver libraries for the language, which used all of these properties to benchmark very well, and that's how I subsequently started working with Alan. We then built systems that combine machine learning and numerical solvers to accelerate and automatically discover physical systems; the resulting SciML organization and scientific machine learning research, along with compiler-level automatic differentiation and parallelism, is where all of that is today with the Julia Lab.
  • Julia 1.7 has been released
    15 projects | news.ycombinator.com | 30 Nov 2021
  • Is Julia suitable for computational physics?
    4 projects | /r/Julia | 5 Jan 2021
    Most of the SciML organization is dedicated to research- and production-level scientific computing for domains like physical systems, chemical reactions, and systems biology (and more, of course). The differential equation benchmarks compare quite well against a lot of C++ and Fortran libraries, and there are modern neural PDE solvers, pervasive automatic differentiation, automated GPU and distributed parallelism, SDE solvers, DDE solvers, DAE solvers, ModelingToolkit.jl for Modelica-like symbolic transformations for higher-index DAEs, Bayesian differential equations, etc. All of that then ties into big PDE solving. You get the picture.
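
    As one small example of that solver breadth, here is a sketch of an SDE solve with StochasticDiffEq.jl (geometric Brownian motion; the parameter values are arbitrary illustrations):

```julia
# Sketch: geometric Brownian motion du = μ*u*dt + σ*u*dW with StochasticDiffEq.jl.
using StochasticDiffEq

μ, σ = 0.05, 0.2
f(u, p, t) = μ * u           # drift term
g(u, p, t) = σ * u           # diffusion term
prob = SDEProblem(f, g, 1.0, (0.0, 1.0))
sol = solve(prob, SOSRI())   # adaptive strong-order SRI method for diagonal noise
sol.u[end]                   # terminal value of one sample path
```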

Stats

Basic SciMLBenchmarks.jl repo stats:
Mentions: 10
Stars: 319
Activity: 9.7
Last commit: 8 days ago
