Why Fortran is easy to learn

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  1. Octavian.jl

    Multi-threaded BLAS-like library that provides pure Julia matrix multiplication

    > But in the end, it's FORTRAN all the way down. Even in Julia.

    That's not true. None of the Julia differential equation solver stack is calling into Fortran anymore. We have our own BLAS tools that outperform OpenBLAS and MKL in the instances we use them for (mostly LU factorization), and those are all written in pure Julia. See https://github.com/YingboMa/RecursiveFactorization.jl, https://github.com/JuliaSIMD/TriangularSolve.jl, and https://github.com/JuliaLinearAlgebra/Octavian.jl. And this is just one part of the DiffEq performance story; the performance of all of this is validated at https://github.com/SciML/SciMLBenchmarks.jl
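
    As a rough illustration of the pure-Julia route, here is a minimal sketch using Octavian's documented matmul!/matmul entry points (a hedged example, not taken from the thread):

    ```julia
    using Octavian       # pure-Julia, multi-threaded matrix multiplication
    using LinearAlgebra

    A = rand(200, 300)
    B = rand(300, 100)
    C = Matrix{Float64}(undef, 200, 100)

    Octavian.matmul!(C, A, B)   # in-place product, no BLAS/Fortran call involved
    @assert C ≈ A * B           # agrees with the OpenBLAS-backed `*`
    ```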

  2. RecursiveFactorization.jl

    Julia defaults to OpenBLAS, but libblastrampoline makes it so that `using MKL` flips it to MKL on the fly. See the JuliaCon video for more details on that (https://www.youtube.com/watch?v=t6hptekOR7s). The recursive comparison is against OpenBLAS/LAPACK and MKL; see this PR for some (older) details: https://github.com/YingboMa/RecursiveFactorization.jl/pull/2... . What it really comes down to in the end is that OpenBLAS is rather bad, and MKL is optimized for Intel CPUs but not for AMD CPUs, so now that the best CPUs are all AMD CPUs, a new set of BLAS tools mixed with recursive LAPACK tools is as good or better on most modern systems. We then see this in practice even when we build BLAS into Sundials for 1,000-ODE chemical reaction networks (https://benchmarks.sciml.ai/html/Bio/BCR.html).
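
    To make the backend flip and the pure-Julia LU concrete, here is a minimal sketch (assuming Julia 1.7+ for libblastrampoline and RecursiveFactorization's lu! entry point; a hedged example, not from the thread):

    ```julia
    using LinearAlgebra

    # using MKL   # uncommenting swaps the backend from OpenBLAS to MKL on the fly
    BLAS.get_config()   # reports which BLAS/LAPACK library is currently loaded

    import RecursiveFactorization

    A = rand(500, 500)
    F = RecursiveFactorization.lu!(copy(A))   # pure-Julia recursive LU
    @assert F.L * F.U ≈ A[F.p, :]             # same contract as LinearAlgebra.lu
    ```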

  3. julia

    The Julia Programming Language

    Yeah no worries. You might want to join in the community at least because this kind of discussion is par for the course on the Slack. You might find a nice home here!

    > though its threading implementation may not be robust

    That's one of the main issues I have with it. Its poor threading behavior, coupled with a bad default (https://github.com/JuliaLang/julia/issues/33409, one of my biggest pet-peeve issues right now), really degrades real user performance.
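
    A minimal sketch of the usual workaround for that interaction (assuming the oversubscription between Julia tasks and the OpenBLAS thread pool described in the linked issue):

    ```julia
    using LinearAlgebra

    # Each Julia task that calls into BLAS can fan out to OpenBLAS's own thread
    # pool, oversubscribing the machine (the complaint in julia#33409). Pinning
    # BLAS to one thread helps when parallelizing at the Julia level instead:
    BLAS.set_num_threads(1)

    results = Vector{Float64}(undef, 8)
    Threads.@threads for i in 1:8
        A = rand(256, 256)
        results[i] = sum(A * A)   # this BLAS call now runs single-threaded per task
    end
    ```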

  4. quickjs

    Public repository of the QuickJS JavaScript Engine.

    Ancient LISP and BASIC variants also stand out in this regard: they don't require years of training and theory to implement and fully understand.

    Fabrice Bellard's minimalist but complete quickjs.c JavaScript interpreter is presently about 54,000 lines of C:

      https://github.com/bellard/quickjs/blob/master/quickjs.c

  5. TriangularSolve.jl

    rdiv!(::AbstractMatrix, ::UpperTriangular) and ldiv!(::LowerTriangular, ::AbstractMatrix)

    (Recommended in the same comment quoted under Octavian.jl above.)
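
    A minimal usage sketch of the rdiv! routine named in the description, assuming the two-argument in-place signatures listed there (a hedged example, not from the thread):

    ```julia
    using LinearAlgebra
    import TriangularSolve

    A = rand(100, 100)
    U = UpperTriangular(rand(100, 100) + 100I)   # well-conditioned triangular factor
    B = copy(A)

    TriangularSolve.rdiv!(B, U)   # overwrites B with A / U, in pure Julia
    @assert B ≈ A / U
    ```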

  6. SciMLBenchmarks.jl

    Scientific machine learning (SciML) benchmarks, AI for science, and (differential) equation solvers. Covers Julia, Python (PyTorch, Jax), MATLAB, R

    (Recommended in the same comment quoted under Octavian.jl above.)
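
    For context on the kind of problem those benchmarks time, here is a minimal sketch of a classic stiff test case (Robertson's chemical kinetics), solved with a stiff method from the DiffEq stack; a hedged example, not taken from the benchmarks themselves:

    ```julia
    using OrdinaryDiffEq

    # Robertson's stiff chemical kinetics problem, a standard benchmark case
    function rober!(du, u, p, t)
        y1, y2, y3 = u
        du[1] = -0.04y1 + 1e4 * y2 * y3
        du[2] =  0.04y1 - 1e4 * y2 * y3 - 3e7 * y2^2
        du[3] =  3e7 * y2^2
    end

    prob = ODEProblem(rober!, [1.0, 0.0, 0.0], (0.0, 1e5))
    sol = solve(prob, Rodas5())   # stiff solver; per the comment above, its LU work stays in Julia
    ```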

  7. Fortran-code-on-GitHub

    Directory of Fortran codes on GitHub, arranged by topic

    There's modern stuff being written in astro(nomy/physics) (I can attest to some of the codebases listed in https://github.com/Beliavsky/Fortran-code-on-GitHub#astrophy... being modern, at least in terms of development), but I'd say C++ likely has the upper hand for newer codebases (unless things have changed dramatically since I last looked, algorithms that don't align nicely with nd-arrays are still painful in Fortran).

    I've also heard rumours of Julia and even Rust being used (the latter because of the ability to reuse libraries in the browser, e.g. for visualisation), but the writers of these codebases (and of the Fortran/C/C++/Java ones) are unusual; Python and R (and, for some holdouts, IDL) are what most people write in (even if those languages call something else underneath).

  8. 18337

    18.337 - Parallel Computing and Scientific Machine Learning

  9. 18335

    18.335 - Introduction to Numerical Methods course

  10. BLIS.jl

    This repo plans to provide a low-level Julia wrapper for the BLIS typed interface.

    It doesn't look like you're measuring factorization performance? OpenBLAS matrix-matrix multiplication is fine; it just falls apart when going to things like Cholesky and LU.

    > not the default, I've now checked

    Whatever the Julia default build is doing, so probably not the recursive LAPACK routines if that's how it's being built. If there's a better default, that's worth an issue.

    > That said, I don't understand why people avoid AMD's BLAS/LAPACK

    There just isn't a BLIS wrapper for Julia right now, and it's much easier to write new BLAS tools than to build wrappers, IMO. It also makes it very easy to customize to nonstandard Julia number types. But I do think that BLIS is a great project, and I would like to see it replace OpenBLAS as the default. There's been some discussion about making it as easy as MKL (https://github.com/JuliaLinearAlgebra/BLIS.jl/issues/3).
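
    A minimal sketch of measuring factorization (rather than matmul) performance, per the point above; it assumes BenchmarkTools and RecursiveFactorization are installed and is a hedged example, not the benchmark actually discussed:

    ```julia
    using LinearAlgebra
    using BenchmarkTools
    import RecursiveFactorization

    A = rand(300, 300)

    # LAPACK LU through whatever BLAS backend Julia is currently using:
    @btime lu!(B) setup = (B = copy($A)) evals = 1

    # Pure-Julia recursive LU, the alternative discussed in this thread:
    @btime RecursiveFactorization.lu!(B) setup = (B = copy($A)) evals = 1
    ```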

  11. MPI.jl

    MPI wrappers for Julia

  12. CUDA.jl

    CUDA programming in Julia.

  13. rr

    Record and Replay Framework

    It depends a bit on what you want, but https://rr-project.org/ works really well with Julia

  14. GPUCompiler.jl

    Reusable compiler infrastructure for Julia GPU backends.

  15. Pkg.jl

    Pkg - Package manager for the Julia programming language

  16. SuiteSparse.jl

    (Discontinued) Development of SuiteSparse.jl, which ships as part of the Julia standard library.

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.


Related posts

  • Composability in Julia: Implementing Deep Equilibrium Models via Neural Odes

    2 projects | news.ycombinator.com | 21 Oct 2021
  • Modelica

    11 projects | news.ycombinator.com | 16 Dec 2024
  • Building a compile-time SIMD optimized smoothing filter

    4 projects | news.ycombinator.com | 28 Sep 2024
  • Good linear algebra libraries

    1 project | /r/Julia | 19 May 2023
  • Julia 1.9: A New Era of Performance and Flexibility

    3 projects | /r/Julia | 14 May 2023

Did you know that Julia is the 44th most popular programming language based on number of references?