Julia Linear Algebra

Open-source Julia projects categorized as Linear Algebra

Top 5 Julia Linear Algebra Projects

  • Measurements.jl

    Error propagation calculator and library for physical measurements. It supports real and complex numbers with uncertainty, arbitrary precision calculations, operations with arrays, and numerical integration.
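
    As a quick illustration of the error-propagation workflow, here is a minimal sketch using the package's documented `measurement` constructor and `±` operator (the numbers are made up):

        using Measurements

        # Quantities carrying an uncertainty
        L = measurement(5.10, 0.02)   # length, 5.10 ± 0.02
        T = 2.45 ± 0.05               # period, written with the ± operator

        # Uncertainties propagate automatically through arithmetic and functions
        g = 4pi^2 * L / T^2
        println(g)                    # value ± propagated uncertainty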

  • Grassmann.jl

    ⟨Grassmann-Clifford-Hodge⟩ multilinear differential geometric algebra
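
    A very small sketch of the exterior-algebra side, following the basis string macro and wedge product shown in the package README (illustrative rather than authoritative):

        using Grassmann

        # Generate a 3-dimensional Euclidean basis; per the README this defines
        # blade variables such as v1, v2, v3, v12, ... in the current scope.
        basis"3"

        # The exterior (wedge) product of two basis vectors is a bivector
        ω = v1 ∧ v2   # the v12 blade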

  • ArnoldiMethod.jl

    The Arnoldi Method with Krylov-Schur restart, natively in Julia.

  • Project mention: What Is a Schur Decomposition? | news.ycombinator.com | 2024-03-04

    For large, sparse matrices you can use the (restarted) Arnoldi method to compute a partial Schur decomposition AQ = QR, where Q is tall and skinny and R has a few dominant eigenvalues on the diagonal (i.e. eigenvalues on the boundary of the convex hull of the spectrum).

    MATLAB uses ARPACK's implementation of this when you call `eigs`.

    I wrote my own implementation, ArnoldiMethod.jl, in Julia; unlike MATLAB/ARPACK it supports arbitrary number types, should be more stable in general, and is equally fast.

    [1] https://github.com/JuliaLinearAlgebra/ArnoldiMethod.jl
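
    Based on the package README, a minimal sketch of computing such a partial Schur decomposition with ArnoldiMethod.jl could look like this (the matrix is a stand-in and the nev/tol values are arbitrary):

        using ArnoldiMethod, LinearAlgebra, SparseArrays

        # Stand-in large sparse matrix
        A = sprand(10_000, 10_000, 1e-3) + 100I

        # Partial Schur decomposition A*Q ≈ Q*R for a few dominant
        # (largest-magnitude) eigenvalues: Q has orthonormal columns and
        # R is a small (quasi-)upper-triangular matrix.
        decomp, history = partialschur(A; nev = 6, tol = 1e-8)

        # Eigenvalues and Ritz vectors recovered from the Schur factors
        vals, vecs = partialeigen(decomp)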

  • RecursiveFactorization.jl

  • Project mention: Can Fortran survive another 15 years? | news.ycombinator.com | 2023-05-01

    What about the other benchmarks on the same site? https://docs.sciml.ai/SciMLBenchmarksOutput/stable/Bio/BCR/ BCR takes about a hundred seconds and is pretty indicative of systems biology models, coming from 1122 ODEs with 24388 terms that describe a stiff chemical reaction network modeling the BCR signaling network from Barua et al. Or the discrete diffusion models https://docs.sciml.ai/SciMLBenchmarksOutput/stable/Jumps/Dif... which are the justification behind the claims in https://www.biorxiv.org/content/10.1101/2022.07.30.502135v1 that the O(1)-scaling methods scale better than the O(log n)-scaling methods for large enough models?

    > If you use special routines (BLAS/LAPACK, ...), use them everywhere as the respective community does.

    It tests with and without BLAS/LAPACK (which isn't always helpful, which of course you'd see from the benchmarks if you read them). One of the key differences is that there are some pure-Julia tools like https://github.com/JuliaLinearAlgebra/RecursiveFactorization... which outperform the respective OpenBLAS/MKL equivalent in many scenarios, and that's one noted factor in the performance boost (it is not trivial to wrap into the interface of the other solvers, so it isn't done there); a short usage sketch follows this comment. There are other benchmarks showing that the comparison is not apples to apples and is instead conservative in many cases, for example https://github.com/SciML/SciPyDiffEq.jl#measuring-overhead shows that the SciPyDiffEq handling with the Julia JIT optimizations gives lower overhead than direct SciPy+Numba, so we use the lower-overhead numbers in https://docs.sciml.ai/SciMLBenchmarksOutput/stable/MultiLang....

    > you must compile/write whole programs in each of the respective languages to enable full compiler/interpreter optimizations

    You do realize that a .so can have lower overhead to call from a JIT-compiled language than from a statically compiled language like C, because some of the binding can be optimized away at runtime? https://github.com/dyu/ffi-overhead is a measurement of that, and LuaJIT and Julia come out faster than C and Fortran there. This shouldn't be surprising once you see how that works (a minimal ccall sketch follows this comment).

    I mean yes, someone can always ask for more benchmarks, but we now have a site that auto-updates tons and tons of ODE benchmarks, with ODE systems ranging from size 2 to the thousands, wrapping as many methods as we can in as many scenarios as we can. And we don't even "win" all of our benchmarks, because unlike for you these benchmarks aren't for winning but for tracking development (somehow Hacker News folks ignore the utility part and go straight to the language wars...).

    If you have a concrete change you think can improve the benchmarks, then please share it at https://github.com/SciML/SciMLBenchmarks.jl. We'll be happy to make and maintain another.
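
    As referenced in the comment above, RecursiveFactorization.jl provides an LU factorization that mirrors the LinearAlgebra API; a rough usage sketch (sizes arbitrary, not a benchmark):

        using LinearAlgebra, RecursiveFactorization

        A = rand(200, 200)
        b = rand(200)

        # Pure-Julia recursive LU, used the same way as LinearAlgebra.lu!
        F = RecursiveFactorization.lu!(copy(A))
        x = F \ b

        # Same solve through the BLAS/LAPACK-backed factorization, for comparison
        x_ref = lu(A) \ b
        @assert x ≈ x_ref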
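
    And on the FFI point: calling into a shared library from Julia is a single ccall/@ccall with no hand-written binding layer, which is the kind of call the ffi-overhead benchmark times. A trivial libc example (illustrative only):

        # The foreign call compiles down to essentially a native function call.
        len = @ccall strlen("hello"::Cstring)::Csize_t
        println(Int(len))   # 5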

  • JOLI.jl

    Julia Operators LIbrary

NOTE: The open-source projects on this list are ordered by number of GitHub stars. The number of mentions indicates repo mentions in the last 12 months or since we started tracking (Dec 2020).

Index

What are some of the best open-source Linear Algebra projects in Julia? This list will help you:

#  Project                     Stars
1  Measurements.jl               470
2  Grassmann.jl                  449
3  ArnoldiMethod.jl               93
4  RecursiveFactorization.jl      74
5  JOLI.jl                        17
