Pkg.jl VS Octavian.jl

Compare Pkg.jl vs Octavian.jl and see what their differences are.

Pkg.jl

Pkg - Package manager for the Julia programming language (by JuliaLang)

Octavian.jl

Multi-threaded BLAS-like library that provides pure Julia matrix multiplication (by JuliaLinearAlgebra)
                 Pkg.jl                                      Octavian.jl
Mentions         5                                           17
Stars            603                                         222
Growth           1.0%                                        0.0%
Activity         9.0                                         3.9
Latest commit    3 days ago                                  28 days ago
Language         Julia                                       Julia
License          GNU General Public License v3.0 or later    GNU General Public License v3.0 or later
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

Pkg.jl

Posts with mentions or reviews of Pkg.jl. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-05-10.
  • Julia 1.9 Highlights
    9 projects | news.ycombinator.com | 10 May 2023
    There was a "bug" (or just an unhandled caching case) that affected the Pluto notebook system and required precompilation each time. This is because Pluto notebooks keep a manifest (so they always instantiate with the same packages every time for full reproducibility), and the instantiation of that manifest triggered not just package installation but also precompilation. That was fixed in https://github.com/JuliaLang/Pkg.jl/pull/3378, with a larger discussion in https://discourse.julialang.org/t/first-pluto-notebook-launc.... That should largely remove this issue, as it is included in the v1.9 release (it was first in v1.9-RC2 IIRC).
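
    As a rough illustration of the behaviour described above (not of the Pkg.jl internals themselves), this is essentially what a manifest-pinned notebook environment does on startup; the path is a placeholder:

        # Hypothetical notebook environment containing a Project.toml + Manifest.toml.
        using Pkg

        Pkg.activate("path/to/notebook_env")  # use the notebook's own environment
        Pkg.instantiate()                     # install the exact versions pinned in the Manifest;
                                              # before the linked fix, this could also re-trigger precompilation
        Pkg.precompile()                      # explicitly precompile the environment once
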
  • Unable to load PDMats package.
    1 project | /r/Julia | 1 Jul 2022
    The closest thing I got to is this and I don't even understand what they are saying.
  • Why Fortran is easy to learn
    19 projects | news.ycombinator.com | 7 Jan 2022
    Julia's compiler is made to be extendable. GPUCompiler.jl, which adds the .ptx compilation output, for example, is a package (https://github.com/JuliaGPU/GPUCompiler.jl). The package manager of Julia itself... is an external package (https://github.com/JuliaLang/Pkg.jl). The built-in SuiteSparse usage? That's a package too (https://github.com/JuliaLang/SuiteSparse.jl). It's fairly arbitrary what is "external" and "internal" in a language that allows that kind of extendability. Literally the only thing that makes these packages a standard library is that they are built into and shipped with the standard system image. Do you want to make your own distribution of Julia that changes what the "internal" packages are? Here's a tutorial that shows how to add plotting to the system image (https://julialang.github.io/PackageCompiler.jl/dev/examples/...). You could set up a binary server for that, and now the first time to plot is 0.4 seconds. (A sketch of building such a system image follows below.)

    Julia's arrays system is built so that most arrays that are used are not the simple Base.Array. Instead, Julia has an AbstractArray interface definition (https://docs.julialang.org/en/v1/manual/interfaces/#man-inte...) which Base.Array conforms to, and which many effectively standard-library packages like StaticArrays.jl, OffsetArrays.jl, etc. conform to as well, so they can be used in any other Julia package, like the differential equation solvers, nonlinear system solvers, optimization libraries, etc. There is a higher chance that packages depend on these packages than that they do not. They are only not part of the Julia distribution because the core idea is to move everything possible out to packages. There's not only a plan to make SuiteSparse and sparse matrix support a package in 2.0, but also ideas about making the rest of linear algebra and arrays themselves into packages, where Julia just defines a memory buffer intrinsic (with the Arrays.jl package likely still shipped with the default image). At that point, are arrays not built into the language? I can understand using such a narrow definition for systems like Fortran or C where the standard library is essentially a fixed concept, but that just does not make sense with Julia. It's inherently fuzzy.
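
    A hedged sketch of the system-image approach mentioned above, following the PackageCompiler.jl tutorial pattern; the package name, output path, and precompile script are illustrative:

        using PackageCompiler

        # Bake Plots (or any set of packages) into a custom system image.
        create_sysimage(["Plots"];
                        sysimage_path = "sys_plots.so",
                        precompile_execution_file = "precompile_plots.jl")

        # Then start Julia with the custom image:  julia --sysimage sys_plots.so

    And a minimal sketch of the AbstractArray interface the second paragraph refers to: a tiny read-only vector type that generic Julia code can consume like any other array (bounds checking omitted for brevity):

        # A lazily computed vector of squares, illustrative only.
        struct Squares <: AbstractVector{Int}
            n::Int
        end

        Base.size(S::Squares) = (S.n,)
        Base.getindex(S::Squares, i::Int) = i * i

        S = Squares(5)
        sum(S)      # 55 -- generic Base code works through the interface
        collect(S)  # [1, 4, 9, 16, 25]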

  • MLJ.jl: A Julia Machine Learning Framework
    4 projects | news.ycombinator.com | 11 Apr 2021
    This is exacerbated by the fact that Julia's Pkg.jl does not yet support conditional/optional dependencies [0]. A lot of these meta packages tend to pull in everything but the kitchen sink.

    [0]: https://github.com/JuliaLang/Pkg.jl/issues/1285
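
    As a hedged sketch of the workaround commonly used in the meantime, Requires.jl loads optional glue code only when the user also has the other package installed; the package name, UUID, and file name below are illustrative:

        module MyPackage

        using Requires

        function __init__()
            # Only load the JSON glue code if JSON.jl is present in the user's environment.
            @require JSON="682c06a0-de6a-54ab-a142-c8b1cf79cde6" include("json_support.jl")
        end

        end # module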

  • Adding packages in Julia extremely painful
    1 project | /r/Julia | 29 Dec 2020
    The LTS release is over two years old, and Julia has received a lot of developer attention since then, resulting in new features and performance improvements that tutorial authors don't want to do without. You can safely use the latest stable release (v1.5.3), although you may also want to apply the Git registry fix (https://github.com/JuliaLang/Pkg.jl/issues/2014#issuecomment-730676631) for further improvements in download/setup speed.

Octavian.jl

Posts with mentions or reviews of Octavian.jl. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-02-22.
  • Yann Lecun: ML would have advanced if other lang had been adopted versus Python
    9 projects | news.ycombinator.com | 22 Feb 2023
  • Julia 1.8 has been released
    8 projects | news.ycombinator.com | 18 Aug 2022
    For some examples of people porting existing C++ and Fortran libraries to Julia, you should check out https://github.com/JuliaLinearAlgebra/Octavian.jl, https://github.com/dgleich/GenericArpack.jl, https://github.com/apache/arrow-julia (just off the top of my head). These are all ports of C++ or Fortran libraries that match (or exceed) the performance of the original, and in the case of Arrow.jl it is faster, more general, and 10x less code.
  • Why Julia matrix multiplication so slow in this test?
    2 projects | /r/Julia | 31 May 2022
    Note that a performance-optimized Julia implementation is on par with, or even outperforms, specialized high-performance BLAS libraries; see https://github.com/JuliaLinearAlgebra/Octavian.jl .
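
    A minimal sketch of the kind of comparison being referenced, assuming Octavian.jl and BenchmarkTools.jl are installed; the matrix size is arbitrary:

        using LinearAlgebra, BenchmarkTools, Octavian

        n = 512
        A, B = rand(n, n), rand(n, n)
        C1, C2 = zeros(n, n), zeros(n, n)

        @btime mul!($C1, $A, $B)              # BLAS-backed (OpenBLAS/MKL) multiply
        @btime Octavian.matmul!($C2, $A, $B)  # pure-Julia multiply
        C1 ≈ C2                               # results agree up to floating-point error
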
  • Multiple dispatch: Common Lisp vs Julia
    4 projects | /r/Julia | 5 Mar 2022
    If you look at the thread for your first reference, there were a large number of performance improvements suggested that resulted in a 30x speedup when combined. I'm not sure what you're looking at for your second link, but Julia is faster than Lisp in the n-body, spectral norm, mandelbrot, pidigits, regex, fasta, k-nucleotide, and reverse complement benchmarks (8 out of 10). For Julia going faster than C/Fortran, I would direct you to https://github.com/JuliaLinearAlgebra/Octavian.jl which is a Julia program that beats MKL and OpenBLAS for matrix multiplication (one of the most heavily optimized algorithms in the world).
  • Why Fortran is easy to learn
    19 projects | news.ycombinator.com | 7 Jan 2022
    > But in the end, it's FORTRAN all the way down. Even in Julia.

    That's not true. None of the Julia differential equation solver stack is calling into Fortran anymore. We have our own BLAS tools that outperform OpenBLAS and MKL in the instances we use them for (mostly LU factorization), and those are all written in pure Julia. See https://github.com/YingboMa/RecursiveFactorization.jl, https://github.com/JuliaSIMD/TriangularSolve.jl, and https://github.com/JuliaLinearAlgebra/Octavian.jl. And this is just one part of the DiffEq performance story. The performance of this is, of course, all validated on https://github.com/SciML/SciMLBenchmarks.jl
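
    A small sketch of the pure-Julia LU factorization mentioned above, assuming RecursiveFactorization.jl is installed; sizes are arbitrary:

        using LinearAlgebra, RecursiveFactorization

        A = rand(200, 200)
        b = rand(200)

        F = RecursiveFactorization.lu!(copy(A))  # pure-Julia LU (copy, since lu! works in place)
        x = F \ b                                # reuse the factorization to solve
        norm(A * x - b)                          # residual should be tiny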

  • Show HN: prometeo – a Python-to-C transpiler for high-performance computing
    19 projects | news.ycombinator.com | 17 Nov 2021
    Well IMO it can definitely be rewritten in Julia, and to an easier degree than Python, since Julia allows hooking into the compiler pipeline at many points in the stack. It's lispy and built from the ground up for codegen, with libraries like Metatheory.jl (https://github.com/JuliaSymbolics/Metatheory.jl) that provide high-level pattern matching with e-graphs. The question is whether it's worth your time to learn Julia to do so.

    You could also do it at the LLVM level: https://github.com/JuliaComputingOSS/llvm-cbe

    For interesting takes on that, you can see https://github.com/JuliaLinearAlgebra/Octavian.jl which relies on LoopVectorization.jl to do transforms on the Julia AST beyond what LLVM does. Because of that, Octavian.jl beats OpenBLAS on many linalg benchmarks.
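
    In the spirit of the loop-level transforms described above, here is a hedged micro-kernel sketch using LoopVectorization.jl's @turbo (an illustration of the technique, not Octavian's actual kernel):

        using LoopVectorization

        function mygemm!(C, A, B)
            @turbo for n in axes(C, 2), m in axes(C, 1)
                acc = zero(eltype(C))
                for k in axes(A, 2)
                    acc += A[m, k] * B[k, n]
                end
                C[m, n] = acc
            end
            return C
        end

        C, A, B = zeros(64, 64), rand(64, 64), rand(64, 64)
        mygemm!(C, A, B) ≈ A * B   # true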

  • Python behind the scenes #13: the GIL and its effects on Python multithreading
    2 projects | news.ycombinator.com | 29 Sep 2021
    The initial results are that libraries like LoopVectorization can already generate optimal micro-kernels and are competitive with MKL (for square matrix-matrix multiplication) up to around size 512. With help on the macro-kernel side from Octavian, Julia is able to outperform MKL for sizes up to 1000 or so (and is about 20% slower for bigger sizes). https://github.com/JuliaLinearAlgebra/Octavian.jl.
  • From Julia to Rust
    14 projects | news.ycombinator.com | 5 Jun 2021
    > The biggest reason is because some function of the high level language is incompatible with the application domain. Like garbage collection in hot or real-time code or proprietary compilers for processors. Julia does not solve these problems.

    The presence of garbage collection in Julia is not a problem at all for hot, high-performance code. There's nothing stopping you from manually managing your memory in Julia.

    The easiest way would be to just preallocate your buffers and hold onto them so they don't get collected (a short sketch of this pattern follows after this post). Octavian.jl is a BLAS library written in Julia that's faster than OpenBLAS and MKL for small matrices and saturates to the same speed for very large matrices [1]. These are some of the hottest loops possible!

    For true hard real-time, yes, Julia is not a good choice, but it's perfectly fine for soft real-time.

    [1] https://github.com/JuliaLinearAlgebra/Octavian.jl/issues/24#...
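
    A short sketch of the preallocation pattern described above, assuming Octavian.jl is installed; the sizes, names, and iteration count are illustrative:

        using Octavian

        function run_hot_loop!(C, A, B, iters)
            for _ in 1:iters
                Octavian.matmul!(C, A, B)  # writes into the preallocated C; no allocation in the loop
                # ... use C ...
            end
            return C
        end

        A, B = rand(256, 256), rand(256, 256)
        C = similar(A)               # allocate the output buffer once, outside the hot loop
        run_hot_loop!(C, A, B, 100)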

  • Julia 1.6 addresses latency issues
    5 projects | news.ycombinator.com | 25 May 2021
    If you want performance benchmarks vs Fortran, https://benchmarks.sciml.ai/html/MultiLanguage/wrapper_packa... has benchmarks with Julia out-performing highly optimized Fortran DiffEq solvers, and https://github.com/JuliaLinearAlgebra/Octavian.jl shows that pure Julia BLAS implementations can compete with MKL and OpenBLAS, which are among the most heavily optimized pieces of code ever written. Furthermore, Julia has been used on some of the world's fastest supercomputers (in the performance-critical bits), which as far as I know isn't true of Swift/Kotlin/C#.

    Expressiveness is hard to judge objectively, but in my opinion at least, Multiple Dispatch is a massive win for writing composable, re-usable code, and there really isn't anything that compares on that front to Julia.

  • Octavian.jl – BLAS-like Julia procedures for CPU
    1 project | news.ycombinator.com | 23 May 2021

What are some alternatives?

When comparing Pkg.jl and Octavian.jl you can also consider the following projects:

Pluto.jl - 🎈 Simple reactive notebooks for Julia

OpenBLAS - OpenBLAS is an optimized BLAS library based on GotoBLAS2 1.13 BSD version.

TriangularSolve.jl - rdiv!(::AbstractMatrix, ::UpperTriangular) and ldiv!(::LowerTriangular, ::AbstractMatrix)

Symbolics.jl - Symbolic programming for the next generation of numerical software

maptrace - Produce watertight polygonal vector maps by tracing raster images

owl - Owl - OCaml Scientific Computing @ https://ocaml.xyz

AutoMLPipeline.jl - A package that makes it trivial to create and evaluate machine learning pipeline architectures.

Verilog.jl - Verilog for Julia

parca-demo - A collection of languages and frameworks profiled by Parca and Parca agent

Automa.jl - A julia code generator for regular expressions

Fortran-code-on-GitHub - Directory of Fortran codes on GitHub, arranged by topic

StaticCompiler.jl - Compiles Julia code to a standalone library (experimental)