PackageCompiler.jl VS julia

Compare PackageCompiler.jl vs julia and see what their differences are.

                 PackageCompiler.jl    julia
Mentions         26                    350
Stars            1,373                 44,469
Growth           1.4%                  0.8%
Activity         7.8                   10.0
Last commit      14 days ago           5 days ago
Language         Julia                 Julia
License          MIT License           MIT License
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

PackageCompiler.jl

Posts with mentions or reviews of PackageCompiler.jl. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-04.
  • Potential of the Julia programming language for high energy physics computing
    10 projects | news.ycombinator.com | 4 Dec 2023
    Yes, Julia can be called from other languages rather easily. Julia functions can be exposed and called with a C-like ABI [1], and there are also various packages for languages like Python [2] or R [3] to call Julia code.

    With PackageCompiler.jl [4] you can even make AOT-compiled standalone binaries, though these are rather large. They've shrunk a fair amount in recent releases, but there's still a lot of low-hanging fruit for making the compiled binaries smaller, plus some manual work you can do, like removing LLVM and filtering stdlibs when they're not needed.

    Work is also happening on a more stable / mature system that acts like StaticCompiler.jl [5], except provided by the base language and by people who are more experienced with the compiler (i.e. not a janky prototype).

    [1] https://docs.julialang.org/en/v1/manual/embedding/

    [2] https://pypi.org/project/juliacall/

    [3] https://www.rdocumentation.org/packages/JuliaCall/

    [4] https://github.com/JuliaLang/PackageCompiler.jl

    [5] https://github.com/tshort/StaticCompiler.jl
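
    As a hedged, minimal sketch of the C-like ABI route mentioned above (following the embedding docs in [1]), a Julia function can be marked as callable from C; the function name and signature below are illustrative, not taken from the original post.

        # Export a Julia function with a C-compatible ABI; callable after embedding
        # libjulia or from a shared library built with PackageCompiler's create_library.
        Base.@ccallable function julia_mean(x::Ptr{Cdouble}, n::Cint)::Cdouble
            v = unsafe_wrap(Array, x, Int(n))  # view the C buffer as a Julia vector (no copy)
            return sum(v) / n
        end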

  • Strong arrows: a new approach to gradual typing
    1 project | news.ycombinator.com | 21 Sep 2023
  • Making Python 100x faster with less than 100 lines of Rust
    21 projects | news.ycombinator.com | 29 Mar 2023
    One of Julia's Achilles heels is standalone, ahead-of-time compilation. Technically this is already possible [1], [2], but there are quite a few limitations when doing this (e.g. "Hello world" is 150 MB [7]) and it's not an easy or natural process.

    The immature AoT capabilities are a huge pain to deal with when writing large code packages or even when trying to make command line applications. Things have to be recompiled each time the Julia runtime is shut down. The current strategy in the community to get around this seems to be "keep the REPL alive as long as possible" [3][4][5][6], but this isn't a viable option for all use cases.

    Until Julia has better AoT compilation support, it's going to be very difficult to develop large scale programs with it. Version 1.9 has better support for caching compiled code, but I really wish there were better options for AoT compiling small, static, standalone executables and libraries.

    [1]: https://julialang.github.io/PackageCompiler.jl/dev/
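
    For reference, a minimal (hedged) sketch of the standalone-executable workflow from [1]; "MyApp" is a hypothetical local package that defines a julia_main() entry point, and the keyword shown is one of the options for trimming output size.

        using PackageCompiler

        # Build a standalone app from the (hypothetical) local package "MyApp".
        create_app("MyApp", "MyAppCompiled";
                   filter_stdlibs = true)  # drop unused standard libraries to shrink the output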

  • What's Julia's biggest weakness?
    7 projects | /r/Julia | 18 Mar 2023
    Doesn’t work on Windows, but https://github.com/JuliaLang/PackageCompiler.jl does.
  • I learned 7 programming languages so you don't have to
    8 projects | news.ycombinator.com | 12 Feb 2023
    Also, you can precompile a whole package and just ship the binary. We do this all of the time.

    https://github.com/JuliaLang/PackageCompiler.jl

    And getting things precompiled: https://sciml.ai/news/2022/09/21/compile_time/

  • Julia performance, startup.jl, and sysimages
    3 projects | /r/Julia | 19 Nov 2022
    You can have a look at PackageCompiler.jl
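
    For context, a hedged sketch of the sysimage route with PackageCompiler.jl; the package names and the precompile script below are placeholders, not from the original post.

        using PackageCompiler

        # Bake commonly used packages into a custom system image so they don't
        # recompile at every session start.
        create_sysimage([:Plots, :DataFrames];
                        sysimage_path = "custom_sysimage.so",
                        precompile_execution_file = "precompile_script.jl")

    Julia is then started with `julia --sysimage custom_sysimage.so`.
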
  • Why Julia 2.0 isn’t coming anytime soon (and why that is a good thing)
    1 project | news.ycombinator.com | 12 Sep 2022
    I think by PackageManager here you mean PackageCompiler, and yes, these improvements do not need a 2.0. v1.8 included a few things that will, in the near future, allow for building binaries without big dependencies like LLVM, and finishing this work is indeed slated for the v1.x releases. Saying "we are not doing a 2.0" is precisely saying that this is more important than things which change the user-facing language semantics.

    And TTFP does need to be addressed. It's a current shortcoming of the compiler that native and LLVM code is not cached during the precompilation stages. If such code is able to precompile into binaries, then startup time would be dramatically decreased because then a lot of package code would no longer have to JIT compile. Tim Holy and Valentin Churavy gave a nice talk at JuliaCon 2022 about the current progress of making this work: https://www.youtube.com/watch?v=GnsONc9DYg0 .

    This is all tied up with startup time; these are all in some sense the same issue. Currently, the only way to get LLVM code cached, and thus startup time essentially eliminated, is to build it into what's called the "system image". That system image is the binary that PackageCompiler builds (https://github.com/JuliaLang/PackageCompiler.jl). Julia then ships with a default system image that includes the standard library in order to remove the major chunk of code that "most" libraries share, which is why all of Julia Base works without JIT lag.

    However, that means everyone wants to have their thing, be it sparse matrices or statistics, in the standard library so that it gets the JIT-lag-free build by default. This means the system image is huge, which is why PackageCompiler, which is simply a system for building binaries by appending package code to the system image, builds big binaries.

    What needs to happen is for packages to be able to precompile in a way that then caches LLVM and native code. Then there's no major compile-time advantage to being in the system image, which will allow things to be pulled out of the system image for a leaner Julia Base build without major drawbacks, and would in turn shrink what has to be compiled into every binary. That will then make it so that an LLVM and BLAS build does not have to be in every binary (which is what takes up most of the space and RAM), which would then allow Julia to much more comfortably move beyond the niche of scientific computing.
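
    As a concrete (hedged) illustration of packages "precompiling in a way that caches native code": since Julia 1.9, a package can declare a precompile workload, for example with the later PrecompileTools.jl API, and the calls exercised there get compiled and cached into the package image. The workload below is a made-up example, not part of the comment above.

        using PrecompileTools

        @setup_workload begin
            data = rand(100)          # setup code itself is not cached
            @compile_workload begin
                sum(data)             # everything called here is compiled and cached
                sort(data)
            end
        end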

  • Is it possible to create a Python package with Julia and publish it on PyPi?
    6 projects | /r/Julia | 23 Apr 2022
  • GenieFramework – Web Development with Julia
    4 projects | news.ycombinator.com | 6 Apr 2022
  • Julia for health physics/radiation detection
    3 projects | /r/Julia | 9 Mar 2022
    You're probably dancing around the edges of what [PackageCompiler.jl](https://github.com/JuliaLang/PackageCompiler.jl) is capable of targeting. There are a few new capabilities coming online, namely [separating codegen from runtime](https://github.com/JuliaLang/julia/pull/41936) and [compiling small static binaries](https://github.com/tshort/StaticCompiler.jl), but you're likely to hit some snags on the bleeding edge.

julia

Posts with mentions or reviews of julia. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-03-06.
  • Top Paying Programming Technologies 2024
    19 projects | dev.to | 6 Mar 2024
    34. Julia - $74,963
  • Optimize sgemm on RISC-V platform
    6 projects | news.ycombinator.com | 28 Feb 2024
    I don't believe there is any official documentation on this, but https://github.com/JuliaLang/julia/pull/49430 for example added prefetching to the marking phase of a GC which saw speedups on x86, but not on M1.
  • Dart 3.3
    2 projects | news.ycombinator.com | 15 Feb 2024
    3. dispatch on all the arguments

    the first solution is clean, but people really like dispatch.

    the second makes calling functions in the function call syntax weird, because the first argument is privileged semantically but not syntactically.

    the third makes calling functions in the method call syntax weird because the first argument is privileged syntactically but not semantically.

    the closest things to this i can think of off the top of my head in remotely popular programming languages are: nim, lisp dialects, and julia.

    nim navigates the dispatch conundrum by providing different ways to define free functions for different dispatch-ness. the tutorial gives a good overview: https://nim-lang.org/docs/tut2.html

    lisps of course lack UFCS.

    see here for a discussion on the lack of UFCS in julia: https://github.com/JuliaLang/julia/issues/31779

    so to sum up the answer to the original question: because it's only obvious how to make it nice and tidy like you're wanting if you sacrifice function dispatch, which is ubiquitous for good reason!
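
    To make option 3 concrete, here is a small (hedged) Julia sketch of dispatch on all the arguments; the types and function are invented purely for illustration.

        struct Circle; r::Float64; end
        struct Square; s::Float64; end

        # No argument is privileged: the method is chosen from the types of *all* arguments.
        overlap(a::Circle, b::Circle) = "circle/circle"
        overlap(a::Circle, b::Square) = "circle/square"
        overlap(a::Square, b::Circle) = overlap(b, a)   # symmetric case reuses the method above

        overlap(Circle(1.0), Square(2.0))   # returns "circle/square"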

  • Julia 1.10 Highlights
    1 project | news.ycombinator.com | 27 Dec 2023
    https://github.com/JuliaLang/julia/blob/release-1.10/NEWS.md
  • Best Programming languages for Data Analysis📊
    4 projects | dev.to | 7 Dec 2023
    Visit official site: https://julialang.org/
  • Potential of the Julia programming language for high energy physics computing
    10 projects | news.ycombinator.com | 4 Dec 2023
    No. It runs natively on ARM.

    julia> versioninfo()
    Julia Version 1.9.3
    Commit bed2cd540a1 (2023-08-24 14:43 UTC)
    Build Info:
      Official https://julialang.org/ release

  • Rust std:fs slower than Python
    7 projects | news.ycombinator.com | 29 Nov 2023
    https://github.com/JuliaLang/julia/issues/51086#issuecomment...

    So while this "fixes" the issue, it'll introduce a confusing time delay between you freeing the memory and you observing that in `htop`.

    But according to https://jemalloc.net/jemalloc.3.html you can set `opt.muzzy_decay_ms = 0` to remove the delay.

    Still, the musl author has some reservations against making `jemalloc` the default:

    https://www.openwall.com/lists/musl/2018/04/23/2

    > It's got serious bloat problems, problems with undermining ASLR, and is optimized pretty much only for being as fast as possible without caring how much memory you use.

    With the above-mentioned tunables, this should be mitigated to some extent, but the general "theme" (focusing on e.g. performance vs memory usage) will likely still mean "it's a tradeoff" or "it's no tradeoff, but only if you set tunables to what you need".

  • Eleven strategies for making reproducible research the norm
    1 project | news.ycombinator.com | 25 Nov 2023
    I have asked about Julia's reproducibility story on the Guix mailing list in the past, and at the time Simon Tournier didn't think it was promising. I seem to recall Julia itself didn't have a reproducible build. All I know now is that the GitHub issue is still not closed.

    https://github.com/JuliaLang/julia/issues/34753

  • Julia as a unifying end-to-end workflow language on the Frontier exascale system
    5 projects | news.ycombinator.com | 19 Nov 2023
    I don't really know what kind of rebuttal you're looking for, but I will link my HN comments from when this was first posted for some thoughts: https://news.ycombinator.com/item?id=31396861#31398796. As I said in the linked post, I'm quite skeptical of the business of trying to assess the relative bugginess of programming in different systems, because that has strong dependencies on what you consider core vs packages and what exactly you're trying to do.

    However, bugs in general suck and we've been thinking a fair bit about what additional tooling the language could provide to help people avoid the classes of bugs that Yuri encountered in the post.

    The biggest class of problems in the blog post is that it's pretty clear that `@inbounds` (and I will extend this to `@assume_effects`, even though that wasn't around when Yuri wrote his post) is problematic, because it's too hard to write correctly. My proposal for what to do instead is at https://github.com/JuliaLang/julia/pull/50641.

    Another common theme is that while Julia is great at composition, it's not clear what's expected to work and what isn't, because the interfaces are informal and not checked. This is a hard design problem, because it's quite close to the reasons why Julia works well. My current thoughts on that are here: https://github.com/Keno/InterfaceSpecs.jl but there's other proposals also.
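
    For readers unfamiliar with the construct discussed above, a hedged illustration (not from the original comment) of why `@inbounds` is easy to get wrong: it removes bounds checks, so an out-of-range index silently becomes undefined behavior instead of a BoundsError.

        function sum_first_n(v::Vector{Float64}, n::Integer)
            s = 0.0
            @inbounds for i in 1:n   # unsafe whenever n > length(v)
                s += v[i]
            end
            return s
        end

        sum_first_n(rand(10), 5)      # fine
        # sum_first_n(rand(10), 20)   # may read garbage or crash instead of raising an error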

  • Getaddrinfo() on glibc calls getenv(), oh boy
    10 projects | news.ycombinator.com | 16 Oct 2023
    Doesn't musl have the same issue? https://github.com/JuliaLang/julia/issues/34726#issuecomment...

    I also wonder about OSX's libc. Newer versions seem to have some sort of locking https://github.com/apple-open-source-mirror/Libc/blob/master...

    but older versions (from 10.9) don't have any locking: https://github.com/apple-oss-distributions/Libc/blob/Libc-99...

What are some alternatives?

When comparing PackageCompiler.jl and julia you can also consider the following projects:

StaticCompiler.jl - Compiles Julia code to a standalone library (experimental)

jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

Genie.jl - 🧞The highly productive Julia web framework

NetworkX - Network Analysis in Python

LuaJIT - Mirror of the LuaJIT git repository

Lua - Lua is a powerful, efficient, lightweight, embeddable scripting language. It supports procedural programming, object-oriented programming, functional programming, data-driven programming, and data description.

Dash.jl - Dash for Julia - A Julia interface to the Dash ecosystem for creating analytic web applications in Julia. No JavaScript required.

rust-numpy - PyO3-based Rust bindings of the NumPy C-API

Transformers.jl - Julia Implementation of Transformer models

Numba - NumPy aware dynamic Python compiler using LLVM

ModelingToolkit.jl - An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system for integrated symbolics for physics-informed machine learning and automated transformations of differential equations

F# - Please file issues or pull requests here: https://github.com/dotnet/fsharp