| | manylinux | BinaryBuilder.jl |
|---|---|---|
| Mentions | 13 | 5 |
| Stars | 1,355 | 379 |
| Growth | 1.8% | 1.1% |
| Activity | 8.8 | 6.5 |
| Latest commit | 4 days ago | 4 days ago |
| Language | Shell | Julia |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
manylinux
-
Building a go program with an older glibc
I use manylinux containers as the OS for compilation. It tries to ensure as much cross-OS / libc compatibility as possible for precompiled libraries. https://github.com/pypa/manylinux
-
Alpine Linux in the Browser
Just to clarify for anyone who isn't aware, the "compiling issues", at least historically, have been that Alpine uses musl, and PyPI's manylinux wheels are built against old glibc versions. So stuff like numpy that would trivially and quickly install from a whl on glibc distros (like a bare-bones Ubuntu image) triggers compilation and the installation of build-only dependencies on Alpine.
That said, it looks like as of late-2021, at least some projects are offering musllinux wheels as well, per the discussion here: https://github.com/pypa/manylinux/issues/37 (not numpy, though: https://pypi.org/project/numpy/#files)
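For anyone unsure which case applies to their own environment, here is a minimal sketch (using only the standard library's `platform.libc_ver()`) that hints at whether glibc-targeted manylinux wheels are usable. Note this is an illustrative heuristic: musl-based systems such as Alpine typically do not report a glibc name here.

```python
# minimal sketch: detect which libc the running interpreter is linked against
import platform

libc, version = platform.libc_ver()
if libc == "glibc":
    print(f"glibc {version}: manylinux wheels should install directly")
else:
    # Alpine/musl systems typically report no glibc here; pip would need
    # musllinux wheels or fall back to building from source
    print("no glibc detected: expect musllinux wheels or source builds")
```

On a bare-bones Ubuntu image this prints the glibc branch; on Alpine it takes the fallback branch, which is exactly the situation the comment above describes.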
-
Because cross-compiling binaries for Windows is easier than building natively
It's very hard. Incompatible glibc ABIs make this nigh impossible, there's a reason Steam installs a vcredistributable.dll for pretty much every game on Windows.
Look no further than the hoops you need to jump through to distribute a Linux binary on PyPI [1]. Despite tons of engineering effort, and tons of hoop jumping from packagers, getting a non-trivial binary to run across all distros is still considered functionally impossible.
[1]: https://github.com/pypa/manylinux
- manylinux_2_28 image is published
- manylinux_2_28 image is published (including docker environment)
-
CPython, C standards, and IEEE 754
As a user, if you build every Python package from source, it's OK. But if you are a maintainer of an OSS project and you need to publish binary packages for it, then you will hit trouble. Binaries built on Ubuntu 20.04 can only support Ubuntu 20.04 and newer. So you'd better choose an older Linux release to reach a broader set of users. Right now most Python packages target CentOS 6 or 7. See https://github.com/pypa/manylinux/issues/1012 for more details. They need help!
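The underlying reason is that glibc is backward compatible but not forward compatible: a binary runs only on systems whose glibc is at least the version it was built against. A toy sketch (the version tuples are real distro facts, the function is just for illustration) makes this concrete:

```python
# toy sketch: glibc version compatibility as a simple tuple comparison
UBUNTU_20_04 = (2, 31)  # glibc shipped by Ubuntu 20.04
CENTOS_7 = (2, 17)      # glibc shipped by CentOS 7

def runs_on(built_against, system_glibc):
    """A binary runs if the system glibc is >= the glibc it was built against."""
    return system_glibc >= built_against

print(runs_on(CENTOS_7, UBUNTU_20_04))   # True: build on old, run on newer
print(runs_on(UBUNTU_20_04, CENTOS_7))   # False: build on new, fails on older
```

This is why the build environment, not the target audience's environment, sets the compatibility floor, and why maintainers build in containers running old distros.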
-
Using Zig as Cross Platform C Toolchain
I recently learned that Clang supports this kind of cross-compiling out of the box. https://mcilloni.ovh/2021/02/09/cxx-cross-clang/
The main difference is that Clang does not ship with headers/libraries for different platforms, as Zig appears to do. You need to give Clang a "sysroot" -- a path that has the headers/libraries for the platform you want to compile for.
If you create a bunch of sysroots for various architectures, you can do some pretty "easy" cross-compiling with just a single compiler binary. Docker can be a nice way of packaging up these sysroots (especially combined with Docker images like manylinux: https://github.com/pypa/manylinux). Gone are the days when you had to build a separate GCC cross-compiler for each platform you want to target.
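As a rough illustration of the flags involved, here is a sketch that merely assembles such a clang invocation without running it. The `--target` and `--sysroot` flags are real clang options; the target triple and sysroot path are made-up examples.

```python
# sketch: building (not executing) a clang cross-compile command line
import shlex

def clang_cross_cmd(source, triple, sysroot, output="a.out"):
    """Assemble a clang invocation that cross-compiles via --target/--sysroot."""
    return ["clang", f"--target={triple}", f"--sysroot={sysroot}",
            "-fuse-ld=lld", source, "-o", output]

cmd = clang_cross_cmd("hello.c", "aarch64-linux-gnu", "/opt/sysroots/aarch64")
print(shlex.join(cmd))
```

Swapping the triple and sysroot is all it takes to retarget, which is the "single compiler binary plus a bunch of sysroots" workflow the comment describes.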
- “LLVM-Libc” C Standard Library
-
'Python: Please stop screwing over Linux distros'
Now you come and use manylinux (https://github.com/pypa/manylinux) to build, so you are based on the CentOS 7 toolchain (at best, if you use manylinux2014) or the Debian 9 toolchain (if you use manylinux_2_24).
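The mapping from manylinux tags to glibc baselines can be sketched as follows. The legacy aliases come from PEPs 513/571/599, while PEP 600-style tags such as manylinux_2_24 encode the glibc floor directly in the tag; this parser is only an illustrative approximation, not the official tag-matching logic.

```python
# sketch: recover the glibc floor implied by a wheel's manylinux platform tag
import re

# glibc floors for the legacy aliases (CentOS 5, 6, and 7 respectively)
LEGACY = {"manylinux1": (2, 5), "manylinux2010": (2, 12), "manylinux2014": (2, 17)}

def glibc_floor(platform_tag):
    """Return the (major, minor) glibc version a manylinux tag implies, or None."""
    # PEP 600 tags embed the version, e.g. manylinux_2_24 -> glibc 2.24 (Debian 9)
    m = re.search(r"manylinux_(\d+)_(\d+)", platform_tag)
    if m:
        return int(m.group(1)), int(m.group(2))
    for alias, floor in LEGACY.items():
        if platform_tag.startswith(alias):
            return floor
    return None

print(glibc_floor("manylinux2014_x86_64"))   # (2, 17), i.e. the CentOS 7 baseline
print(glibc_floor("manylinux_2_24_x86_64"))  # (2, 24), i.e. the Debian 9 baseline
```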
-
Building Outer Wonders for Linux
I think the generally accepted way to do that would be a container image running a relatively old distribution. This is exactly what Python packages do when they need to distribute binary packages on Linux [0]. You are supposed to compile the package in a container (or VM) that runs CentOS 7 (or older if you want broader support), although the baseline is now gradually moving to Debian 9.
[0]: https://github.com/pypa/manylinux
BinaryBuilder.jl
-
Is Julia suitable today as a scripting language?
There are some efforts and the startup times are getting better with every release and there's BinaryBuilder.jl.
-
Because cross-compiling binaries for Windows is easier than building natively
There is the Julia package https://github.com/JuliaPackaging/BinaryBuilder.jl, which creates an environment that fakes being another platform, but with the correct compilers and SDKs. It's used to build all the binary dependencies.
-
Discussion Thread
https://binarybuilder.org/. You can do it manually obviously, but this is easier.
-
PyTorch: Where we are headed and why it looks a lot like Julia (but not exactly)
> The main pain point is probably the lack of standard, multi-environment packaging solutions for natively compiled code.
Are you talking about something like BinaryBuilder.jl[1], which provides native binaries as julia-callable wrappers?
--
[1] https://binarybuilder.org
-
What to do about GPU packages on PyPI?
Julia did that for binary dependencies for a few years, with adapters for several linux platforms, homebrew, and for cross-compiled RPMs for Windows. It worked, to a degree -- less well on Windows -- but the combinatorial complexity led to many hiccups and significant maintenance effort. Each Julia package had to account for the peculiarities of each dependency across a range of dependency versions and packaging practices (linkage policies, bundling policies, naming variations, distro versions) -- and this is easier in Julia than in (C)Python because shared libraries are accessed via locally-JIT'd FFI, so there is no need to eg compile extensions for 4 different CPython ABIs (Julia also has syntactic macros which can be helpful here).
To provide a better experience for both package authors and users, as well as reducing the maintenance burden, the community has developed and migrated to a unified system called BinaryBuilder (https://binarybuilder.org) over the past 2-3 years. BinaryBuilder allows targeting all supported platforms with a single build script and also "audits" build products for common compatibility and linkage snafus (similar to some of the conda-build tooling and auditwheel). There was a nice talk at AlpineConf recently (https://alpinelinux.org/conf/) covering some of this history and detailing BinaryBuilder, although I'm not sure how to link into the video.
All that to say: it can work to an extent, but it has been tried various times before. The fact that conda and manylinux don't use system packages was not born of inexperience, either. The idea of "make binaries a distro packager's problem" sounds like a simplifying step, but it doesn't necessarily work out.
What are some alternatives?
auditwheel - Auditing and relabeling cross-distribution Linux wheels.
functorch - functorch is JAX-like composable function transforms for PyTorch.
musl-cross-make - Simple makefile-based build for musl cross compiler
Yggdrasil - Collection of builder repositories for BinaryBuilder.jl
glibc_version_header - Build portable Linux binaries without using an ancient distro
HTTP.jl - HTTP for Julia
mxe - MXE (M cross environment)
dh-virtualenv - Python virtualenvs in Debian packages
lhelper - A simple utility to help compile and install C/C++ libraries on Windows and Linux
RDKit - The official sources for the RDKit library
SDL - Simple DirectMedia Layer
StarWarsArrays.jl - Arrays indexed as the order of Star Wars movies