manylinux vs python-manylinux-demo

| | manylinux | python-manylinux-demo |
|---|---|---|
| Mentions | 13 | 1 |
| Stars | 1,355 | 219 |
| Growth (monthly) | 1.8% | 0.0% |
| Activity | 8.8 | 0.0 |
| Last commit | 4 days ago | about 3 years ago |
| Language | Shell | C |
| License | MIT License | Creative Commons Zero v1.0 Universal |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
manylinux
-
Building a go program with an older glibc
I use manylinux containers as the OS for compilation. They try to ensure as much cross-OS / libc compatibility as possible for precompiled libraries. https://github.com/pypa/manylinux
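A minimal sketch of what "using a manylinux container for compilation" looks like in practice. The image names are the real `quay.io/pypa` ones, but `build.sh` is a hypothetical build script you would mount into the container, and the helper below is purely illustrative:

```python
# Choosing a manylinux image by glibc baseline and assembling the
# docker command that runs a build inside it.

# glibc baseline -> official manylinux image (x86_64 variants)
IMAGES = {
    (2, 17): "quay.io/pypa/manylinux2014_x86_64",
    (2, 28): "quay.io/pypa/manylinux_2_28_x86_64",
}

def docker_build_cmd(glibc_baseline, workdir="/io"):
    """Return the docker argv for building inside the matching container."""
    image = IMAGES[glibc_baseline]
    return ["docker", "run", "--rm",
            "-v", f"$PWD:{workdir}",      # mount the project into the container
            image, f"{workdir}/build.sh"]  # hypothetical build script

print(" ".join(docker_build_cmd((2, 17))))
# docker run --rm -v $PWD:/io quay.io/pypa/manylinux2014_x86_64 /io/build.sh
```

The older the image's glibc baseline, the more distros the resulting binaries can run on.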
-
Alpine Linux in the Browser
Just to clarify for anyone who isn't aware, the "compiling issues", at least historically, have been that Alpine uses musl, while PyPI's manylinux wheels are built against old glibc versions. So packages like numpy that would trivially and quickly install from a wheel on glibc distros (like a bare-bones Ubuntu image) instead trigger compilation and the installation of build-only dependencies on Alpine.
That said, it looks like as of late-2021, at least some projects are offering musllinux wheels as well, per the discussion here: https://github.com/pypa/manylinux/issues/37 (not numpy, though: https://pypi.org/project/numpy/#files)
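The glibc-vs-musl mismatch above comes down to the wheel's platform tag. Here is a hand-rolled sketch (not the real `packaging` library's tag logic) of the compatibility check for PEP 600-style tags:

```python
# Why pip on Alpine skips manylinux wheels: a glibc wheel never matches
# a musl system, so pip falls back to the sdist and compiles from source.

def wheel_libc(platform_tag):
    """Return ("glibc"|"musl", (major, minor)) for a PEP 600-style tag."""
    if platform_tag.startswith("manylinux_"):
        _, major, minor, _arch = platform_tag.split("_", 3)
        return "glibc", (int(major), int(minor))
    if platform_tag.startswith("musllinux_"):
        _, major, minor, _arch = platform_tag.split("_", 3)
        return "musl", (int(major), int(minor))
    raise ValueError(f"unsupported tag: {platform_tag}")

def compatible(platform_tag, system_libc, system_version):
    libc, needed = wheel_libc(platform_tag)
    return libc == system_libc and needed <= system_version

print(compatible("manylinux_2_17_x86_64", "musl", (1, 2)))  # False
print(compatible("musllinux_1_1_x86_64", "musl", (1, 2)))   # True
```

This is why projects shipping musllinux wheels spare Alpine users the compile step.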
-
Because cross-compiling binaries for Windows is easier than building natively
It's very hard. Incompatible glibc ABIs make this nigh impossible, there's a reason Steam installs a vcredistributable.dll for pretty much every game on Windows.
Look no further than the hoops you need to jump through to distribute a Linux binary on PyPI [1]. Despite tons of engineering effort, and tons of hoop-jumping from packagers, getting a non-trivial binary to run across all distros is still considered functionally impossible.
[1]: https://github.com/pypa/manylinux
- manylinux_2_28 image is published
- manylinux_2_28 image is published (including docker environment)
-
CPython, C standards, and IEEE 754
As a user, if you build every Python package from source, it's OK. But if you are a maintainer of an OSS project and need to publish binary packages for it, then you will run into trouble. Binaries built on Ubuntu 20.04 can only support Ubuntu 20.04 and newer, so you'd better choose an older Linux release to target a broader set of users. Right now most Python packages build on CentOS 6 or 7. See https://github.com/pypa/manylinux/issues/1012 for more details. They need help!
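The "only newer distros" rule exists because a dynamically linked binary records versioned glibc symbols (the kind `objdump -T binary | grep GLIBC_` shows), and the loader rejects the binary if any required version exceeds the system's glibc. A sketch with a made-up symbol table:

```python
# A binary runs on a system iff the highest GLIBC_x.y version it
# imports is <= the system's glibc version.

def max_glibc_required(symbols):
    """Highest GLIBC_x.y version among the binary's imported symbols."""
    versions = []
    for sym in symbols:
        _name, _, ver = sym.partition("@GLIBC_")
        if ver:
            versions.append(tuple(int(p) for p in ver.split(".")))
    return max(versions)

def runs_on(symbols, system_glibc):
    return max_glibc_required(symbols) <= system_glibc

# Hypothetical import table of a binary built on Ubuntu 20.04 (glibc 2.31):
syms = ["memcpy@GLIBC_2.14",
        "pthread_cond_wait@GLIBC_2.3.2",
        "getrandom@GLIBC_2.25"]
print(runs_on(syms, (2, 17)))  # False -> CentOS 7 (glibc 2.17) rejects it
print(runs_on(syms, (2, 31)))  # True
```

Building inside an old-distro container keeps every imported symbol version low, which is the whole manylinux trick.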
-
Using Zig as Cross Platform C Toolchain
I recently learned that Clang supports this kind of cross-compiling out of the box. https://mcilloni.ovh/2021/02/09/cxx-cross-clang/
The main difference is that Clang does not ship with headers/libraries for different platforms, as Zig appears to do. You need to give Clang a "sysroot" -- a path that has the headers/libraries for the platform you want to compile for.
If you create a bunch of sysroots for various architectures, you can do some pretty "easy" cross-compiling with just a single compiler binary. Docker can be a nice way of packaging up these sysroots (especially combined with Docker images like manylinux: https://github.com/pypa/manylinux). Gone are the days when you had to build a separate GCC cross-compiler for each platform you want to target.
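To make the sysroot idea concrete, here is an illustrative helper that assembles the clang invocation for each target. `--target` and `--sysroot` are real clang options; the sysroot paths and the triples-to-paths mapping are hypothetical:

```python
# One clang binary, many targets: point --sysroot at the directory
# holding that platform's headers and libraries.

SYSROOTS = {  # target triple -> where its headers/libraries live (hypothetical)
    "aarch64-linux-gnu": "/opt/sysroots/aarch64",
    "x86_64-linux-gnu":  "/opt/sysroots/x86_64",
}

def cross_cc(triple, source, output):
    """argv for cross-compiling `source` with a single clang install."""
    return ["clang",
            f"--target={triple}",
            f"--sysroot={SYSROOTS[triple]}",
            "-fuse-ld=lld",   # lld also cross-links out of the box
            source, "-o", output]

print(" ".join(cross_cc("aarch64-linux-gnu", "main.c", "main")))
# clang --target=aarch64-linux-gnu --sysroot=/opt/sysroots/aarch64 -fuse-ld=lld main.c -o main
```

Packaging each sysroot as a Docker image (e.g. a manylinux-based one) keeps the host toolchain to a single clang install.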
- “LLVM-Libc” C Standard Library
-
'Python: Please stop screwing over Linux distros'
Now you come along and use manylinux to build (https://github.com/pypa/manylinux), so you are based on the CentOS 7 toolchain (at best, if you use manylinux2014) or the Debian 9 toolchain (if you use manylinux_2_24).
-
Building Outer Wonders for Linux
I think the generally accepted way to do that would be a container image running a relatively old distribution. This is exactly what Python packages do when they need to distribute binary packages on Linux [0]. You are supposed to compile the package in a container (or VM) that runs CentOS 7 (or older if you want broader support), although the baseline is now gradually moving to Debian 9.
[0]: https://github.com/pypa/manylinux
python-manylinux-demo
-
Is there an in-depth description of packaging python dependencies?
I was using manylinux as suggested by its example, and it gathered the libraries via auditwheel and included them in the package. To my knowledge, there is no way to exclude libraries from the package, because it cannot have external dependencies apart from a predefined whitelist of libraries. The linux_*.whl tags cannot be used on PyPI for binary packages.
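The rule the comment describes can be sketched as follows. This is a simplification, not auditwheel's real API, and the whitelist below is a tiny excerpt of the manylinux2014 policy:

```python
# auditwheel's core decision: a wheel may only depend on a small policy
# whitelist of system libraries; everything else gets copied ("grafted")
# into the wheel and the binary is relinked against the bundled copy.

# Tiny excerpt of the manylinux2014 policy; the real list is longer.
POLICY_WHITELIST = {"libc.so.6", "libm.so.6", "libpthread.so.0", "libdl.so.2"}

def libs_to_vendor(external_deps):
    """Shared libraries that must be bundled into the wheel."""
    return sorted(set(external_deps) - POLICY_WHITELIST)

deps = ["libc.so.6", "libm.so.6", "libopenblas.so.0", "libgfortran.so.5"]
print(libs_to_vendor(deps))  # ['libgfortran.so.5', 'libopenblas.so.0']
```

This is why there is no supported way to leave a non-whitelisted library out: a manylinux wheel must run with no external dependencies beyond the policy list.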
What are some alternatives?
auditwheel - Auditing and relabeling cross-distribution Linux wheels.
musl-cross-make - Simple makefile-based build for musl cross compiler
glibc_version_header - Build portable Linux binaries without using an ancient distro
mxe - MXE (M cross environment)
lhelper - A simple utility to help compile and install C/C++ libraries on Windows and Linux
SDL - Simple DirectMedia Layer
padio - Zero pad numeric filenames
llvm-mingw - An LLVM/Clang/LLD based mingw-w64 toolchain
project-azua - Data Efficient Decision Making
mach - zig game engine & graphics toolkit
Poetry - Python packaging and dependency management made easy