manylinux
lhelper
manylinux
-
Building a go program with an older glibc
I use manylinux containers as the OS for compilation. They try to ensure as much cross-OS / libc compatibility as possible for precompiled libraries. https://github.com/pypa/manylinux
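A minimal sketch of that workflow, assuming a buildable project in the current directory; the image tag and Python version below are just examples, but build-inside-the-container-then-run-auditwheel is the pattern the manylinux README describes:

```shell
# Build a wheel inside a manylinux container, then let auditwheel
# verify/relabel it. Image tag and Python version are illustrative.
IMAGE=quay.io/pypa/manylinux2014_x86_64   # CentOS 7 based toolchain

if command -v docker >/dev/null; then
    docker run --rm -v "$PWD":/io "$IMAGE" bash -c '
        /opt/python/cp311-cp311/bin/pip wheel /io -w /io/dist/ &&
        auditwheel repair /io/dist/*.whl -w /io/wheelhouse/
    '
fi
```

auditwheel will refuse to apply the `manylinux2014` tag if the binary links against anything newer than the baseline glibc, which is exactly why the build happens inside the old container.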
-
Alpine Linux in the Browser
Just to clarify for anyone who isn't aware, the "compiling issues", at least historically, have been that Alpine uses musl, while PyPI's manylinux wheels are built against old glibc versions. So stuff like numpy that would trivially and quickly install from a wheel on glibc distros (like a bare-bones Ubuntu image) triggers compilation and the installation of build-only dependencies on Alpine.
That said, it looks like as of late-2021, at least some projects are offering musllinux wheels as well, per the discussion here: https://github.com/pypa/manylinux/issues/37 (not numpy, though: https://pypi.org/project/numpy/#files)
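The glibc-vs-musl split is visible right in the wheel filenames: the final tag names the platform. A small shell sketch of pulling it out (the filenames here are illustrative):

```shell
# A wheel filename encodes name-version-python-abi-platform, so
# splitting on '-' shows whether it targets glibc (manylinux) or
# musl (musllinux). These filenames are examples, not real releases.
for whl in \
    numpy-1.26.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl \
    cryptography-41.0.0-cp311-abi3-musllinux_1_1_x86_64.whl
do
    platform=${whl%.whl}       # drop the extension
    platform=${platform##*-}   # keep everything after the last '-'
    echo "$whl -> $platform"
done
```

pip compares that platform tag against the tags the running interpreter supports, which is why an Alpine system skips the manylinux wheel and falls back to building from source.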
-
Because cross-compiling binaries for Windows is easier than building natively
It's very hard. Incompatible glibc ABIs make this nigh impossible; there's a reason Steam installs a vcredistributable.dll for pretty much every game on Windows.
Look no further than the hoops you need to jump through to distribute a Linux binary on PyPI [1]. Despite tons of engineering effort, and tons of hoop jumping from packagers, getting a non-trivial binary to run across all distros is still considered functionally impossible.
[1]: https://github.com/pypa/manylinux
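The glibc coupling described above can be inspected directly: an ELF binary records the GLIBC_* symbol versions it references, and the highest one is the minimum glibc that can run it. A sketch using objdump (any dynamically linked binary works; /bin/ls is just a convenient example):

```shell
# List the GLIBC_ symbol versions a binary requires; the highest
# version shown is the minimum glibc that can run it.
if command -v objdump >/dev/null; then
    objdump -T /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -uV
fi
```

This is essentially the check auditwheel automates when deciding whether a wheel qualifies for a given manylinux tag.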
- manylinux_2_28 image is published
- manylinux_2_28 image is published (including docker environment)
-
CPython, C standards, and IEEE 754
As a user, if you build every Python package from source, it's fine. But if you are a maintainer of an OSS project and need to publish binary packages for it, you will run into trouble. Binaries built on Ubuntu 20.04 can only support Ubuntu 20.04 and newer, so you'd better choose an older Linux release to target a broader set of users. Most Python packages currently target CentOS 6 or 7. See https://github.com/pypa/manylinux/issues/1012 for more details. They need help!
-
Using Zig as Cross Platform C Toolchain
I recently learned that Clang supports this kind of cross-compiling out of the box. https://mcilloni.ovh/2021/02/09/cxx-cross-clang/
The main difference is that Clang does not ship with headers/libraries for different platforms, as Zig appears to do. You need to give Clang a "sysroot" -- a path that has the headers/libraries for the platform you want to compile for.
If you create a bunch of sysroots for various architectures, you can do some pretty "easy" cross-compiling with just a single compiler binary. Docker can be a nice way of packaging up these sysroots (especially combined with Docker images like manylinux: https://github.com/pypa/manylinux). Gone are the days when you had to build a separate GCC cross-compiler for each platform you want to target.
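A hedged sketch of that setup: clang's `--target` and `--sysroot` flags are real, but the sysroot path below is an assumption about where you unpacked the target platform's headers and libraries.

```shell
# Cross-compile hello.c for aarch64 Linux with a plain clang binary,
# pointing it at a sysroot directory (the path here is hypothetical).
SYSROOT=/opt/sysroots/aarch64-linux-gnu

if command -v clang >/dev/null && [ -d "$SYSROOT" ] && [ -f hello.c ]; then
    clang --target=aarch64-linux-gnu --sysroot="$SYSROOT" \
          -fuse-ld=lld -o hello-arm64 hello.c
fi
```

Using lld as the linker avoids needing a separate GNU ld built for each target, which is what makes the single-compiler-plus-sysroots approach work.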
- “LLVM-Libc” C Standard Library
-
'Python: Please stop screwing over Linux distros'
Now suppose you use manylinux to build (https://github.com/pypa/manylinux): you are then based on the CentOS 7 toolchain (at best, if you use manylinux2014) or the Debian 9 toolchain (if you use manylinux_2_24).
-
Building Outer Wonders for Linux
I think the generally accepted way to do that would be a container image running a relatively old distribution. This is exactly what Python packages do when they need to distribute binary packages on Linux [0]. You are supposed to compile the package in a container (or VM) running CentOS 7 (or older if you want broader support), although the baseline is now gradually moving to Debian 9.
[0]: https://github.com/pypa/manylinux
lhelper
-
C++ Package Managers: The Ultimate Roundup
I have made my own tool to solve this problem:
https://github.com/franko/lhelper
It is currently used by nobody except me but it serves me well.
Its strong points are:
- it works on macOS, Linux and Windows
-
Building Outer Wonders for Linux
I agree; for their use case they should have linked SDL2 as a static library. That would have been a better option than patching the binary with patchelf.
I think many developers don't know about static libraries. For some reason they think they have to use shared libraries. This is probably due to many tutorials saying something like: download the SDL2 library from here, it is provided as a shared library, and here are the instructions for using it with Visual Studio.
The way I do it is with static libraries: I ship a single executable with all the third-party libraries linked in statically. The only dynamic libraries the executable uses are the standard libraries that are part of the OS.
I do this very easily with the help of lhelper:
https://github.com/franko/lhelper (I am the author)
It has recipes to build many libraries, including SDL2. It builds each library on your system using your compiler and your settings, and by default it builds static libraries, so you don't have to bother distributing additional dynamic libraries.
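For illustration, a static SDL2 link with plain gcc might look like the sketch below. sdl2-config and its --static-libs flag ship with SDL2, but this assumes SDL2 was built and installed as a static library (as lhelper does by default) and that a main.c exists:

```shell
# Link SDL2 statically so the executable carries the library with it.
SDL2_CONFIG=sdl2-config

if command -v "$SDL2_CONFIG" >/dev/null && [ -f main.c ]; then
    gcc -o game main.c $("$SDL2_CONFIG" --cflags --static-libs)
    ldd game   # on a successful static link, no libSDL2 line appears
fi
```

Running ldd on the result is a quick sanity check that only OS-level libraries remain as runtime dependencies.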
What are some alternatives?
auditwheel - Auditing and relabeling cross-distribution Linux wheels.
dockcross - Cross compiling toolchains in Docker images
musl-cross-make - Simple makefile-based build for musl cross compiler
itch - 🎮 The best way to play your itch.io games
glibc_version_header - Build portable Linux binaries without using an ancient distro
SDL - Simple Directmedia Layer
mxe - MXE (M cross environment)
OpenRCT2 - An open source re-implementation of RollerCoaster Tycoon 2 🎢
hexchat - GTK+ IRC client
padio - Zero pad numeric filenames
the-bread-code - Learn how to master the art of baking the programmer way.