Well, at least it's that easy as long as what you're trying to compile doesn't have C dependencies. For C dependencies, there is cross <https://github.com/japaric/rust-cross>, which I've had good experiences with.
Using CMake to set up your build is like trying to do one-handed pushups. Use https://mesonbuild.com/ and make your life as comfortable as a couch potato's.
This is a little out of date; the .NET single-file situation is pretty good as of .NET 5 (which came out in 2020).
These days, if you publish a single-file app all .NET libraries live in the executable and do not need to be unpacked at runtime. Native libraries still need to get unpacked at runtime, but you just set the IncludeNativeLibrariesForSelfExtract build property and everything else happens automatically for you.
I'm working on getting IncludeNativeLibrariesForSelfExtract set by default, upvote here if you think that's a good idea: https://github.com/dotnet/sdk/issues/24181
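For reference, this is roughly what the project file looks like with those options turned on (a sketch, not a complete project; the property names are the ones documented for the .NET SDK):

```xml
<!-- In the application's .csproj -->
<PropertyGroup>
  <!-- Pack all managed assemblies into one executable -->
  <PublishSingleFile>true</PublishSingleFile>
  <!-- Include the runtime so no .NET install is needed on the target machine -->
  <SelfContained>true</SelfContained>
  <!-- Bundle native libraries too; they self-extract on first run -->
  <IncludeNativeLibrariesForSelfExtract>true</IncludeNativeLibrariesForSelfExtract>
</PropertyGroup>
```

Then `dotnet publish -r linux-x64` (or whatever runtime identifier you target) produces the single file.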
> There's something nice about the ergonomics of a single file compared to a single folder though
Agreed. We are going to move to proper single file (or as close as we can get) as soon as any sort of meaningful AOT compilation is available:
https://github.com/dotnet/runtimelab/tree/feature/NativeAOT
https://github.com/dotnet/runtimelab/issues/248
There is a "PublishSingleFile" option, but that is just a zip file in disguise.
> By cross-compiling one usually understands compiling for the same OS but different architecture.
I don't even consider that to rise to the level of "cross compiling".
Getting started with emscripten to target WASM for C and C++ is rather a chore of dependency wrangling IME. Targeting WASM from Rust, OTOH, is trivial. Targeting Windows from Linux with Rust is also quite straightforward, as has been experimenting with targeting consoles or Android from Windows.
Targeting a MIPS32 OpenDingux target from Windows was much more of a chore. The toolchain with libs, headers, etc. that I used is just a *.tar.bz2 that expects to be extracted to /opt/gcw0-toolchain of a linux distro specifically, and embedded absolute paths all over the place make changing that difficult. I do resort to WSL on Windows, basically only because of those embedded paths: https://github.com/MaulingMonkey/rust-opendingux-test
Acquiring the appropriate libs and headers to link/compile against for cross compiling is always an adventure, but Rust isn't making things any worse IME.
As I see it, part of the drive behind tools like Scoop is to overcome the limitations of the binary-shipping strategy common to Windows developers. They are successful at this, I agree, but only partially successful. They come from the tradition of programs like Ninite, which were explicitly built as ways to make the binary approach suck less than it did before.
I see the success of these programs as essentially stemming from the insertion of user interests in the form of a maintainer-like process. Sure, they're still working with the binaries, but the actual process of installing and managing these binaries is controlled by users, for users: https://github.com/ScoopInstaller/Main/tree/master/bucket
This means that you get moderation and in many cases modification to the behavior of the program. In a freeware environment like Windows that's full of shitware, at the very least you can in many cases strip out the ads. That's absolutely not nothing, but at the end of the day it comes from a group of user-maintainers stepping up and saying to developers that no, you cannot simply do whatever you want on my system with your software. That's ... sort of the whole point of a software distribution, in the Linux world!
When I want the latest version of a CLI tool on Linux, I simply `pacman -S package`. That's it; one command. I don't see how it could be any simpler or better than that, and on top of that I'm getting the benefits of moderation and integration with the rest of my system. Perhaps you are emphasizing latest version here, and hinting that you don't get that on Linux distros? That depends entirely on the distro; a software distribution is (roughly) a collection of user interests. An Arch user wants (and gets) the latest versions of all upstream software. A Debian user does not want this or see constant updating to the latest version as an advantage, so that's not what they get.
"Java -- possible, but you'll need a startup script to just call java -jar some-tool.jar; (also not a good fit for short-lived tools, mainly due to startup times;)"
Two technologies to look at:
* Warp Packer -- https://github.com/dgiagio/warp/
* Liberica Native Image Kit -- https://bell-sw.com/pages/liberica-native-image-kit/
Warp Packer bundles my JavaFX desktop application, KeenWrite, into single-binary executable files:
* https://github.com/DaveJarvis/keenwrite/releases/download/2.... (Linux)
* https://github.com/DaveJarvis/keenwrite/releases/download/2.... (Windows)
* https://github.com/DaveJarvis/keenwrite/releases/download/2....
The start-up time for the first launch of the .bin or .exe is slow because it unpacks to a user directory. Subsequent starts are fine enough, even when running from the command-line as a short-lived task. Here's the script that creates the self-contained executable files:
https://github.com/DaveJarvis/keenwrite/blob/master/installe...
To create a release for all three files, I run a single shell script from a build machine:
https://github.com/DaveJarvis/keenwrite/blob/master/release....
I could probably generate a binary for macOS, but not enough people have asked.
Good point!
I'll add to that READMEs, LICENSEs, SBOMs (Software Bills of Materials), example configuration files, etc. How do you supply all those files when all one ships is a single binary executable?
Simple! Bundle everything in the executable.
As a bonus, because the tool outputs these files, it can now generate them dynamically. For example, instead of a bland configuration file with all the possible integrations commented out, it could either try to auto-detect where it's running and what's available, or present the user with a question-and-answer session to fill in the details.
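The bundling side of this is cheap to implement. A minimal sketch in Go (the tool name and README text here are hypothetical; a real tool would embed the actual files at build time, e.g. with `//go:embed`):

```go
package main

import (
	"flag"
	"fmt"
	"os"
)

// readme stands in for content that a real tool would embed at
// compile time (e.g. via //go:embed README.md); the text here is
// a placeholder for illustration.
const readme = "# my-tool\n\nA single-binary CLI that carries its own docs.\n"

// bundledDoc returns the bundled document with the given name,
// or false if the tool does not carry one under that name.
func bundledDoc(name string) (string, bool) {
	docs := map[string]string{
		"readme": readme,
	}
	text, ok := docs[name]
	return text, ok
}

func main() {
	showReadme := flag.Bool("readme", false, "print the bundled README and exit")
	flag.Parse()
	if *showReadme {
		text, _ := bundledDoc("readme")
		fmt.Print(text)
		os.Exit(0)
	}
	// ... normal tool behaviour would go here ...
}
```

From there, a `--config` flag that runs the auto-detection before printing is just another entry in the same table.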
----
For example, a pet project of mine <https://github.com/volution/z-run>:
  z-run --readme        # shows the README with `less` (if on a TTY) or on `stdout`
  z-run --readme-html   # the HTML version, to be opened in `lynx`
  z-run --manual        # or --manual-man or --manual-html
  z-run --sbom          # or --sbom-json or --sbom-html
It even gives you the source code:
  z-run --sources-cpio | cpio -t
So, does your tool need a `.desktop` file? Just create a flag for that.
Or, if there are too many such extra files to place individually, provide an `--extras-cpio` flag and dump them all as an archive. And if placing them requires some work, provide an `--extras-install`, but before reaching for `sudo`, kindly ask the user for permission.
Granted, all this requires some extra work and increases the bulkiness of the executable, but:
* all that extra code can be extracted into a common library; (I intend to do that for my software;)
* if all these are compressed, especially being text-only, they are a fraction of the final executable;
----
I am especially proud of the `--sources-cpio` option. Is something broken with a particular version of the tool that you rely on? Great: instead of bumbling around GitHub to find the particular commit that was used to build this particular version, I can just get the sources from the tool itself and use those. All I need is the build tools, which in the case of Go is another `.tar.gz`.