gittup
sccache
| | gittup | sccache |
|---|---|---|
| Mentions | 1 | 70 |
| Stars | 57 | 5,319 |
| Growth | - | 2.5% |
| Activity | 10.0 | 9.5 |
| Latest commit | almost 3 years ago | 10 days ago |
| Language | HTML | Rust |
| License | - | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
gittup
Ccache – a fast C/C++ compiler cache
I wrote a little bit about this a couple of years ago: https://blog.williammanley.net/2020/05/25/unlock-software-fr...
gittup[1] implements part of this idea, but without the distributed bit. bazel[2] implements the distributed bit (minus trust), but not the distro bit. What's really lacking is momentum around the idea to get a sufficient number of people behind it.
sccache
Mozilla sccache: cache with cloud storage
Worth noting that the first commit in the sccache git repository was in 2014 (https://github.com/mozilla/sccache/commit/115016e0a83b290dc2...). So I suppose that whatever "happened" happened way back.
Welcome to Apache OpenDAL
Target files are huge and I'm running out of storage on my Mac.
If you have lots of shared dependencies, maybe try sccache?
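For anyone trying this, a minimal sccache-with-Cargo setup looks roughly like the following; the cache directory and size below are illustrative choices, not defaults:

```shell
# Install sccache and route Cargo's rustc invocations through it.
cargo install sccache
export RUSTC_WRAPPER=sccache

# Optional: pick where the local disk cache lives and cap its size.
export SCCACHE_DIR="$HOME/.cache/sccache"   # illustrative path
export SCCACHE_CACHE_SIZE="10G"

# Build as usual; repeated builds of shared dependencies hit the cache.
cargo build

# Inspect hit/miss counters.
sccache --show-stats
```

With many workspaces sharing the same dependency versions, the second and later builds mostly pull object files from the cache instead of recompiling.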
S3 Express Is All You Need
I'm going to set up sccache [0] to use it tomorrow. We use MSVC, so EFS is off the cards.
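For reference, pointing sccache at an S3-compatible bucket is done through environment variables; a sketch, with the bucket name and region as placeholders:

```shell
# Route compiler invocations (rustc, or MSVC's cl.exe via sccache's
# MSVC support) through sccache.
export RUSTC_WRAPPER=sccache

# S3 backend configuration; bucket and region are placeholders.
export SCCACHE_BUCKET=my-build-cache
export SCCACHE_REGION=us-east-1

# AWS credentials are picked up from the usual sources
# (environment variables, shared credentials file, or an IAM role).
```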
sccache
Serde has started shipping precompiled binaries with no way to opt out
I think the primary benefit of pre-built proc macros will be for build servers that don't use a persistent cache (like sccache), since they have to compile all dependencies every time. But IMO improved support for persistent caches would be a better investment than adding support for pre-built proc macros.
Cache dependencies across crates
Check out https://github.com/mozilla/sccache
Distcc: A fast, free distributed C/C++ compiler
https://github.com/mozilla/sccache is another option which addresses the use cases of both icecream and ccache (and also supports Rust, and cloud storage of artifacts, if those are useful for you)
How to fix Rust Coding LARGE files????
That being said, a compilation cache will help. The de facto standard for Rust is sccache (https://github.com/mozilla/sccache): it compiles and stores some of the build artifacts in a central location, though still separately for each crate version plus build profile (RUSTFLAGS) combination.
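To illustrate that keying: two builds of the same crate with different RUSTFLAGS land in separate cache entries, so only an identical flag set gets a hit. A sketch:

```shell
export RUSTC_WRAPPER=sccache

# First build populates the cache for this flag combination.
RUSTFLAGS="-C opt-level=2" cargo build

# Same flags again: cache hit, little compilation work.
RUSTFLAGS="-C opt-level=2" cargo build

# Different flags: different cache key, dependencies compile again.
RUSTFLAGS="-C opt-level=3" cargo build

sccache --show-stats   # compare hit/miss counts between runs
```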
On the verge of giving up learning Haskell because of the terrible tooling.
That's definitely not my experience. I've never had any issue running Rust on Windows. You just download and run rustup-init.exe, and updating is simply a matter of rustup update. Documentation generation is built in (cargo doc): you annotate code with triple-slash (///) markdown comments and run that command. sccache works fine for me (just set RUSTC_WRAPPER=/path/to/sccache). And the error messages from rustc are by far the best of any compiler I've used. Far from unhelpful, they tend to explain step by step what the problem is and how to fix it.
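That workflow, sketched as Windows cmd commands (the sccache path is a placeholder, not a real install location):

```shell
rustup update                            :: update the toolchain in place
cargo doc --open                         :: build docs from /// comments and open them
set RUSTC_WRAPPER=C:\tools\sccache.exe   :: illustrative path to sccache
cargo build                              :: subsequent rustc calls go through sccache
```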
What are some alternatives?
poudriere - Port/Package build and test system
ccache - ccache – a fast compiler cache
icecream - Distributed compiler with a central scheduler to share build load
cargo-chef - A cargo-subcommand to speed up Rust Docker builds using Docker layer caching.
ZQuestClassic - ZQuest Classic is a game engine for creating games similar to the original NES Zelda
rust-cache - A GitHub Action that implements smart caching for rust/cargo projects
confcache
cache - Cache dependencies and build outputs in GitHub Actions
mold - Mold: A Modern Linker 🦠
fluvio - Lean and mean distributed stream processing system written in Rust and WebAssembly.
gdnative - Rust bindings for Godot 3