LZ4 vs zstd
| | LZ4 | zstd |
|---|---|---|
| Mentions | 21 | 96 |
| Stars | 8,294 | 20,277 |
| Growth | 2.6% | 2.1% |
| Activity | 8.7 | 9.7 |
| Latest commit | 7 days ago | 5 days ago |
| Language | C | C |
| License | GNU General Public License v3.0 or later | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
LZ4
- Rsyncing 20TB locally
According to the benchmark values at https://github.com/lz4/lz4 (single-core compression is listed at well under 1 GB/s), you would need around ten fairly modern cores working in parallel to reach about 8 GB/s.
- Cerbios Xbox Bios V2.2.0 BETA Released (1.0 - 1.6)
- zstd
> The downside of lz4 is that it can’t be configured to run at higher & slower compression ratios.
lz4 has some level of configurability? https://github.com/lz4/lz4/blob/v1.9.4/lib/lz4frame.h#L194
There's also LZ4_HC.
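A minimal sketch of both knobs mentioned above, assuming the lz4frame.h and lz4hc.h headers from the linked repository: the frame API exposes a compressionLevel field in LZ4F_preferences_t, and LZ4_HC takes an explicit level argument. The sample input and buffer handling are illustrative only.

```c
/* Sketch: trading speed for ratio with the LZ4 frame API and the HC block API. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <lz4.h>
#include <lz4frame.h>
#include <lz4hc.h>

int main(void) {
    const char src[] = "hello hello hello hello hello hello hello hello";
    size_t srcSize = sizeof(src);

    /* Frame API: compressionLevel in LZ4F_preferences_t (0 = fast default,
       higher values switch to the slower HC match finder). */
    LZ4F_preferences_t prefs;
    memset(&prefs, 0, sizeof(prefs));
    prefs.compressionLevel = 9;

    size_t dstCap = LZ4F_compressFrameBound(srcSize, &prefs);
    char* dst = malloc(dstCap);
    size_t written = LZ4F_compressFrame(dst, dstCap, src, srcSize, &prefs);
    if (LZ4F_isError(written)) {
        fprintf(stderr, "frame compression failed: %s\n", LZ4F_getErrorName(written));
        free(dst);
        return 1;
    }
    printf("frame API, level 9: %zu -> %zu bytes\n", srcSize, written);
    free(dst);

    /* Block API: LZ4_compress_HC takes the level directly (up to LZ4HC_CLEVEL_MAX). */
    int hcCap = LZ4_compressBound((int)srcSize);
    char* hcDst = malloc((size_t)hcCap);
    int hcWritten = LZ4_compress_HC(src, hcDst, (int)srcSize, hcCap, LZ4HC_CLEVEL_MAX);
    printf("LZ4_HC, max level: %d -> %d bytes\n", (int)srcSize, hcWritten);
    free(hcDst);
    return 0;
}
```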
- I'm new to this
Unlock your bootloader via Download mode, then obtain the stock firmware for your current region from https://samfw.com (Download mode shows the CARRIER_CODE). Extract the boot image from the AP file with 7-Zip, unpack it from LZ4 with https://github.com/lz4/lz4/releases (drag and drop), and patch it with Magisk https://github.com/topjohnwu/magisk/releases/latest. Grab the patched image, name it "boot.img", pack it into a .tar with 7-Zip, and flash it to AP with Odin https://odindownload.com
- An efficient image format for SDL
After some investigation and experimentation, I found that it was the PNG compression (well, decompression, I should say) that took a while. So I ran some experiments with the LZ4 compression library, which is focused on decompression speed, and it turned out to be an excellent solution!
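Not the author's loader, just a minimal sketch of the pattern described: decompressing an LZ4 block straight into a preallocated pixel buffer with LZ4_decompress_safe. The compressed pointer, its size, and the expected pixel byte count are assumed to come from the file's own header.

```c
/* Sketch: decompress one LZ4 block directly into the destination pixel buffer. */
#include <stdlib.h>
#include <lz4.h>

unsigned char* load_pixels(const char* compressed, int compressedSize, int pixelBytes) {
    unsigned char* pixels = malloc((size_t)pixelBytes);
    if (!pixels) return NULL;

    /* LZ4_decompress_safe returns the number of bytes written,
       or a negative value on malformed input. */
    int written = LZ4_decompress_safe(compressed, (char*)pixels,
                                      compressedSize, pixelBytes);
    if (written != pixelBytes) {   /* truncated or corrupt data */
        free(pixels);
        return NULL;
    }
    return pixels;
}
```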
- Bzip3 – a better and stronger spiritual successor to bzip2
If anyone just cares about speed rather than compression ratio, I'd recommend lz4 [1]. I only recently started using it. Its speed is almost comparable to memcpy.
- I just took a random screenshot and made it look prettier. [ I don't know if this counts as fanart ]
E: Realtime compression (a good compression library like Zstandard can make a game less than half the size while taking a tiny amount of CPU power when loading assets. I think that's a pretty worthwhile trade.) (ZSTD github) (LZ4 github)
- What's the best way to compress strings?
lz4 for maximum decompression speed, for data that is often read but rarely written
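As a concrete illustration of that recommendation, here is a minimal round trip with the default (fastest) LZ4 block API; the sample string is made up, and the original length has to be stored alongside the block so it can be decompressed later.

```c
/* Sketch: compress a string once, decompress it many times cheaply. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <lz4.h>

int main(void) {
    const char* text = "the quick brown fox jumps over the lazy dog, again and again and again";
    int srcSize = (int)strlen(text) + 1;           /* keep the terminator */

    int cap = LZ4_compressBound(srcSize);          /* worst-case output size */
    char* packed = malloc((size_t)cap);
    int packedSize = LZ4_compress_default(text, packed, srcSize, cap);
    if (packedSize <= 0) { free(packed); return 1; }

    /* Decompression needs the original size (srcSize) to be stored somewhere. */
    char* restored = malloc((size_t)srcSize);
    int restoredSize = LZ4_decompress_safe(packed, restored, packedSize, srcSize);
    printf("%d -> %d -> %d bytes\n", srcSize, packedSize, restoredSize);

    free(packed);
    free(restored);
    return 0;
}
```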
- How to become a tools/graphics/engine programmer
Getting lost in material models is tempting, but at this point you are overdue for working on your own asset pipeline. glTF is great, but you should learn how to do it yourself. The hardest part will be reading source asset files. The FBX SDK is painful, and Assimp isn't great either. Writing your own exporter from Maya or Blender to your own intermediate text format would be good if you are up for it. From whatever source, make your own archive format and binary formats for meshes, animations, textures and scenes. Use https://github.com/lz4/lz4 for compression. You should be able to decompress a list of assets into a big linear array and use them right there with just a bit of pointer fix-up. Minimize the amount of memory you have to touch from start to finish. Data that is going to the GPU (textures, vertex/index buffers) should decompress straight into mapped buffers for fast uploads.
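A rough sketch of that load path, assuming a hypothetical archive layout (an AssetEntry table of contents with offsets into one uncompressed arena); the real format and the per-asset pointer fix-up are up to you.

```c
/* Sketch: decompress every asset into one linear arena and use it in place. */
#include <stdlib.h>
#include <lz4.h>

typedef struct {
    size_t arenaOffset;      /* where the asset lives in the arena */
    int    rawSize;          /* uncompressed size */
    int    packedSize;       /* LZ4 block size */
    size_t packedOffset;     /* offset of the block in the archive blob */
} AssetEntry;

/* Returns the arena, or NULL on corrupt data. `blob` is the memory-mapped
 * archive; `entries`/`count` come from its table of contents. */
char* load_assets(const char* blob, const AssetEntry* entries, size_t count,
                  size_t arenaSize) {
    char* arena = malloc(arenaSize);
    if (!arena) return NULL;
    for (size_t i = 0; i < count; i++) {
        const AssetEntry* e = &entries[i];
        int got = LZ4_decompress_safe(blob + e->packedOffset,
                                      arena + e->arenaOffset,
                                      e->packedSize, e->rawSize);
        if (got != e->rawSize) { free(arena); return NULL; }
        /* Pointer fix-up (patching stored offsets into real pointers)
           would happen here, per asset type. */
    }
    return arena;
}
```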
- LZ4, an Extremely Fast Compression Algorithm
I'm not a fan of the stacked bar charts; I prefer the table of data under "Benchmarks" on the GitHub source page: https://github.com/lz4/lz4
It makes it very clear where LZ4 sits in comparisons of compression speed, decompression speed, and compression ratio.
zstd
- How in the world should we unpack archive.org zst files on Windows?
If you want this functionality in zstd itself, check this out: https://github.com/facebook/zstd/pull/2349
- ZSTD 1.5.5 is released with a corruption fix found at Google
- Float Compression 3: Filters
Interesting to compare with observations from using ClickHouse [1][2] in practice for time series:
1. Reordering to SOA helps a lot - this is the whole point of column-oriented databases.
2. Specialized codecs like Gorilla[3], DoubleDelta[4], and FPC[5] lose to simply using ZSTD[6] compression in most cases, both in compression ratio and in performance.
3. Specialized time-series DBMS like InfluxDB or TimescaleDB lose to general-purpose relational OLAP DBMS like ClickHouse [7][8][9].
[1] https://clickhouse.com/blog/optimize-clickhouse-codecs-compr...
[2] https://github.com/ClickHouse/ClickHouse
[3] https://clickhouse.com/docs/en/sql-reference/statements/crea...
[4] https://clickhouse.com/docs/en/sql-reference/statements/crea...
[5] https://clickhouse.com/docs/en/sql-reference/statements/crea...
[6] https://github.com/facebook/zstd/
[7] https://arxiv.org/pdf/2204.09795.pdf "SciTS: A Benchmark for Time-Series Databases in Scientific Experiments and Industrial Internet of Things" (2022)
[8] https://gitlab.com/gitlab-org/incubation-engineering/apm/apm... https://gitlab.com/gitlab-org/incubation-engineering/apm/apm...
[9] https://www.sciencedirect.com/science/article/pii/S187705091...
- We're wasting money by only supporting gzip for raw DNA files
zstd has a long range mode, which lets it find redundancies a gigabyte away. Try --long and --long=31 for very long range mode.
zstd has delta / patch mode, which creates a file that stores the "patch" to create a new file from an old (reference) file. See https://github.com/facebook/zstd/wiki/Zstandard-as-a-patchin...
See the man page: https://github.com/facebook/zstd/blob/dev/programs/zstd.1.md
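For reference, a minimal sketch of what --long does in library form, assuming zstd's stable advanced API (v1.4+): enable long-distance matching and widen the match window so redundancies far apart can be found.

```c
/* Sketch: long-range compression with zstd's advanced compression context. */
#include <stdio.h>
#include <stdlib.h>
#include <zstd.h>

size_t compress_long_range(const void* src, size_t srcSize,
                           void* dst, size_t dstCapacity) {
    ZSTD_CCtx* cctx = ZSTD_createCCtx();
    ZSTD_CCtx_setParameter(cctx, ZSTD_c_compressionLevel, 19);
    ZSTD_CCtx_setParameter(cctx, ZSTD_c_enableLongDistanceMatching, 1);
    /* windowLog 27 = 128 MiB window, the CLI's plain --long; --long=31 maps to
       windowLog 31. Windows above 27 require raising ZSTD_d_windowLogMax when
       decompressing. */
    ZSTD_CCtx_setParameter(cctx, ZSTD_c_windowLog, 27);

    size_t written = ZSTD_compress2(cctx, dst, dstCapacity, src, srcSize);
    ZSTD_freeCCtx(cctx);
    if (ZSTD_isError(written)) {
        fprintf(stderr, "compression failed: %s\n", ZSTD_getErrorName(written));
        return 0;
    }
    return written;
}
```

Size the destination with ZSTD_compressBound(srcSize); the CLI's --patch-from delta mode described in the wiki page above has no equivalent in this sketch.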
- Decompressing the ZST files on Windows tips
So I downloaded the Facebook tool https://github.com/facebook/zstd
- zstd
They have a nice table on that page: https://github.com/facebook/zstd#benchmarks
Looking at that table, I think LZ4 is the winner. The compression ratio is not far off, compression speed is slightly faster, decompression speed is significantly faster, the code is much simpler (so the compiled binary is smaller), and the project is unrelated to Facebook.
- The checklist: Monitoring for Economy
In some cases, zstd can offer up to a 30% reduction in compressed storage as compared to other compression mechanisms. Learn more about zstd here.
What are some alternatives?
Snappy - A fast compressor/decompressor
LZMA - (Unofficial) Git mirror of LZMA SDK releases
7-Zip-zstd - 7-Zip with support for Brotli, Fast-LZMA2, Lizard, LZ4, LZ5 and Zstandard
ZLib - A massively spiffy yet delicately unobtrusive compression library.
brotli - Brotli compression format
LZFSE - LZFSE compression library and command line tool
zlib - Cloudflare fork of zlib with massive performance improvements
haproxy - HAProxy Load Balancer's development branch (mirror of git.haproxy.org)