brotli vs haproxy

| | brotli | haproxy |
|---|---|---|
| Mentions | 32 | 18 |
| Stars | 13,710 | 5,139 |
| Growth | 0.7% | 2.0% |
| Activity | 7.4 | 9.9 |
| Latest commit | 4 days ago | 9 days ago |
| Language | TypeScript | C |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
brotli
-
A Career Ending Mistake
Projects like Brotli aren't built to maximize personal profit; they're driven by passion and a genuine love for software engineering.
It's clear that the industry is shifting from being geeky and nerdy to being more business and management focused.
[0] https://github.com/google/brotli
-
Building an Efficient Text Compression Algorithm Inspired by Silicon Valley’s Pied Piper
Brotli is a compression algorithm developed by Google, particularly effective for text and web compression. It uses a combination of LZ77 (Lempel-Ziv 77), Huffman coding, and 2nd order context modeling. In comparison to traditional algorithms like Gzip, Brotli can achieve smaller compressed sizes, especially for HTML and text-heavy content. This makes it a good candidate for our Pied Piper-inspired text compression implementation.
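To get a rough feel for that Gzip-versus-Brotli gap on text, here is a minimal sketch, assuming the third-party Python Brotli bindings (pip install Brotli); the HTML sample is made up for illustration:

```python
# Compare Brotli and gzip output sizes on text-heavy input.
import gzip

import brotli

html = ("<!DOCTYPE html><html><head><title>Example</title></head>"
        "<body><p>Hello, compression!</p></body></html>" * 200).encode("utf-8")

# quality=11 is Brotli's maximum; MODE_TEXT hints that the input is UTF-8 text.
br_size = len(brotli.compress(html, mode=brotli.MODE_TEXT, quality=11))
gz_size = len(gzip.compress(html, compresslevel=9))

print(f"original: {len(html)} bytes")
print(f"gzip -9 : {gz_size} bytes")
print(f"brotli  : {br_size} bytes")
```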
-
Compression Dictionary Transport
The one example I can think of with a pre-seeded dictionary (for web, no less) is Brotli.
https://datatracker.ietf.org/doc/html/rfc7932#appendix-A
You can more or less see what it looks like (per an older commit): https://github.com/google/brotli/blob/5692e422da6af1e991f918...
Certainly it performs better than gzip by itself.
Some historical discussion: https://news.ycombinator.com/item?id=19678985
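A minimal sketch of that pre-seeded dictionary paying off on a tiny web-style payload, again assuming the Python Brotli bindings; gzip starts with no dictionary at all, and exact numbers will vary:

```python
# On a small web-ish snippet, Brotli's built-in static dictionary
# (RFC 7932, Appendix A) and lower framing overhead usually win over gzip.
import gzip

import brotli

snippet = b'<!DOCTYPE html><html><head><meta charset="utf-8"></head></html>'

print("input :", len(snippet), "bytes")
print("gzip  :", len(gzip.compress(snippet, compresslevel=9)), "bytes")
print("brotli:", len(brotli.compress(snippet, quality=11)), "bytes")
```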
-
WebP: The WebPage Compression Format
I believe the compression dictionary refers to [1], which is used to quickly match dictionary-compressible byte sequences. I don't know where 170 KB comes from, but that hash alone does take 128 KiB and might be significant if it can't be easily recomputed. But I'm sure that it can be quickly computed at load time if the binary size is that important.
[1] https://github.com/google/brotli/blob/master/c/enc/dictionar...
-
Current problems and mistakes of web scraping in Python and tricks to solve them!
The answer lies in the Accept-Encoding header. In the example above, I just copied it from my browser, so it lists all the compression methods my browser supports: "gzip, deflate, br, zstd". The Wayfair backend supports compression with "br", which is Brotli, and uses it as the most efficient method.
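A minimal sketch of that negotiation, using only urllib plus the Python Brotli bindings; the URL is a placeholder and may not actually serve br, so the Content-Encoding is checked before decoding (zstd is left out because this sketch cannot decode it):

```python
# Ask the server for Brotli explicitly and decode the body by hand.
import gzip
import urllib.request

import brotli

req = urllib.request.Request(
    "https://example.com/",
    headers={"Accept-Encoding": "gzip, deflate, br"},
)
with urllib.request.urlopen(req) as resp:
    encoding = resp.headers.get("Content-Encoding", "")
    body = resp.read()

if encoding == "br":
    body = brotli.decompress(body)
elif encoding == "gzip":
    body = gzip.decompress(body)
# (deflate handling omitted in this sketch)

print("Content-Encoding:", encoding or "(identity)")
print("decoded length  :", len(body))
```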
-
LZW and GIF explained
...though with the slightly unexpected side effect (for Brotli, at least) that your executable may end up containing roughly 200KB (from memory) of very unexpected plain-text strings, which might (& has[0]) lead to questions from software end-users asking why your software contains "random"[1] text (including potentially "culturally sensitive" words/phrases related to religion such as "Holy Roman Emperor", "Muslims", "dollars", "emacs"[2] or similar).
(I encountered this aspect while investigating potential size optimization opportunities for the Godot game engine's web/WASM builds--though presumably the Brotli dictionary compresses well if the transfer encoding is... Brotli. :D )
[0] "This needs to be reviewed immediately #876": https://github.com/google/brotli/issues/876
[1] Which, regardless of meaning, certainly bears similarities to the type of "unexpected weird text" commonly/normally associated with spam, malware, LLMs and other entities of ill repute.
[2] The final example may not actually be factual. :)
-
Node.js vs Angular: Navigating the Modern Web Development Landscape
Using tools like Brotli, you can shrink your payloads and boost your application's load time. The ngUpgrade library lets you mix AngularJS and Angular components in hybrid applications to improve runtime performance, and combining this with techniques like ahead-of-time (AOT) compilation aids faster browser rendering.
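As a concrete illustration of the Brotli part of that claim, here is a minimal sketch that pre-compresses a build directory into .br files a web server can serve statically; the dist/ path, the extension list, and quality 11 are assumptions, and the Python Brotli bindings are required:

```python
# Pre-compress static build output so the server can serve ready-made
# .br files instead of compressing on the fly.
from pathlib import Path

import brotli

DIST = Path("dist")                                  # assumed build folder
EXTENSIONS = {".js", ".css", ".html", ".svg", ".json"}

for path in DIST.rglob("*"):
    if path.is_file() and path.suffix in EXTENSIONS:
        data = path.read_bytes()
        compressed = brotli.compress(data, quality=11)
        path.with_name(path.name + ".br").write_bytes(compressed)
        print(f"{path}: {len(data)} -> {len(compressed)} bytes")
```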
-
Jpegli: A New JPEG Coding Library
JPEGLI = A small JPEG
The suffix -li is used in Swiss German dialects. It forms a diminutive of the root word, by adding -li to the end of the root word to convey the smallness of the object and to convey a sense of intimacy or endearment.
This obviously comes out of Google Zürich.
Other notable Google projects using Swiss German:
https://github.com/google/gipfeli high-speed compression
Gipfeli = Croissant
https://github.com/google/guetzli perceptual JPEG encoder
Guetzli = Cookie
https://github.com/weggli-rs/weggli semantic search tool
Weggli = Bread roll
https://github.com/google/brotli lossless compression
Brötli = Small bread
-
Compression efficiency with shared dictionaries in Chrome
The brotli repo on github has a dictionary generator: https://github.com/google/brotli/blob/master/research/dictio...
I have a hosted version of it on https://use-as-dictionary.com/ to make it easier to experiment with.
-
The Full-Stack development experience
An additional element that we can finally remove from our stack is the minification of JavaScript and CSS files. Thanks to algorithms like Brotli (with a very Swiss flavour), we no longer need to minify and pre-compress our files before distributing them; Cloudflare, Nginx, or Apache will take care of everything for us.
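That claim is easy to measure rather than assume. A minimal sketch, assuming the Python Brotli bindings and a made-up JavaScript snippet, comparing Brotli's output on unminified versus minified source:

```python
# Compare compressed sizes of unminified vs. hand-minified JavaScript.
import brotli

unminified = b"""
function addNumbers(firstNumber, secondNumber) {
    // Add two numbers and return the result
    return firstNumber + secondNumber;
}
console.log(addNumbers(2, 3));
"""

minified = b"function a(b,c){return b+c}console.log(a(2,3));"

for label, src in (("unminified", unminified), ("minified", minified)):
    size = len(brotli.compress(src, quality=11))
    print(f"{label}: {len(src)} -> {size} bytes compressed")
```

In practice compression narrows the gap but rarely closes it completely, so it is worth measuring on your own bundles.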
haproxy
- HAProxy ECH (Encrypted client hello) support #1924 (2022)
-
What We Learned from a Year of Building with LLMs
I totally agree, that's what I had to do with my patchbot that evaluates haproxy patches to be backported ( https://github.com/haproxy/haproxy/tree/master/dev/patchbot/ ). Originally it would just provide a verdict and justify it, and it worked extremely poorly, often with a justification that directly contradicted the verdict. I swapped that for asking for the analysis first and the final verdict afterwards, and now the success rate is totally amazing (particularly with Mistral, which remains unbeatable at this task by following instructions extremely well).
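A rough sketch of that "analysis first, verdict last" prompt pattern; the wording, verdict tokens, and parsing below are hypothetical, not the actual patchbot prompts:

```python
# Ask the model to reason before concluding, so the verdict is conditioned
# on the analysis rather than the other way around.
def build_review_prompt(patch: str) -> str:
    return (
        "You review HAProxy patches that are candidates for backporting.\n\n"
        f"Patch:\n{patch}\n\n"
        "First write a short analysis of what the patch changes and how "
        "risky it would be to backport.\n"
        "Then, on the final line, output exactly one verdict: "
        "BACKPORT or DO-NOT-BACKPORT."
    )


def parse_verdict(reply: str) -> str:
    # Read the verdict from the last non-empty line, after the analysis.
    last = [line for line in reply.splitlines() if line.strip()][-1].upper()
    return "DO-NOT-BACKPORT" if "DO-NOT-BACKPORT" in last else "BACKPORT"
```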
-
HAProxy is not affected by the HTTP/2 Rapid Reset Attack (CVE-2023-44487)
I wanted to try it out just now but hit a roadblock immediately - it cannot automatically obtain and maintain TLS certificates. You have to use an external client (e.g. acme.sh), set up a cron to check/renew them, and poke HAProxy to reload them if necessary. I'm way past doing this in 2023.
https://www.haproxy.com/blog/haproxy-and-let-s-encrypt
https://github.com/haproxy/haproxy/issues/1864
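A minimal sketch of the external renew-and-reload loop that comment describes, assuming acme.sh, a systemd-managed HAProxy, and example.com as a placeholder domain:

```python
# Check how long the served certificate remains valid, renew with acme.sh
# when it gets close, then reload HAProxy so it picks up the new PEM.
import socket
import ssl
import subprocess
import time

DOMAIN = "example.com"        # placeholder
RENEW_BEFORE_DAYS = 30

ctx = ssl.create_default_context()
with socket.create_connection((DOMAIN, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=DOMAIN) as tls:
        not_after = tls.getpeercert()["notAfter"]

days_left = int((ssl.cert_time_to_seconds(not_after) - time.time()) // 86400)
print(f"{DOMAIN}: certificate valid for {days_left} more days")

if days_left < RENEW_BEFORE_DAYS:
    # acme.sh rewrites the PEM; HAProxy only sees it after a reload
    # (acme.sh also ships a dedicated haproxy deploy hook).
    subprocess.run(["acme.sh", "--renew", "-d", DOMAIN], check=True)
    subprocess.run(["systemctl", "reload", "haproxy"], check=True)
```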
-
Why HAProxy is not built with PROMEX by default (Linux / BSD)
For context I think this might be useful: https://github.com/haproxy/haproxy/blob/master/addons/promex/README
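For illustration, a minimal sketch that reads the Prometheus metrics a PROMEX-enabled frontend exposes; the port, the /metrics path, and the metric name are assumptions that depend on how your frontend is configured (e.g. "http-request use-service prometheus-exporter if { path /metrics }"):

```python
# Fetch the PROMEX metrics page and print one family of gauges.
import urllib.request

METRICS_URL = "http://127.0.0.1:8404/metrics"  # assumed bind address

with urllib.request.urlopen(METRICS_URL) as resp:
    text = resp.read().decode("utf-8")

for line in text.splitlines():
    if line.startswith("haproxy_frontend_current_sessions"):
        print(line)
```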
-
minexmr2.com updated to p2pool v3.1, monerod v0.18.2.0, and ready for Mar 18 p2pool (not monero) hardfork
I turn on 1 relatively cheap cloud server to process DNS, https and stratum connections and route them via haproxy to one of N miner servers described above.
-
HAProxy Security Update (CVE-2023-25725) - HTTP content smuggling attack
Full technical writeup here: https://github.com/haproxy/haproxy/commit/a8598a2eb11b6c989e81f0dbf10be361782e8d32
- Request smuggling in HAProxy via empty header name
- Enormous session rate
- Update to haproxy 2.4.18 breaks WebDAV
-
HAProxy 2.7
Given the recent discussions about memory-safe languages, it's somewhat surprising that HAProxy is still written in C [0].
[0]: https://github.com/haproxy/haproxy
What are some alternatives?
Snappy - A fast compressor/decompressor
zstd - Zstandard - Fast real-time compression algorithm
LZ4 - Extremely Fast Compression algorithm
Jool - SIIT and NAT64 for Linux
ClickHouse - ClickHouse® is a real-time analytics database management system
LZMA - (Unofficial) Git mirror of LZMA SDK releases
3proxy - 3proxy - tiny free proxy server
ZLib - A massively spiffy yet delicately unobtrusive compression library.
Caddy - Fast and extensible multi-platform HTTP/1-2-3 web server with automatic HTTPS
zlib-ng - zlib replacement with optimizations for "next generation" systems.
traefik - The Cloud Native Application Proxy