encoding VS go_serialization_benchmarks

Compare encoding vs go_serialization_benchmarks and see what their differences are.

encoding

Go package containing implementations of efficient encoding, decoding, and validation APIs. (by segmentio)

go_serialization_benchmarks

Benchmarks of Go serialization methods (by alecthomas)
                 encoding        go_serialization_benchmarks
Mentions         8               8
Stars            962             1,527
Growth           0.7%            -
Activity         3.6             4.4
Latest commit    5 months ago    8 days ago
Language         Go              Go
License          MIT License     -
The number of mentions indicates the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative measure of how actively a project is being developed; recent commits are weighted more heavily than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

encoding

Posts with mentions or reviews of encoding. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-07.
  • Handling high-traffic HTTP requests with JSON payloads
    5 projects | /r/golang | 7 Dec 2023
  • Rust vs. Go in 2023
    9 projects | news.ycombinator.com | 13 Aug 2023
    https://github.com/BurntSushi/rebar#summary-of-search-time-b...

    Further, Go refusing to have macros means that many libraries use reflection instead, which often makes those parts of the Go program perform no better than Python and in some cases worse. Rust can just generate all of that at compile time with macros, and optimize them with LLVM like any other code. Some Go libraries go to enormous lengths to reduce reflection overhead, but that's hard to justify for most things, and hard to maintain even once done. The legendary https://github.com/segmentio/encoding seems to be abandoned now and progress on Go JSON in general seems to have died with https://github.com/go-json-experiment/json .

    Many people claiming their projects are IO-bound are just assuming that's the case because most of the time is spent in their input reader. If they actually measured they'd see it's not even saturating a 100Mbps link, let alone 1-100Gbps, so by definition it is not IO-bound. Even if they didn't need more throughput than that, they still could have put those cycles to better use or at worst saved energy. Isn't that what people like to say about Go vs Python, that Go saves energy? Sure, but it still burns a lot more energy than it would if it had macros.

    Rust can use state-of-the-art memory allocators like mimalloc, while Go is still stuck on an old fork of tcmalloc, and not just tcmalloc in its original C, but transpiled to Go so it optimizes much less than LLVM would optimize it. (Many people benchmarking them forget to even try substituting allocators in Rust, so they're actually underestimating just how much faster Rust is.)

    Finally, even Go Generics have failed to improve performance, and in many cases can make it unimaginably worse through -- I kid you not -- global lock contention hidden behind innocent type assertion syntax: https://planetscale.com/blog/generics-can-make-your-go-code-...

    It's not even close. There are many reasons Go is a lot slower than Rust and many of them are likely to remain forever. Most of them have not seen meaningful progress in a decade or more. The GC has improved, which is great, but that's not even a factor on the Rust side.

  • Quickly checking that a string belongs to a small set
    7 projects | news.ycombinator.com | 30 Dec 2022
    We took a similar approach in our JSON decoder. We needed to support sets (JSON object keys) that aren't necessarily known until runtime, and strings that are up to 16 bytes in length.

    We got better performance with a linear scan and SIMD matching than with a hash table or a perfect hashing scheme.

    See https://github.com/segmentio/asm/pull/57 (AMD64) and https://github.com/segmentio/asm/pull/65 (ARM64). Here's how it's used in the JSON decoder: https://github.com/segmentio/encoding/pull/101
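    The idea can be illustrated without the assembly. Below is a minimal scalar sketch of the linear-scan approach (hypothetical names, not the segmentio/asm implementation): scan a small slice of candidate keys in order and return the index of the match. The linked PRs do the same comparison in 16-byte SIMD chunks, which is what beats hashing for small sets.

    ```go
    package main

    import "fmt"

    // keyIndex linearly scans a small set of candidate keys and returns
    // the index of the match, or -1. The SIMD versions in segmentio/asm
    // perform this comparison 16 bytes at a time; this is the scalar
    // equivalent of the same idea.
    func keyIndex(keys []string, s string) int {
    	for i, k := range keys {
    		if k == s {
    			return i
    		}
    	}
    	return -1
    }

    func main() {
    	keys := []string{"id", "name", "email", "created_at"}
    	fmt.Println(keyIndex(keys, "email")) // 2
    	fmt.Println(keyIndex(keys, "phone")) // -1
    }
    ```

    For sets this small, the branch-free sequential comparison stays in cache and avoids the hashing and probing overhead of a map lookup.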

  • 80x improvements in caching by moving from JSON to gob
    6 projects | /r/golang | 11 Apr 2022
    Binary formats work well for some cases, but JSON is often unavoidable since it is so widely used for APIs. However, you can make it faster in Go with https://github.com/segmentio/encoding.
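    The segmentio/encoding README describes its json package as API-compatible with the standard library, so switching is typically just an import change. A minimal sketch (the Event type is illustrative, not from the library):

    ```go
    package main

    import (
    	"fmt"

    	// Drop-in replacement for the standard library's encoding/json:
    	// Marshal/Unmarshal keep the same signatures, so only the
    	// import path changes.
    	"github.com/segmentio/encoding/json"
    )

    type Event struct {
    	Name  string `json:"name"`
    	Count int    `json:"count"`
    }

    func main() {
    	b, err := json.Marshal(Event{Name: "signup", Count: 3})
    	if err != nil {
    		panic(err)
    	}
    	fmt.Println(string(b))
    }
    ```

    Because the API mirrors encoding/json, existing struct tags and Marshaler/Unmarshaler implementations keep working unchanged.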
  • Speeding up Go's builtin JSON encoder up to 55% for large arrays of objects
    2 projects | news.ycombinator.com | 3 Mar 2022
    Would love to see results from incorporating https://github.com/segmentio/encoding/tree/master/json!
  • Fastest JSON parser for large (~888kB) API response?
    2 projects | /r/golang | 7 Jan 2022
    Try this one out https://github.com/segmentio/encoding it's always worked well for me
  • 📖 Go Fiber by Examples: Delving into built-in functions
    4 projects | dev.to | 24 Aug 2021
    Converts any interface or string to JSON using the segmentio/encoding package. Also, the JSON method sets the content header to application/json.
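    Assuming Fiber v2's JSONEncoder/JSONDecoder config fields, wiring segmentio/encoding into Fiber looks roughly like this sketch (a config fragment, not a complete application):

    ```go
    package main

    import (
    	"github.com/gofiber/fiber/v2"
    	"github.com/segmentio/encoding/json"
    )

    func main() {
    	// Swap Fiber's JSON implementation via Config; ctx.JSON then
    	// serializes with segmentio/encoding and still sets the
    	// Content-Type header to application/json.
    	app := fiber.New(fiber.Config{
    		JSONEncoder: json.Marshal,
    		JSONDecoder: json.Unmarshal,
    	})

    	app.Get("/ping", func(c *fiber.Ctx) error {
    		return c.JSON(fiber.Map{"ok": true})
    	})

    	app.Listen(":3000")
    }
    ```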
  • In-memory caching solutions
    4 projects | /r/golang | 1 Feb 2021
    If you're interested in super fast & easy JSON for that cache, give this a try; I've used it in prod & never had a problem.

go_serialization_benchmarks

Posts with mentions or reviews of go_serialization_benchmarks. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-04.
  • Rob Pike: Gobs of data (2011)
    10 projects | news.ycombinator.com | 4 Dec 2023
    Someone made a benchmark of serialization libraries in Go [1], and I was surprised to see that gobs is one of the slowest, especially for decoding. I suspect part of the reason is that the API doesn't allow reusing decoders [2]. From my explorations it seems that JSON [3], MessagePack [4], and CBOR [5] are all better alternatives.

    By the way, in Go there are something like a million JSON encoders, because much of the standard library seems to be written for ease of use rather than maximum performance. Perhaps that is the right balance for certain things (e.g. the http library, see [6]).

    There are also a bunch of libraries that allow you to modify a JSON document "in place", without fully deserializing it into structs (e.g. GJSON/SJSON [7] [8]). This sounds very convenient, and more efficient than fully de/serializing when you only need to change the data a little.

    --

    1: https://github.com/alecthomas/go_serialization_benchmarks

    2: https://github.com/golang/go/issues/29766#issuecomment-45492...

    --

    3: https://github.com/goccy/go-json

    4: https://github.com/vmihailenco/msgpack

    5: https://github.com/fxamacker/cbor

    --

    6: https://github.com/valyala/fasthttp#faq

    --

    7: https://github.com/tidwall/gjson

    8: https://github.com/tidwall/sjson
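    The decoder-reuse point above is easy to see with the standard library alone. A gob stream writes the type description once per Encoder, and a Decoder must read from the start of such a stream, so per-message encode/decode pays that setup cost every time; this sketch shows the stream-oriented usage gob is designed for:

    ```go
    package main

    import (
    	"bytes"
    	"encoding/gob"
    	"fmt"
    )

    type Point struct{ X, Y int }

    func main() {
    	var buf bytes.Buffer

    	// One Encoder per stream: the type description for Point is
    	// transmitted once, then each Encode appends only the values.
    	enc := gob.NewEncoder(&buf)
    	for i := 0; i < 3; i++ {
    		if err := enc.Encode(Point{X: i, Y: i * 2}); err != nil {
    			panic(err)
    		}
    	}

    	// Likewise one Decoder per stream. A fresh Decoder cannot pick
    	// up mid-stream, which is why decoding independent byte slices
    	// one at a time is expensive with gob.
    	dec := gob.NewDecoder(&buf)
    	for i := 0; i < 3; i++ {
    		var p Point
    		if err := dec.Decode(&p); err != nil {
    			panic(err)
    		}
    		fmt.Println(p.X, p.Y)
    	}
    }
    ```

    Formats like MessagePack and CBOR carry no per-stream type preamble, which is one reason they fare better in the linked benchmarks for one-shot decoding.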

  • Introducing Tempo: low latency, cross-platform, end-to-end typesafe APIs
    12 projects | /r/programming | 2 May 2023
    The bebop definition specifies fixed-width types inside a struct. The format of structs cannot be changed, but there are efficiency gains by omitting all of the indices and header data. It's useless as the root message, but it's small and fast for a benchmark.
  • mus-go - the fastest Golang serializer today
    4 projects | /r/golang | 2 May 2023
    Hey everyone! Let me introduce you to mus-go - the fastest Golang serializer today. If you look at the benchmarks (https://github.com/alecthomas/go_serialization_benchmarks), you can see that it can be almost twice as fast as its closest "competitor".
  • What is the fastest way to encode the arbitrary struct into bytes?
    4 projects | /r/golang | 2 Mar 2023
    This might be of interest: https://github.com/alecthomas/go_serialization_benchmarks
  • 80x improvements in caching by moving from JSON to gob
    6 projects | /r/golang | 11 Apr 2022
  • gRPC Is Easy to Misconfigure
    1 project | news.ycombinator.com | 17 Mar 2021
    The protobuf vs msgpack benchmarks are not too bad. Msgpack performs very decently.

    https://github.com/alecthomas/go_serialization_benchmarks

  • Bebop encoding in Go
    1 project | /r/golang | 23 Dec 2020
    Maybe submit a PR against https://github.com/alecthomas/go_serialization_benchmarks? That covers a ton of serialization formats already, so adding your library would be cool and avoid wheel reinvention.

What are some alternatives?

When comparing encoding and go_serialization_benchmarks you can also consider the following projects:

sonic - A blazingly fast JSON serializing & deserializing library

bebop - bebop wire format in Go

groupcache - Clone of golang/groupcache with TTL and Item Removal support

bebop - 🎷No ceremony, just code. Blazing fast, typesafe binary serialization.

parquet-go - Go library to read/write Parquet files

msgpack - MessagePack is an extremely efficient object serialization library. It's like JSON, but very fast and small.

base64 - Faster base64 encoding for Go

grpc-web - gRPC for Web Clients

buntdb - BuntDB is an embeddable, in-memory key/value database for Go with custom indexing and geospatial support

msgp - A Go code generator for MessagePack / msgpack.org[Go]

hilbert - Go package for mapping values to and from space-filling curves, such as Hilbert and Peano curves.

go-codec-bench - Benchmark of go binary and text encodings