gogen-avro vs sarama

| | gogen-avro | sarama |
|---|---|---|
| Mentions | 2 | 20 |
| Stars | 360 | 10,115 |
| Growth | - | - |
| Activity | 4.2 | 8.6 |
| Last Commit | 3 months ago | 10 months ago |
| Language | Go | Go |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
gogen-avro
-
NoProto: Flexible, Fast and Compact Serialization with RPC
This seems pretty confused. The "compiled vs dynamic" distinction is a property of the implementation, not of the protocol.
For example, you can certainly compile Avro into Go source files [0]. You can even compile Avro schemas loaded _at runtime_ into Python bytecode, since Python is interpreted [1]. This even works if you have the _wrong schema document_ for the message (you'll just get the subset of fields which are accurately described), because of Avro's schema compatibility rules.
Likewise, you can deserialize arbitrary protobuf messages during runtime without a compilation step, if you have a description for the message schema. The Python protobuf library has had a "ParseMessage" API forever, and protoreflect [2] exists for Go. (In case it's not obvious, I mostly work in Python and Go but I am completely certain analogues exist in other major languages).
There is a very big and important difference between a protocol and the implementation of a protocol. I think this README's author is not clear on that difference, which shows up in other claims ("Deserialization is incrimental", for example) too.
---
[0] https://github.com/actgardner/gogen-avro
[1] https://github.com/spenczar/avroc
[2] https://pkg.go.dev/google.golang.org/protobuf/reflect/protor...
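The point that the wire format is a property of the protocol, not the implementation, is easy to see at the byte level: Avro's binary encoding is just a fixed byte layout that any implementation, compiled or interpreted, reads the same way. As a minimal sketch, here is Avro's `long` encoding (zig-zag mapping followed by a little-endian base-128 varint, per the Avro specification) in plain Go with no codegen involved:

```go
package main

import "fmt"

// encodeLong writes an int64 in Avro's binary encoding: a zig-zag
// mapping (so small magnitudes get small codes) followed by a
// little-endian base-128 varint.
func encodeLong(n int64) []byte {
	u := uint64((n << 1) ^ (n >> 63)) // zig-zag transform
	var out []byte
	for u >= 0x80 {
		out = append(out, byte(u&0x7f)|0x80) // low 7 bits + continuation bit
		u >>= 7
	}
	return append(out, byte(u))
}

// decodeLong reverses encodeLong, returning the value and the number
// of bytes consumed.
func decodeLong(b []byte) (int64, int) {
	var u uint64
	var shift uint
	for i, c := range b {
		u |= uint64(c&0x7f) << shift
		if c&0x80 == 0 {
			return int64(u>>1) ^ -int64(u&1), i + 1 // undo zig-zag
		}
		shift += 7
	}
	panic("truncated varint")
}

func main() {
	for _, n := range []int64{0, -1, 1, 64, -1000000} {
		enc := encodeLong(n)
		dec, _ := decodeLong(enc)
		fmt.Printf("%d -> % x -> %d\n", n, enc, dec)
	}
}
```

Whether a library precompiles a decoder for a whole schema or walks the schema interpretively, it ends up consuming exactly these bytes.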
-
Feature complete Kafka client written in Go
At my company, we ended up writing a client on top of the Confluent client (which hooks into the librdkafka C library). It uses the gogen-avro library to provide type-safe codegen for Avro schemas, which handles the serde process within the client. The API ended up looking basically like the standard Java API. As for feature parity, librdkafka seems to provide most of what we've needed on the Kafka producer/consumer config side of things, while the Schema Registry implementation is limited in scope anyway and can easily be included in a basic library.
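The serde glue in a setup like this is small because the Schema Registry wire framing is just a zero magic byte, a 4-byte big-endian schema ID, and the Avro body. A minimal sketch, assuming a generated type exposing a `Serialize(io.Writer) error` method (the shape gogen-avro emits; the interface and helper names here are ours, and `stubRecord` stands in for a real generated type):

```go
package main

import (
	"bytes"
	"encoding/binary"
	"errors"
	"fmt"
	"io"
)

// avroRecord matches the Serialize method that gogen-avro generated
// types expose; the interface name itself is hypothetical.
type avroRecord interface {
	Serialize(w io.Writer) error
}

// wrap prepends the Confluent Schema Registry framing: a 0x00 magic
// byte, the 4-byte big-endian schema ID, then the Avro-encoded body.
func wrap(schemaID uint32, rec avroRecord) ([]byte, error) {
	var buf bytes.Buffer
	buf.WriteByte(0x00)
	binary.Write(&buf, binary.BigEndian, schemaID)
	if err := rec.Serialize(&buf); err != nil {
		return nil, err
	}
	return buf.Bytes(), nil
}

// unwrap splits the framing back apart on the consumer side.
func unwrap(msg []byte) (schemaID uint32, body []byte, err error) {
	if len(msg) < 5 || msg[0] != 0x00 {
		return 0, nil, errors.New("not a Schema Registry framed message")
	}
	return binary.BigEndian.Uint32(msg[1:5]), msg[5:], nil
}

// stubRecord stands in for a gogen-avro generated type in this sketch.
type stubRecord struct{ payload []byte }

func (s stubRecord) Serialize(w io.Writer) error {
	_, err := w.Write(s.payload)
	return err
}

func main() {
	msg, _ := wrap(42, stubRecord{payload: []byte{0x02}})
	id, body, _ := unwrap(msg)
	fmt.Println(id, body)
}
```

With the framing isolated like this, swapping the underlying transport (librdkafka via confluent-kafka-go, sarama, franz-go) doesn't touch the serde path.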
sarama
-
Ingesting Data into OpenSearch using Apache Kafka and Go
Note: Since there are multiple Go clients for Kafka (including Sarama), please make sure to consult their client documentation to confirm whether they support IAM authentication.
-
book about golang and kafka
You might want to gradually replace that one with https://github.com/twmb/franz-go, because Shopify is looking to find a new owner for Sarama and, until or unless they do, it seems to be falling behind on maintenance: https://github.com/Shopify/sarama/issues/2461. For example, they still haven't addressed this breaking change: https://github.com/Shopify/sarama/issues/2358. franz-go has worked well so far in Benthos (https://github.com/benthosdev/benthos/tree/main/internal/impl/kafka) and will likely end up as the only implementation once the Sarama-based one is deprecated.
-
Klient - a native, statically-compiled, command line client for Kafka
I've used mainly sarama wrapped with a bit of bespoke helper libraries. Never really looked into others, just grabbed one that was actively maintained and went for it.
-
Golang bad design reference
Well, as someone who reviews a lot of code, I don't like seeing hundreds of little files when a handful of logically grouped files would do. For example, this popular Go project: https://github.com/Shopify/sarama is currently 256 small .go files, largely following a one-class-per-file rule.
-
Concurrency in Go is hard
The first example is something we ran into while working on a project. Up until recently, the sarama library (Go library for Apache Kafka) contained the following piece of code (at sarama/version.go):
-
AWS MSK with go sarama
I'm using the Go sarama library to connect to the cluster, using this basic example.
-
Benthos - Fancy stream processing made operationally mundane
If you find the kafka input slow, try kafka_franz. It might be a bit faster, since it’s based on https://github.com/twmb/franz-go. The kafka one is based on https://github.com/Shopify/sarama. You can also write a custom input based on https://github.com/confluentinc/confluent-kafka-go, but this library relies on CGo, which can be annoying.
- Sarama - Go library for Apache Kafka.
- Understanding Kafka with Factorio
-
Is segmentio/kafka-go production ready ?
There are a few factors that are stopping me from using kafka-go over Shopify's sarama.
What are some alternatives?
kowl - Redpanda Console is a developer-friendly UI for managing your Kafka/Redpanda workloads. Console gives you a simple, interactive approach for gaining visibility into your topics, masking data, managing consumer groups, and exploring real-time data with time-travel debugging. [Moved to: https://github.com/redpanda-data/console]
Confluent Kafka Golang Client - Confluent's Apache Kafka Golang client
avroc - Python library for compiling Avro schemas into executable encoders/decoders
kafka-go - Kafka library in Go
NoProto - Flexible, Fast & Compact Serialization with RPC
franz-go - franz-go contains a feature complete, pure Go library for interacting with Kafka from 0.8.0 through 3.6+. Producing, consuming, transacting, administrating, etc.
librdkafka - The Apache Kafka C/C++ library
gorush - A push notification server written in Go (Golang).
Mercure - 🪽 An open, easy, fast, reliable and battery-efficient solution for real-time communications
gopush-cluster - Golang push server cluster
go-events - :mega: Pure nodejs EventEmitter for the Go Programming Language.
Benthos - Fancy stream processing made operationally mundane