schema-registry
Protobuf
| | schema-registry | Protobuf |
|---|---|---|
| Mentions | 7 | 171 |
| Stars | 2,136 | 63,586 |
| Growth | 1.3% | 1.0% |
| Activity | 10.0 | 10.0 |
| Latest commit | 5 days ago | 5 days ago |
| Language | Java | C++ |
| License | GNU General Public License v3.0 or later | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
schema-registry
- JR, quality Random Data from the Command line, part I
So, is JR yet another faking library written in Go? Yes and no. JR indeed implements most of the APIs in fakerjs and gofakeit, but it can also stream data directly to stdout, Kafka, Redis and more (Elastic and MongoDB coming). JR can talk directly to Confluent Schema Registry, manage JSON Schema and Avro schemas, and easily maintain coherence and referential integrity. If you need more than what JR offers out of the box, you can also easily pipe your data streams to other CLI tools like kcat.
- What tool do you use to document your Kafka messages format?
- How to handle a failing message in a topic with an Avro schema?
Check https://github.com/confluentinc/schema-registry for more details.
- What is Schema Registry and How Does It Work? [Explained]
Confluent Schema Registry for Apache Kafka [GitHub]
- Testing a Kafka consumer with Avro schema messages in your Spring Boot application with Testcontainers
So that means we can configure the Kafka producer and consumer with an imaginary schema registry URL that only needs to start with "mock://", and you automatically get to work with the MockSchemaRegistryClient. This way you don't need to explicitly initialize the MockSchemaRegistryClient and configure everything accordingly, which also removes the need for the Confluent Schema Registry container. Running the Kafka Testcontainer with the embedded Zookeeper, we no longer need an extra Zookeeper container and are down to one Testcontainer for the messaging. This way I ended up with only two Testcontainers: Kafka and the database.
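As a sketch of that configuration (property keys follow Spring Boot's Kafka support; the scope name after `mock://` is arbitrary, and it assumes the Confluent Avro serdes are on the classpath), the trick amounts to:

```properties
# Any registry URL starting with "mock://" makes the Confluent serializers
# resolve a MockSchemaRegistryClient for that scope instead of opening an
# HTTP connection.
spring.kafka.producer.properties.schema.registry.url=mock://test-registry
spring.kafka.consumer.properties.schema.registry.url=mock://test-registry
spring.kafka.producer.value-serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
spring.kafka.consumer.value-deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
```

Producer and consumer must use the same scope name so they share one mock registry instance.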
- Confluent Schema Registry and Rust
Confluent is a company founded by the creators of Apache Kafka. They provide the Confluent Platform, which consists of several components, all based on Kafka. The licenses for these components vary. The Schema Registry is under the Confluent Community License, which basically means it's free to use as long as you don't offer the Schema Registry itself as a SaaS solution. The source code can be found on GitHub.
- An Overview About the Different Kafka Connect Plugins
Schema Registry from Confluent (GitHub) => http://localhost:8081/
Protobuf
- Reverse Engineering Protobuf Definitions from Compiled Binaries
For at least four years, protobuf has had decent support for self-describing messages (very similar to Avro), as well as reflection:
https://github.com/protocolbuffers/protobuf/blob/main/src/go...
Ex-Googlers trying to make do on the cheap will just create a union of all their messages and include the message def in a self-describing message pattern. Super-sensitive network I/O can elide the message def (empty buffer), and for any RecordIO clone, file compression takes care of the definition.
It's definitely useful to be able to dig out old defs, but the protobuf maintainers have, surprisingly, added useful features so you don't have to.
Bonus points, though, for extracting the protobuf defs that e.g. Apple bakes into its binaries.
- Show HN: AuthWin – Authenticator App for Windows
- Create Production-Ready SDKs With gRPC Gateway
gRPC Gateway is a protoc plugin that reads gRPC service definitions and generates a reverse proxy server that translates a RESTful JSON API into gRPC.
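As a minimal sketch of what gRPC Gateway reads (the service and message names here are made up; the `google.api.http` option is the annotation that defines the REST mapping):

```proto
syntax = "proto3";

import "google/api/annotations.proto";

service EchoService {
  // gRPC Gateway maps this RPC to POST /v1/echo, unmarshalling the
  // JSON request body into EchoRequest.
  rpc Echo(EchoRequest) returns (EchoResponse) {
    option (google.api.http) = {
      post: "/v1/echo"
      body: "*"
    };
  }
}

message EchoRequest { string message = 1; }
message EchoResponse { string message = 1; }
```

The generated reverse proxy forwards the translated request to the actual gRPC server, so the same service definition serves both REST and gRPC clients.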
- Create Production-Ready SDKs with Goa
To use more recent versions of protoc in future applications, you can download them from the Protobuf repository.
- Roll your own auth with Rust and Protobuf
Use the Protobuf CLI protoc and the plugin protoc-gen-tonic.
- Add extra stuff to a “standard” encoding? Sure, why not
> didn’t find any standard for separating protobuf messages
The fact that protobufs are not self-delimiting is an endless source of frustration, but I know of 2 standards:
- SerializeDelimited* is part of the protobuf library: https://github.com/protocolbuffers/protobuf/blob/main/src/go...
- Riegeli is "a file format for storing a sequence of string records, typically serialized protocol buffers. It supports dense compression, fast decoding, seeking, detection and optional skipping of data corruption, filtering of proto message fields for even faster decoding, and parallel encoding": https://github.com/google/riegeli
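A minimal sketch of the first approach, assuming nothing beyond the wire format: each message is prefixed with its length encoded as a base-128 varint, which is the framing the SerializeDelimited* helpers use. Payloads are treated as opaque bytes here, so no protobuf library is needed:

```python
import io

def write_varint(stream, value):
    """Write an unsigned int as a base-128 varint (protobuf wire format)."""
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            stream.write(bytes([byte | 0x80]))  # high bit set: more bytes follow
        else:
            stream.write(bytes([byte]))
            return

def read_varint(stream):
    """Read a base-128 varint; returns None on clean EOF between messages."""
    result, shift = 0, 0
    while True:
        chunk = stream.read(1)
        if not chunk:
            if shift == 0:
                return None
            raise EOFError("truncated varint")
        byte = chunk[0]
        result |= (byte & 0x7F) << shift
        if not (byte & 0x80):
            return result
        shift += 7

def write_delimited(stream, payload):
    """Write one length-prefixed message."""
    write_varint(stream, len(payload))
    stream.write(payload)

def read_delimited(stream):
    """Read one length-prefixed message; None at end of stream."""
    length = read_varint(stream)
    if length is None:
        return None
    payload = stream.read(length)
    if len(payload) != length:
        raise EOFError("truncated message")
    return payload

# Round-trip a few messages (including an empty one) through a buffer.
buf = io.BytesIO()
for msg in (b"first", b"second message", b""):
    write_delimited(buf, msg)
buf.seek(0)
messages = []
while (m := read_delimited(buf)) is not None:
    messages.append(m)
```

Because the length prefix is itself a varint, the framing composes cleanly with the rest of the wire format and costs one byte for messages under 128 bytes.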
- Block YouTube Ads on AppleTV by Decrypting and Stripping Ads from Protobuf
It looks like it is in fact universal. Glancing at the code, the tool searches any arbitrary file for bytes that look like encoded protobuf descriptors, specifically for bytes that are plausibly the beginning of a FileDescriptorProto message, defined here:
https://github.com/protocolbuffers/protobuf/blob/main/src/go...
This takes advantage of the fact that such descriptors are commonly compiled into programs that use protobuf. The descriptors are usually embedded as constant byte arrays. That said, not all protobuf implementations embed the descriptors and those that do often have an option to inhibit such embedding (at the expense of losing some dynamic introspection features).
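A hypothetical minimal version of that heuristic (a sketch, not the actual tool's code): field 1 of FileDescriptorProto is the file name, a length-delimited string, so a serialized descriptor typically begins with tag byte 0x0A, a varint length, and a printable path ending in `.proto` — which is cheap to scan for:

```python
def find_descriptor_candidates(blob):
    """Scan bytes for plausible starts of serialized FileDescriptorProto messages.

    Field 1 (the .proto file name) is length-delimited, so its tag byte is
    0x0A. Returns (offset, name) pairs for each plausible hit.
    """
    hits = []
    i = blob.find(b"\x0A")
    while i != -1:
        # Decode the varint length that follows the tag byte.
        j, length, shift = i + 1, 0, 0
        while j < len(blob) and shift <= 28:
            byte = blob[j]
            length |= (byte & 0x7F) << shift
            j += 1
            if not (byte & 0x80):
                name = blob[j:j + length]
                # A plausible descriptor names a printable path ending ".proto".
                if name.endswith(b".proto") and all(0x20 <= c < 0x7F for c in name):
                    hits.append((i, name.decode("ascii")))
                break
            shift += 7
        i = blob.find(b"\x0A", i + 1)
    return hits

# Simulated binary: junk, then an embedded descriptor fragment, then junk.
blob = b"\x7fELF junk" + b"\x0A\x11my/cool/api.proto" + b"more junk"
candidates = find_descriptor_candidates(blob)
```

Real tools then attempt a full FileDescriptorProto parse at each candidate offset to weed out false positives; this sketch only performs the cheap first pass.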
- How to learn to use protoc in 21 easily infuriating steps
- What's involved in protobuf encoding?
Not much. You can check the source code at https://github.com/protocolbuffers/protobuf. For example, here is how a boolean is serialized in C#: https://github.com/protocolbuffers/protobuf/blob/main/csharp/src/Google.Protobuf/WritingPrimitives.cs#L165. Strings and objects are a bit more complicated, but it is all about turning the data into its byte representation.
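That byte-level view is easy to reproduce by hand. A minimal sketch in Python, with no protobuf library: a field's key is `(field_number << 3) | wire_type`, and a boolean uses wire type 0 (varint) with value 0 or 1:

```python
def encode_varint(value):
    """Encode an unsigned int as a base-128 varint."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # continuation bit: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def encode_bool_field(field_number, flag):
    """Encode one protobuf boolean field: key byte(s), then 0x00 or 0x01."""
    key = (field_number << 3) | 0  # wire type 0 = varint
    return encode_varint(key) + encode_varint(1 if flag else 0)

# Field 1 set to True serializes to the two bytes 0x08 0x01.
wire = encode_bool_field(1, True)
```

Length-delimited types (strings, nested messages) differ only in using wire type 2 and inserting a varint byte count between the key and the payload.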
- Trying To Solve The Confusion of Choice Between gRPC vs REST🕵
One of the key features of gRPC is the Protobuf .proto file (essentially a contract between the two communicating code components). This file format and the protobuf compiler are so mature that protoc generates a direct client implementation for you.
What are some alternatives?
kafka-ui - Open-Source Web UI for Apache Kafka Management
FlatBuffers - FlatBuffers: Memory Efficient Serialization Library
kafdrop - Kafka Web UI
SBE - Simple Binary Encoding (SBE) - High Performance Message Codec
schema-registry-gitops - Manage Confluent Schema Registry subjects through Infrastructure as code
MessagePack - MessagePack implementation for C and C++ / msgpack.org[C/C++]
rust-rdkafka - A fully asynchronous, futures-based Kafka client library for Rust based on librdkafka
cereal - A C++11 library for serialization
kafka-avro-without-registry - Test Spring Kafka application (using Avro as a serialization mechanism) without the need for Confluent Schema Registry
Apache Parquet - Apache Parquet
ksqlDB-GraphQL-poc - A fairly simple setup to show how ksqlDB can be used with GraphQL.
Bond - Bond is a cross-platform framework for working with schematized data. It supports cross-language de/serialization and powerful generic mechanisms for efficiently manipulating data. Bond is broadly used at Microsoft in high scale services.