| | Protobuf | Apache Parquet |
|---|---|---|
| Mentions | 177 | 4 |
| Stars | 64,219 | 2,465 |
| Growth | 0.9% | 2.4% |
| Activity | 10.0 | 9.3 |
| Latest commit | 1 day ago | 3 days ago |
| Language | C++ | Java |
| License | GNU General Public License v3.0 or later | Apache License 2.0 |
Stars: the number of stars a project has on GitHub. Growth: month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
Protobuf
- A protoc compiler plugin that generates useful extension code for Kotlin/JVM
I have raised an issue requesting the addition of optional scalar types, but it is not planned to be supported by protoc-gen-kotlin.
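For context, proto3 itself has supported the `optional` keyword on scalar fields (giving explicit field presence) since protoc 3.15; the gap the commenter describes is in the Kotlin plugin's generated code. A minimal sketch of the proto side, with hypothetical field names:

```proto
syntax = "proto3";

message Settings {
  // Explicit presence: generated code can distinguish "unset" from 0.
  optional int32 timeout_ms = 1;

  // Implicit presence: 0 and "unset" are indistinguishable on the wire.
  int32 retries = 2;
}
```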
- Show HN: Protobuf Editions now available in v27.0
- Consistent Hashing: An Overview and Implementation in Golang
protobuf: `go get -u google.golang.org/protobuf/proto`
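The consistent-hashing idea the linked article covers can be sketched in a few dozen lines of stdlib-only Go: hash both nodes and keys onto a ring, and assign each key to the first node clockwise. This is an illustrative sketch, not the article's implementation; the node names and replica count are made up.

```go
package main

import (
	"fmt"
	"hash/crc32"
	"sort"
	"strconv"
)

// Ring is a minimal consistent-hash ring with virtual nodes.
type Ring struct {
	keys   []uint32          // sorted hashes of virtual nodes
	owners map[uint32]string // virtual-node hash -> physical node
}

// NewRing places `replicas` virtual nodes per physical node on the ring,
// smoothing out the key distribution.
func NewRing(replicas int, nodes ...string) *Ring {
	r := &Ring{owners: map[uint32]string{}}
	for _, n := range nodes {
		for i := 0; i < replicas; i++ {
			h := crc32.ChecksumIEEE([]byte(n + "#" + strconv.Itoa(i)))
			r.keys = append(r.keys, h)
			r.owners[h] = n
		}
	}
	sort.Slice(r.keys, func(i, j int) bool { return r.keys[i] < r.keys[j] })
	return r
}

// Get returns the node owning key: the first virtual node clockwise.
func (r *Ring) Get(key string) string {
	h := crc32.ChecksumIEEE([]byte(key))
	i := sort.Search(len(r.keys), func(i int) bool { return r.keys[i] >= h })
	if i == len(r.keys) {
		i = 0 // wrap around the ring
	}
	return r.owners[r.keys[i]]
}

func main() {
	ring := NewRing(100, "node-a", "node-b", "node-c")
	fmt.Println(ring.Get("user:42"))
}
```

The payoff of this structure is that adding or removing one node only remaps the keys adjacent to its virtual nodes, rather than reshuffling everything as `hash(key) % N` would.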
- Hitting every branch on the way down
It's because they changed the versioning format: https://github.com/protocolbuffers/protobuf/releases?page=5
But I suppose old versions still receive bugfixes.
- Reverse Engineering Protobuf Definitions from Compiled Binaries
For at least 4 years protobuf has had decent support for self-describing messages (very similar to Avro) as well as reflection.
https://github.com/protocolbuffers/protobuf/blob/main/src/go...
Ex-Googlers trying to make do on the cheap will just create a union of all their messages and include the message def in a self-describing message pattern. Super-sensitive network I/O can elide the message def (empty buffer), and for any RecordIO clone, file compression takes care of the definition's overhead.
Definitely useful to be able to dig out old defs but protobuf maintainers have surprisingly added useful features so you don’t have to.
Bonus points tho for extracting the protobuf defs that e.g. Apple bakes into their binaries.
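The first step of reverse engineering a schema from raw payloads is mechanical, because the protobuf wire format tags every field with a varint key of `(field_number << 3) | wire_type`. A stdlib-only Go sketch (not any particular tool's implementation) that walks a buffer and recovers field numbers and wire types:

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// Wire types from the protobuf encoding spec.
const (
	wireVarint = 0 // int32, int64, bool, enum, ...
	wireI64    = 1 // fixed64, double
	wireLen    = 2 // string, bytes, sub-messages
	wireI32    = 5 // fixed32, float
)

// scanFields walks a raw protobuf payload and returns (field number,
// wire type) pairs -- enough to start guessing a schema.
func scanFields(buf []byte) [][2]uint64 {
	var out [][2]uint64
	for len(buf) > 0 {
		key, n := binary.Uvarint(buf)
		if n <= 0 {
			return out // malformed key
		}
		buf = buf[n:]
		field, wt := key>>3, key&7
		out = append(out, [2]uint64{field, wt})
		switch wt {
		case wireVarint:
			_, m := binary.Uvarint(buf)
			if m <= 0 {
				return out
			}
			buf = buf[m:]
		case wireLen:
			l, m := binary.Uvarint(buf)
			if m <= 0 || int(l) > len(buf)-m {
				return out
			}
			buf = buf[m+int(l):]
		case wireI64:
			if len(buf) < 8 {
				return out
			}
			buf = buf[8:]
		case wireI32:
			if len(buf) < 4 {
				return out
			}
			buf = buf[4:]
		default:
			return out
		}
	}
	return out
}

func main() {
	// 0x08 = field 1 varint (value 150); 0x12 = field 2 length-delimited ("hi").
	for _, f := range scanFields([]byte{0x08, 0x96, 0x01, 0x12, 0x02, 'h', 'i'}) {
		fmt.Printf("field %d, wire type %d\n", f[0], f[1])
	}
}
```

Recursing into length-delimited fields (which may be strings, bytes, or nested messages) is where the guesswork — and the value of baked-in descriptors — comes in.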
- Show HN: AuthWin – Authenticator App for Windows
- Create Production-Ready SDKs With gRPC Gateway
gRPC Gateway is a protoc plugin that reads gRPC service definitions and generates a reverse proxy server that translates a RESTful JSON API into gRPC.
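The mapping it reads comes from `google.api.http` annotations on each RPC. A minimal illustration (service and message names are hypothetical):

```proto
syntax = "proto3";

import "google/api/annotations.proto";

service Messaging {
  // grpc-gateway exposes this RPC as GET /v1/messages/{message_id}.
  rpc GetMessage(GetMessageRequest) returns (Message) {
    option (google.api.http) = {
      get: "/v1/messages/{message_id}"
    };
  }
}

message GetMessageRequest {
  string message_id = 1;
}

message Message {
  string text = 1;
}
```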
- Create Production-Ready SDKs with Goa
To use more recent versions of protoc in future applications, you can download them from the Protobuf repository.
- Roll your own auth with Rust and Protobuf
Use the Protobuf CLI protoc and the plugin protoc-gen-tonic.
- Add extra stuff to a “standard” encoding? Sure, why not
> didn’t find any standard for separating protobuf messages
The fact that protobufs are not self-delimiting is an endless source of frustration, but I know of 2 standards:
- SerializeDelimited* is part of the protobuf library: https://github.com/protocolbuffers/protobuf/blob/main/src/go...
- Riegeli is "a file format for storing a sequence of string records, typically serialized protocol buffers. It supports dense compression, fast decoding, seeking, detection and optional skipping of data corruption, filtering of proto message fields for even faster decoding, and parallel encoding": https://github.com/google/riegeli
Apache Parquet
- How-to-Guide: Contributing to Open Source
- parquet-tools
This Go implementation, beyond the general advantages of Go itself (small single executable, multi-platform support, speed, etc.), has some neat features compared with the Java parquet tool and the Python one:
- Writing Apache Parquet Files
Hi, I've been trying to write parquet files on Android for the past couple of days, and have really been struggling to find a solution. My original hypothesis was to just use the Java parquet implementation (https://github.com/apache/parquet-mr), but I've since realized that not all Java libraries play well with Android. I've gone through essentially dependency hell trying to franken-fit the library into my project, and imported as much as I could before hitting walls such as this one (https://github.com/mockito/mockito/issues/841).
- pqrs: A parquet-tools replacement in Rust using Apache Arrow
Like many of you probably do, I tend to work with Parquet files a lot. parquet-tools has been my tool of choice for inspecting parquet files, but that has been deprecated recently. So, I created a replacement for it using Rust and Apache Arrow.
What are some alternatives?
- FlatBuffers - FlatBuffers: Memory Efficient Serialization Library
- Apache Thrift - Apache Thrift
- SBE - Simple Binary Encoding (SBE) - High Performance Message Codec
- Apache Avro - Apache Avro is a data serialization system.
- MessagePack - MessagePack implementation for C and C++ / msgpack.org[C/C++]
- Apache Orc - Apache ORC - the smallest, fastest columnar storage for Hadoop workloads
- cereal - A C++11 library for serialization
- Big Queue - A big, fast and persistent queue based on memory mapped file.
- Bond - Bond is a cross-platform framework for working with schematized data. It supports cross-language de/serialization and powerful generic mechanisms for efficiently manipulating data. Bond is broadly used at Microsoft in high scale services.
- Persistent Collection - A Persistent Java Collections Library
- Protobuf.NET - Protocol Buffers library for idiomatic .NET
- Wire - gRPC and protocol buffers for Android, Kotlin, Swift and Java.