| | rules_go | pronto |
|---|---|---|
| Mentions | 6 | 4 |
| Stars | 1,331 | 6 |
| Growth | -0.2% | - |
| Activity | 9.0 | 0.0 |
| Latest commit | 9 days ago | over 3 years ago |
| Language | Go | Java |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
rules_go
-
When to Use Bazel?
There’s an issue I reported (along with a proof-of-concept fix) over 4 years ago that has yet to be fixed: building a mixed-source project containing Go, C++, and C++ protocol buffers results in silently broken binaries, because rules_go will happily not forward the linker arguments that the C++ build targets (the protobuf ones, using the built-in C++ rules) declare.
See https://github.com/bazelbuild/rules_go/issues/1486
Not very confidence-inspiring when Google’s build system falls over when you combine three technologies that are used commonly throughout Google’s code base (two of which were created by Google).
If you’re Google, sure, use Bazel. Otherwise, I wouldn’t recommend it. Google will cater to their needs and their needs only — putting the code out in the open means you get the privilege of sharing in their tech debt, and if something isn’t working, you can contribute your labor to them for free.
No thanks :)
-
Calculating Go type sets is harder than you think
Bazel in theory maintains its own directory of generated code that your IDE should refer to. Back when I last used Bazel, there was a bug open to make gopls properly understand this ("go packages driver" is the search term). Nobody touched this bug for a couple years, so I gave up.
Here's the bug: https://github.com/bazelbuild/rules_go/issues/512
I basically wouldn't use Bazel with Go. Go already has a build system; Bazel is best for languages that don't ship one, like C++.
-
Buf raises $93M to deprecate REST/JSON
`proto_library` for building the `.bin` file from protos works great. Generating stubs/messages for "all" languages does not: each language's rule set doesn't want to implement gRPC rules, and the gRPC team doesn't want to implement rules for each language. Sort of a deadlock situation. For example:
- C++: https://github.com/grpc/grpc/blob/master/bazel/cc_grpc_libra...
- Python: https://github.com/grpc/grpc/blob/master/bazel/python_rules....
- ObjC: https://github.com/grpc/grpc/blob/master/bazel/objc_grpc_lib...
- Java: https://github.com/grpc/grpc-java/blob/master/java_grpc_libr...
- Go (different semantics than all of the others): https://github.com/bazelbuild/rules_go/blob/master/proto/def...
But there's also no real cohesion within the community. The biggest effort to date has been in https://github.com/stackb/rules_proto which integrates with gazelle.
tl;dr: Low alignment results in diverging implementations that are complicated for newcomers to understand. Buf's approach is much more appealing, as it's a "this is the one way to do the right thing," and having it just work by detecting `proto_library` and doing all of the linting/registry stuff automagically in CI would be fantastic.
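The divergence described above shows up directly in BUILD-file surface area. A hedged sketch (target names and `importpath` are hypothetical, and attribute names vary across rules_go versions):

```starlark
load("@rules_proto//proto:defs.bzl", "proto_library")
load("@io_bazel_rules_go//proto:def.bzl", "go_proto_library")

proto_library(
    name = "greeter_proto",
    srcs = ["greeter.proto"],
)

# C++: messages and gRPC stubs come from two different places --
# cc_proto_library is built in, cc_grpc_library lives in the grpc repo.
cc_proto_library(
    name = "greeter_cc_proto",
    deps = [":greeter_proto"],
)

# Go: a single rule generates both, selected via the `compilers` attribute.
go_proto_library(
    name = "greeter_go_proto",
    compilers = ["@io_bazel_rules_go//proto:go_grpc"],
    importpath = "example.com/greeter",
    proto = ":greeter_proto",
)
```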
-
Why does Bazel not get more love?
This can be ugly in some languages. There’s decent Go support in VS Code if you follow the copy-and-paste instructions here: https://github.com/bazelbuild/rules_go/wiki/Editor-setup
- GOPACKAGESDRIVER support for Bazel's rules_go, fixes Bazel + gopls
-
What is the preferred way to package static files (html/css/js) along with your standalone binary in 2020?
Bazel go_embed_data
pronto
-
Buf raises $93M to deprecate REST/JSON
5. Message streaming (gRPC streams are amazing)
I can think of a whole host of features that can be built off of protos (I've even built ORMs off of protobufs for simple things [0]). The value prop is there IMO. HTTP + JSON APIs are a local minimum. The biggest concern, "I want to be able to view the data that is being sent back and forth," is a tooling consideration (curl ... isn't showing you the voltages from the physical layer; it is decoded). Buf is building that tooling.
[0] - https://github.com/CaperAi/pronto
-
Parsing Gigabytes of JSON per Second
I've written translation layers for such systems and it's not too bad. See this project from $job - 1: https://github.com/CaperAi/pronto
It allowed us to have a single model for storage in the DB, for sending between services, and syncing to edge devices.
-
gRPC for Microservices Communication
There's no reason you couldn't use gRPC with JSON as a serialized message format. For example, grpc-gateway [0] provides a very effective way of mapping a gRPC concept to HTTP/JSON. The thing is, after moving to gRPC, I've never really felt a desire to move back to JSON. While it may be correct to say "parsing JSON is fast enough," it's important to note that there's a "for most use cases" after that. Parsing protos is fast enough for even more use cases. You also get streams, which are amazing for APIs where you have to sync large amounts of data (listing large collections from a DB, for example) across two services.
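The streaming pattern praised here boils down to a receive loop over the generated stream. A minimal sketch, with a hypothetical stand-in type for a generated gRPC client stream (real streams return message types, not strings, but the loop shape is the same):

```go
package main

import (
	"fmt"
	"io"
)

// itemStream is a hypothetical stand-in for a generated gRPC client
// stream; in real code this type is produced by the protoc plugin.
type itemStream struct {
	items []string
	pos   int
}

// Recv mimics the generated stream API: one item per call, io.EOF at the end.
func (s *itemStream) Recv() (string, error) {
	if s.pos >= len(s.items) {
		return "", io.EOF
	}
	item := s.items[s.pos]
	s.pos++
	return item, nil
}

// drainStream is the canonical gRPC receive loop: call Recv until io.EOF,
// treating any other error as fatal.
func drainStream(s *itemStream) ([]string, error) {
	var out []string
	for {
		item, err := s.Recv()
		if err == io.EOF {
			return out, nil
		}
		if err != nil {
			return nil, err
		}
		out = append(out, item)
	}
}

func main() {
	stream := &itemStream{items: []string{"row-1", "row-2", "row-3"}}
	rows, _ := drainStream(stream)
	fmt.Println(rows) // [row-1 row-2 row-3]
}
```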
With gRPC you also have a standardized middleware API that is implemented for "all" languages. The concepts cleanly map across multiple languages and types are mostly solved for you.
Adding to that you can easily define some conventions for a proto and make amazing libraries for your team. At a previous job I made this: https://github.com/CaperAi/pronto/
It made it super easy to prototype multiple services: if you mocked a service backed by memory, we could plop a DB behind it with zero effort.
I think this "gRPC vs X" way of thinking isn't appropriate here, because protos are more like an Object.prototype in JavaScript: they're a template for what you're sending. If you have the Message you want to send, you can serialize it to JSON, or read from JSON or XML or another proprietary format, and automatically get a host of cool features (pretty printing, serialization to text/binary, sending over the network, etc.).
[0] - https://github.com/grpc-ecosystem/grpc-gateway
-
We Went All in on Sqlc/Pgx for Postgres and Go
I attempted to make something similar to this except the opposite direction at a previous job. It was called Pronto: https://github.com/CaperAi/pronto/
It allowed us to store and query protos in MongoDB. It wasn't perfect (lots of issues), but the idea was that rather than specifying custom models for all of our DB logic in our Java code, we could write a proto, and any code could import that proto and read/write it to the database. This made building tooling to debug issues very easy and made it very simple to hide a DB behind a gRPC API.
The tool automated the boring stuff. I wish I could have extended it to let you define a service in a .proto and "compile" that into an ORM/DAO-like thing automatically, so you never need to worry about manually wiring that stuff ever again.
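The DAO shape being wished for can be sketched generically. Everything below is hypothetical (invented names, not pronto's actual API): a plain struct stands in for a generated proto message, and a map stands in for MongoDB, the point being that callers only see the interface:

```go
package main

import (
	"errors"
	"fmt"
	"sync"
)

// User stands in for a generated proto message.
type User struct {
	ID   string
	Name string
}

// UserDAO is the kind of interface one could "compile" from a service
// definition in a .proto file.
type UserDAO interface {
	Put(u User) error
	Get(id string) (User, error)
}

// memoryUserDAO backs the DAO with a map; swapping in a MongoDB-backed
// implementation would leave callers untouched.
type memoryUserDAO struct {
	mu    sync.Mutex
	users map[string]User
}

func newMemoryUserDAO() *memoryUserDAO {
	return &memoryUserDAO{users: make(map[string]User)}
}

func (d *memoryUserDAO) Put(u User) error {
	d.mu.Lock()
	defer d.mu.Unlock()
	d.users[u.ID] = u
	return nil
}

func (d *memoryUserDAO) Get(id string) (User, error) {
	d.mu.Lock()
	defer d.mu.Unlock()
	u, ok := d.users[id]
	if !ok {
		return User{}, errors.New("not found: " + id)
	}
	return u, nil
}

func main() {
	var dao UserDAO = newMemoryUserDAO()
	dao.Put(User{ID: "1", Name: "Ada"})
	u, _ := dao.Get("1")
	fmt.Println(u.Name) // Ada
}
```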
What are some alternatives?
go-bindata - A small utility which generates Go code from any file. Useful for embedding binary data in a Go program.
simd-json - Rust port of simdjson
statik - Embed files into a Go executable
pike - Generate CRUD gRPC backends from single YAML description.
go - The Go programming language
sqlparser-rs - Extensible SQL Lexer and Parser for Rust
xdotool - simulate keyboard input and mouse activity
grpc-gateway - gRPC to JSON proxy generator following the gRPC HTTP spec
statics - :file_folder: Embeds static resources into go files for single binary compilation + works with http.FileSystem + symlinks
pggen - A database first code generator focused on postgres
buildtools - A bazel BUILD file formatter and editor
sqlite