pronto
-
Buf raises $93M to deprecate REST/JSON
5. Message streaming (gRPC streams are amazing)
I can think of a whole host of features that can be built off of protos (I've even built ORMs off of protobufs for simple things [0]). The value prop is there, IMO. HTTP + JSON APIs are a local minimum. The biggest concern, "I want to be able to view the data that is being sent back and forth," is a tooling consideration (curl ... isn't showing you the voltages from the physical layer; it's decoded). Buf is building that tooling.
[0] - https://github.com/CaperAi/pronto
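As a sketch of the "ORMs off of protos" idea, here is a minimal Go program in which the reflect package stands in for the protobuf descriptor API a tool like pronto would actually walk (the struct and table names are hypothetical):

```go
package main

import (
	"fmt"
	"reflect"
	"strings"
)

// Order stands in for a generated protobuf message.
type Order struct {
	Id    int64
	Name  string
	Total float64
}

// buildInsert derives an INSERT statement from a message's fields,
// the way an ORM layer can derive SQL from a proto descriptor.
func buildInsert(table string, msg any) string {
	t := reflect.TypeOf(msg)
	cols := make([]string, 0, t.NumField())
	args := make([]string, 0, t.NumField())
	for i := 0; i < t.NumField(); i++ {
		cols = append(cols, strings.ToLower(t.Field(i).Name))
		args = append(args, fmt.Sprintf("$%d", i+1))
	}
	return fmt.Sprintf("INSERT INTO %s (%s) VALUES (%s)",
		table, strings.Join(cols, ", "), strings.Join(args, ", "))
}

func main() {
	fmt.Println(buildInsert("orders", Order{}))
	// INSERT INTO orders (id, name, total) VALUES ($1, $2, $3)
}
```

Because the message schema is the single source of truth, every message gets persistence "for free" once this layer exists.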
-
Parsing Gigabytes of JSON per Second
I've written translation layers for such systems and it's not too bad. See this project from $job - 1: https://github.com/CaperAi/pronto
It allowed us to have a single model for storage in the DB, for sending between services, and syncing to edge devices.
-
gRPC for Microservices Communication
There's no reason you couldn't use gRPC with JSON as the serialized message format. For example, grpc-gateway [0] provides a very effective way of mapping gRPC concepts to HTTP/JSON. The thing is, after moving to gRPC, I've never really felt a desire to move back to JSON. While it may be correct to say "parsing JSON is fast enough," it's important to note that there's a "for most use cases" after that. Parsing protos is fast enough for even more use cases. You also get streams, which are amazing for APIs where you have to sync large amounts of data (listing large collections from a DB, for example) across two services.
With gRPC you also have a standardized middleware API that is implemented for "all" languages. The concepts cleanly map across multiple languages and types are mostly solved for you.
Adding to that you can easily define some conventions for a proto and make amazing libraries for your team. At a previous job I made this: https://github.com/CaperAi/pronto/
Made it super easy to prototype multiple services: if you mocked a service backed by memory, we could plop it into a DB with zero effort.
I think this "gRPC vs X" framing isn't quite right here, because protos are more like an Object.prototype in JavaScript: they're a template for what you're sending. If you have the Message you want to send, you can serialize it to JSON, or read it from JSON or XML or another proprietary format, and automatically get a host of cool features (pretty printing, serialization to text/binary, sending over the network, etc.).
[0] - https://github.com/grpc-ecosystem/grpc-gateway
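The standardized-middleware point can be sketched with plain Go; this is a simplified stand-in for gRPC's interceptor API, which composes the same way (wrap the handler, call inward):

```go
package main

import "fmt"

// Handler is a stand-in for a gRPC unary handler.
type Handler func(req string) string

// Middleware wraps a Handler, the shape gRPC interceptors share.
type Middleware func(Handler) Handler

// chain applies middlewares so the first listed runs outermost,
// the way interceptor chains compose in gRPC.
func chain(h Handler, mws ...Middleware) Handler {
	for i := len(mws) - 1; i >= 0; i-- {
		h = mws[i](h)
	}
	return h
}

func logging(next Handler) Handler {
	return func(req string) string {
		return "log(" + next(req) + ")"
	}
}

func auth(next Handler) Handler {
	return func(req string) string {
		return "auth(" + next(req) + ")"
	}
}

func main() {
	h := chain(func(req string) string { return "handle:" + req }, logging, auth)
	fmt.Println(h("ping"))
	// log(auth(handle:ping))
}
```

Because the interceptor shape is the same in every gRPC language, a logging or auth middleware written for one service maps cleanly onto the others.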
-
We Went All in on Sqlc/Pgx for Postgres and Go
I attempted to make something similar to this, except in the opposite direction, at a previous job. It was called Pronto: https://github.com/CaperAi/pronto/
It allowed us to store and query protos in MongoDB. It wasn't perfect (lots of issues), but the idea was that rather than specifying custom models for all of our DB logic in our Java code, we could write a proto, and any code could import that proto and read/write it to the database. This made building tooling to debug issues very easy and made it very simple to hide a DB behind a gRPC API.
The tool automated the boring stuff. I wish I could have extended this to have you define a service in a .proto and "compile" that into an ORM DAO-like thing automatically so you never need to worry about manually wiring that stuff ever again.
pggen
-
Ask HN: ORM or Native SQL?
Cornucopia is neat. I wrote a similar library in Go [1] so I'm very interested in comparing design decisions.
The pros of the generated code per query approach:
- App code is coupled to query outputs and inputs (an API of sorts), not database tables. Therefore, you can refactor your DB without changing app code.
- Real SQL with the full breadth of DB features.
- Real type-checking with what the DB supports.
The cons:
- Type mapping is surprisingly hard to get right, especially with composite types and arrays and custom type converters. For example, a query might return multiple jsonb columns but the app code wants to parse them into different structs.
- Dynamic queries don't work with prepared statements. Prepared statements only support values, not identifiers or scalar SQL sub-queries, so the codegen layer needs a mechanism to template SQL. I haven't built this out yet but would like to.
[1]: https://github.com/jschaf/pggen
-
What are the things with Go that have made you wish you were back in Spring/.NET/Django etc?
pggen is another fantastic library in this genre, one that specifically targets Postgres. It is driven by pgx. Cannot recommend it enough.
-
Exiting the Vietnam of Programming: Our Journey in Dropping the ORM (In Golang)
> Do you write out 120 "INSERT" statements, 120 "UPDATE" statements, 120 "DELETE" statements as raw strings
Yes. For example: https://github.com/jschaf/pggen/blob/main/example/erp/order/....
> that is also using an ORM
ORM as a term covers a wide swathe of usage. In the smallest definition, an ORM converts DB tuples to Go structs. In common usage, most folks use ORM to mean a generic query builder plus the type conversion from tuples to structs. For other usages, I prefer the Patterns of Enterprise Application Architecture terms [1] like data-mapper, active record, and table-data gateway.
[1]: https://martinfowler.com/eaaCatalog/
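For concreteness, a minimal Go sketch of two of those PoEAA patterns (the types and the in-memory "DB" are hypothetical):

```go
package main

import "fmt"

// An in-memory stand-in for a database table.
var db = map[int]string{}

// Active record: the domain object knows how to persist itself.
type UserAR struct {
	ID   int
	Name string
}

func (u UserAR) Save() { db[u.ID] = u.Name }

// Data mapper: persistence lives in a separate object, keeping the
// domain type free of DB concerns.
type User struct {
	ID   int
	Name string
}

type UserMapper struct{}

func (UserMapper) Insert(u User) { db[u.ID] = u.Name }

func main() {
	UserAR{ID: 1, Name: "ada"}.Save()
	UserMapper{}.Insert(User{ID: 2, Name: "lin"})
	fmt.Println(db[1], db[2]) // ada lin
}
```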
-
Back to basics: Writing an application using Go and PostgreSQL
You might like pggen (I’m the author) which only supports Postgres and pgx. https://github.com/jschaf/pggen
pggen occupies the same design space as sqlc but the implementations are quite different. Sqlc figures out the query types using type inference in Go which is nice because you don’t need Postgres at build time. Pggen asks Postgres what the query types are which is nice because it works with any extensions and arbitrarily complex queries.
-
How We Went All In on sqlc/pgx for Postgres + Go
Any reason to use sqlc over pggen? If you use Postgres, pggen seems like the superior option.
-
What are your favorite packages to use?
Agree with your choices, except go-json, which I've never tried. pggen is fantastic. Love that library. The underlying driver, pgx, is also really well written.
-
I don't want to learn your garbage query language
You might like the approach I took with pggen[1] which was inspired by sqlc[2]. You write a SQL query in regular SQL and the tool generates a type-safe Go querier struct with a method for each query.
The primary benefit of pggen and sqlc is that you don't need a different query model; it's just SQL and the tools automate the mapping between database rows and Go structs.
[1]: https://github.com/jschaf/pggen
[2]: https://github.com/kyleconroy/sqlc
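As a sketch of the mapping those tools automate, here is a reflection-based version in Go; pggen and sqlc emit the equivalent code at generation time, so no runtime reflection is needed (the names here are hypothetical):

```go
package main

import (
	"fmt"
	"reflect"
)

// Author is a hypothetical query-output struct.
type Author struct {
	ID   int    `db:"id"`
	Name string `db:"name"`
}

// scanRow copies a row (column name -> value) into dst's tagged
// fields; generated queriers do this with direct field assignments.
func scanRow(row map[string]any, dst any) {
	v := reflect.ValueOf(dst).Elem()
	t := v.Type()
	for i := 0; i < t.NumField(); i++ {
		if val, ok := row[t.Field(i).Tag.Get("db")]; ok {
			v.Field(i).Set(reflect.ValueOf(val))
		}
	}
}

func main() {
	var a Author
	scanRow(map[string]any{"id": 7, "name": "Ursula"}, &a)
	fmt.Printf("%+v\n", a) // {ID:7 Name:Ursula}
}
```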
-
What is the best way to use PostgreSQL with Go?
I created pggen a few weeks ago to create my preferred method of database interaction: I write real SQL queries and I use generated, type-safe Go interfaces to the queries. https://github.com/jschaf/pggen
What are some alternatives?
simd-json - Rust port of simdjson
sqlc - Generate type-safe code from SQL
pike - Generate CRUD gRPC backends from single YAML description.
SQLBoiler - Generate a Go ORM tailored to your database schema.
sqlparser-rs - Extensible SQL Lexer and Parser for Rust
sqlpp11 - A type safe SQL template library for C++
grpc-gateway - gRPC to JSON proxy generator following the gRPC HTTP spec
pggen - A database first code generator focused on postgres
SqlKata Query Builder - SQL query builder, written in C#, that helps you build complex queries easily; supports SqlServer, MySql, PostgreSql, Oracle, Sqlite and Firebird
sqlite
honeysql - Turn Clojure data structures into SQL