| | pronto | sqlc |
|---|---|---|
| Mentions | 4 | 170 |
| Stars | 6 | 11,012 |
| Growth | - | 3.9% |
| Activity | 0.0 | 9.6 |
| Latest commit | over 3 years ago | 7 days ago |
| Language | Java | Go |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub.
Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
pronto
-
Buf raises $93M to deprecate REST/JSON
5. Message streaming (gRPC streams are amazing)
I can think of a whole host of features that can be built off of protos (I've even built ORMs off of protobufs for simple things [0]). The value prop is there, IMO. HTTP + JSON APIs are a local minimum. The biggest concern, "I want to be able to view the data that is being sent back and forth," is a tooling consideration (curl ... isn't showing you the voltages from the physical layer; it is decoded). Buf is building that tooling.
[0] - https://github.com/CaperAi/pronto
-
Parsing Gigabytes of JSON per Second
I've written translation layers for such systems and it's not too bad. See this project from $job - 1: https://github.com/CaperAi/pronto
It allowed us to have a single model for storage in the DB, for sending between services, and syncing to edge devices.
-
gRPC for Microservices Communication
There's no reason you couldn't use gRPC with JSON as the serialized message format. For example, grpc-gateway [0] provides a very effective way of mapping gRPC concepts to HTTP/JSON. The thing is, after moving to gRPC, I've never really felt a desire to move back to JSON. While it may be correct to say "parsing JSON is fast enough," it's important to note that there's a "for most use cases" after that. Parsing protos is fast enough for even more use cases. You also get streams, which are amazing for APIs where you have to sync large amounts of data across two services (listing large collections from a DB, for example).
With gRPC you also have a standardized middleware API that is implemented for "all" languages. The concepts cleanly map across multiple languages and types are mostly solved for you.
Adding to that you can easily define some conventions for a proto and make amazing libraries for your team. At a previous job I made this: https://github.com/CaperAi/pronto/
It made it super easy to prototype multiple services: once you'd mocked a service backed by memory, we could plop it into a DB with zero effort.
I think this "gRPC vs X" way of thinking isn't appropriate here, because protos are more like Object.prototype in JavaScript: they're a template for what you're sending. If you have the Message you want to send, you can serialize it to JSON, or read it from JSON or XML or another proprietary format, and automatically get a host of cool features (pretty printing, serialization to text/binary, sending over the network, etc.).
[0] - https://github.com/grpc-ecosystem/grpc-gateway
-
We Went All in on Sqlc/Pgx for Postgres and Go
I attempted to make something similar to this except the opposite direction at a previous job. It was called Pronto: https://github.com/CaperAi/pronto/
It allowed us to store and query protos in MongoDB. It wasn't perfect (lots of issues), but the idea was that rather than specifying custom models for all of our DB logic in our Java code, we could write a proto, and any code could import that proto and read/write it to the database. This made building tooling to debug issues very easy and made it very simple to hide a DB behind a gRPC API.
The tool automated the boring stuff. I wish I could have extended it to let you define a service in a .proto and "compile" that into an ORM/DAO-like thing automatically, so you never need to worry about manually wiring that stuff ever again.
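The "mock it in memory, then swap in a real DB" pattern described above can be sketched with a small generic store interface in Go (a hypothetical sketch, not pronto's actual API; the names are illustrative):

```go
package main

import (
	"errors"
	"fmt"
)

// Store is the kind of DAO-like interface a .proto service definition
// could be compiled into: the caller never knows which backend it gets.
type Store[T any] interface {
	Put(id string, v T)
	Get(id string) (T, error)
}

// memStore is the in-memory implementation used for prototyping.
// A MongoDB-backed type could satisfy the same interface with zero
// changes to calling code.
type memStore[T any] struct{ m map[string]T }

func newMemStore[T any]() *memStore[T] { return &memStore[T]{m: map[string]T{}} }

func (s *memStore[T]) Put(id string, v T) { s.m[id] = v }

func (s *memStore[T]) Get(id string) (T, error) {
	v, ok := s.m[id]
	if !ok {
		var zero T
		return zero, errors.New("not found: " + id)
	}
	return v, nil
}

func main() {
	var s Store[string] = newMemStore[string]()
	s.Put("greeting", "hello")
	v, _ := s.Get("greeting")
	fmt.Println(v)
}
```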
sqlc
-
Show HN: Riza – Safely run untrusted code from your app
Hi HN, I’m Kyle and together with Andrew (https://news.ycombinator.com/user?id=stanleydrew) we’ve been working on Riza (https://riza.io), a project to make WASM sandboxing more approachable. We’re excited to share a developer preview of our code interpreter API with HN.
There’s a bit of a backstory here. A few months ago, an old coworker reached out asking how to execute untrusted code generated by an LLM. Based on our experience building a plugin system for sqlc (https://sqlc.dev), we thought a sandboxed WASM runtime would be a good fit. A bit of hacking later, we got everything wired up to solve his issue. Now the API is ready for other developers to try out.
The Riza Code Interpreter API is an HTTP interface to various dynamic language interpreters, each running inside a WASM sandbox without access to the outside world (for now). We modeled the API to align with a POSIX shell-style interface.
We made a playground so you can try it out without signing up: https://riza.io
The API documentation lives here: https://docs.riza.io
There are many limitations at the moment, but we expect to rapidly expand capabilities so that programs can e.g. access the network and filesystem. Our roadmap has more details: https://docs.riza.io/reference/roadmap
If you need to execute LLM-generated code we’d love to have you try the API and let us know if you run into any issues. You can email us directly at [email protected].
-
Give Up Sooner
"Is there a way to get sqlc to use pointers for nullable columns instead of the sql.Null types?"
-
Show HN: Sqlbind a Python library to compose raw SQL
I came across this yesterday for Go: https://sqlc.dev, which is somewhat like what you want, maybe.
Not sure it allows you to parameterize table names, but the basic idea is codegen from SQL queries, so you are working with Go code (autocompletion, etc.).
- API completa em Golang - Parte 7
-
ORMs are nice but they are the wrong abstraction
Agreed, but tools like https://sqlc.dev, which I mention in the article, are a good trade-off that lets you have verified, testable SQL in your code.
- API completa em Golang - Parte 6
-
Go ORMs Compared
sqlc is not strictly a conventional ORM. It offers a unique approach by generating Go code from SQL queries. This allows developers to write SQL, which sqlc then converts into type-safe Go code, reducing the boilerplate significantly. It ensures that your queries are syntactically correct and type-safe. sqlc is ideal for those who prefer writing SQL and are looking for an efficient way to integrate it into a Go application.
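To make the workflow concrete: sqlc reads SQL files annotated with a name and a row-count directive, and generates a typed Go method for each query. A minimal sketch of a PostgreSQL query file (table and query names here are illustrative):

```sql
-- name: GetAuthor :one
-- sqlc turns this into roughly:
--   func (q *Queries) GetAuthor(ctx context.Context, id int64) (Author, error)
SELECT id, name, bio FROM authors
WHERE id = $1;

-- name: ListAuthors :many
-- :many yields a slice: ([]Author, error)
SELECT id, name, bio FROM authors
ORDER BY name;
```

Because sqlc parses both the schema and the queries, a typo'd column or a type mismatch fails at generation time rather than at runtime.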
-
Type-safe Data Access in Go using Prisma and sqlc
I was browsing awesome-go for ideas on how to set up my data access layer when I stumbled on sqlc. It seemed like a great option. Code generation is a strategy often used in the Go ecosystem, and making my queries safe at compile time was an idea I really liked. Knex was great, but it required me to thoroughly test my queries at runtime and to sanitize my query results to ensure type safety within my application.
-
Level UP your RDBMS Productivity in GO
Now, we are going to generate the code. For this purpose, we are going to use sqlc.
-
What 3rd-party libraries do you use often/all the time?
https://github.com/sqlc-dev/sqlc — for use with //go:generate
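Pairing sqlc with `go generate` is just a directive comment in any Go file of the package (path-free sketch; `sqlc generate` reads the project's sqlc config file):

```go
//go:generate sqlc generate
package db

// Running `go generate ./...` regenerates the typed query code
// alongside this package.
```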
What are some alternatives?
simd-json - Rust port of simdjson
sqlx - general purpose extensions to golang's database/sql
pike - Generate CRUD gRPC backends from single YAML description.
GORM - The fantastic ORM library for Golang, aims to be developer friendly
sqlparser-rs - Extensible SQL Lexer and Parser for Rust
SQLBoiler - Generate a Go ORM tailored to your database schema.
grpc-gateway - gRPC to JSON proxy generator following the gRPC HTTP spec
ent - An entity framework for Go
pggen - A database first code generator focused on postgres
jet - Type safe SQL builder with code generation and automatic query result data mapping
sqlite
pgx - PostgreSQL driver and toolkit for Go