-
They already have! Connect (https://github.com/bufbuild/connect-web) is what you're looking for, as it's gRPC-Web compatible.
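For reference, a minimal client sketch along the lines of the connect-web README at the time; the Eliza demo service and the generated import path are assumptions, not something from this thread:
```ts
import { createConnectTransport, createPromiseClient } from "@bufbuild/connect-web";
// Generated from the demo's .proto by protoc-gen-connect-web; path is illustrative.
import { ElizaService } from "./gen/eliza_connectweb";

// The Connect transport talks to the server over plain fetch; the package
// also ships a gRPC-Web transport if that's what your backend speaks.
const transport = createConnectTransport({
  baseUrl: "https://demo.connect.build",
});

const client = createPromiseClient(ElizaService, transport);
const res = await client.say({ sentence: "hello" });
console.log(res.sentence);
```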
-
honeybuf
A sweet typescript serializer allowing you to integrate serialization into your classes, while having protobuf-like control.
The same reason, along with the fact that you had to generate code and then usually convert it to a class afterward, is why I wrote my own TypeScript-native binary serializer [0] (mostly based on the C FFI for compatibility) a few years ago.
[0]: https://github.com/i404788/honeybuf
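For flavor, here is a generic sketch of the class-integrated style being described; the names and layout are illustrative only and are not honeybuf's actual API:
```ts
// Hypothetical class-integrated serializer, loosely in the spirit described
// above. None of these names come from honeybuf itself.
class Vec2 {
  constructor(public x: number, public y: number) {}

  // Serialization lives on the class; the byte layout is explicit,
  // giving protobuf-like control over the wire format.
  serialize(): Uint8Array {
    const buf = new ArrayBuffer(8);
    const view = new DataView(buf);
    view.setFloat32(0, this.x, true); // little-endian, C-FFI friendly
    view.setFloat32(4, this.y, true);
    return new Uint8Array(buf);
  }

  static deserialize(bytes: Uint8Array): Vec2 {
    const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
    return new Vec2(view.getFloat32(0, true), view.getFloat32(4, true));
  }
}
```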
-
Shameless plug for my project Phero [0]. It's a bit like gRPC, but specifically for full-stack TypeScript projects.
It has a minimal API, literally one function, with which you can expose your server's functions. It will generate a typesafe SDK for your frontend(s), packed with all the models you're using. It will also generate a server which will automatically validate its input and output.
One thing I've seen no other similar solution do is the way we do error handling: throw an error on the server and catch it on the client as if it were a local error (sketched below).
As I said, it's only meant for teams who have a full-stack TypeScript setup. For teams with polyglot stacks, an intermediate like protobuf or GraphQL might make more sense; instead of such an intermediate, we generate a TS declaration file.
[0] https://github.com/phero-hq/phero
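To make that error-handling model concrete, here is the rough shape such a setup takes; all names are hypothetical and this is not Phero's actual API:
```ts
// --- server side: plain functions, exposed through the framework ---
export class ArticleNotFound extends Error {}

export async function getArticle(id: string): Promise<{ id: string; title: string }> {
  if (id !== "42") throw new ArticleNotFound(`no article with id ${id}`);
  return { id, title: "Hello, world" };
}

// --- client side: the generated SDK mirrors the server's signature, and a
// server-side throw is rethrown locally with its original type ---
async function showArticle(client: { getArticle: typeof getArticle }) {
  try {
    return await client.getArticle("41");
  } catch (e) {
    if (e instanceof ArticleNotFound) return null; // handled like a local error
    throw e;
  }
}
```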
-
My own search results:
[0] https://github.com/boguslaw-wojcik/encoding-benchmarks
-
We use it at https://woogles.io for pretty much all communication (server-to-server and client-to-server). I do loathe dealing with the JS aspect of it and am very excited to move over to protobuf-es after reading this article (and to shave off a ton of repeated and generated code).
-
protoc-gen-validate
Protocol Buffer Validation - Being replaced by github.com/bufbuild/protovalidate
My understanding is that the powers that be within Google have decided that validating messages is outside the scope of schemas and serialization. protoc-gen-validate provides a portable way to perform validation: https://github.com/bufbuild/protoc-gen-validate
The problem with required fields is that they kick the can down the road when you want to deprecate a field: once a field is required, old readers will reject any message that omits it, so you can never safely remove it. Keeping everything optional is much, much better for everyone in the long run.
-
-
At least in the frontend (without WASM), it depends.
I tested https://github.com/mapbox/pbf and, while it was faster for deep/complex structs vs an unoptimized/repetitive JSON blob, it was slower for shallow structs and flat arrays of stuff. If you spend a bit of time encoding stuff as flat arrays to avoid memory allocation, JSON parsing wins by a lot, since it goes through highly optimized C or assembly, while decoding protobuf in the JIT does not.
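A minimal sketch of the flat-array trick being described; the field names are made up for illustration:
```ts
// Array-of-objects layout: JSON.parse must allocate one object per point.
type Nested = { points: { x: number; y: number; t: number }[] };

// Flat "struct of arrays" layout: a few big arrays instead, so far fewer
// allocations, and the hot path stays inside the engine's native JSON parser.
type Flat = { x: number[]; y: number[]; t: number[] };

const flat: Flat = { x: [1, 3], y: [2, 4], t: [0, 16] };
const wire = JSON.stringify(flat);
const decoded: Flat = JSON.parse(wire);

// Reading point i becomes index math instead of pointer chasing:
const xi = decoded.x[1]; // 3
```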
-
Re: TypeScript doing "this number has to be less than 120"... As a fun aside, you _can_ in fact write this kind of type today (since 4.5, when they added tail-recursion elimination on conditional types). You can even do things like `Range<80, 120>` to constrain a value to a range, as sketched below. If something like "Negated types" ever happens (https://github.com/Microsoft/TypeScript/pull/29317), it'll open up even more options.
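A minimal sketch of such a type, using the well-known grow-a-tuple-and-read-its-length pattern; the names are illustrative:
```ts
// Build the union 0 | 1 | ... | N-1 by growing a tuple and reading its length.
// The recursive call is in tail position, so TS >= 4.5 can eliminate it.
type Enumerate<N extends number, Acc extends number[] = []> =
  Acc["length"] extends N ? Acc[number] : Enumerate<N, [...Acc, Acc["length"]]>;

// Range<80, 120> = 80 | 81 | ... | 119 (upper bound exclusive)
type Range<From extends number, To extends number> =
  Exclude<Enumerate<To>, Enumerate<From>>;

const ok: Range<80, 120> = 100;
// const tooOld: Range<80, 120> = 120; // type error
```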
Also, if you haven't checked it out, typescript-json-schema has some REALLY powerful validation it can do for things like your example (https://youtu.be/HHTDCY5uh_M?t=1379). You can do stuff like the example below.
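For instance, typescript-json-schema can pick validation constraints up from JSDoc annotations and emit them as JSON Schema keywords; a minimal sketch, with a made-up interface:
```ts
// typescript-json-schema turns these JSDoc annotations into JSON Schema
// constraints ("minimum", "maximum", "pattern", ...).
interface Person {
  /**
   * @minimum 0
   * @maximum 120
   */
  age: number;

  /**
   * @pattern ^[a-z0-9-]+$
   */
  slug: string;
}
```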
-
I prefer reading proto files with services over OpenAPI yaml. Here's the pet store example to compare.
- https://github.com/project-flogo/grpc/blob/master/proto/grpc...
- https://github.com/OAI/OpenAPI-Specification/blob/main/examp...
-
> At pbf speeds, decoding is usually no longer a bottleneck, but bandwidth might be when comparing with gzipped JSON.
We were streaming a few hundred datapoints in a dozen flat arrays over a websocket at 20-40 Hz and needed to decode the payload eagerly. Plain JSON was a multi-factor speedup over pbf for this case, but it's fully possible I was holding it wrong, too!
Even if your "bottleneck" is rendering/rasterization (10 ms), a data pipe that takes 5 ms instead of 1 ms still has a real effect on framerate, battery, thermals, etc.
I'm a big fan of your work! While I have you here, would you mind reviewing this sometime soon? ;)
https://github.com/mourner/flatbush/pull/44