Ideas/Suggestions around setting up a data pipeline from scratch

This page summarizes the projects mentioned and recommended in the original post on /r/dataengineering

  • delta

    An open-source storage framework that enables building a Lakehouse architecture with compute engines including Spark, PrestoDB, Flink, Trino, and Hive, as well as APIs for multiple languages (by delta-io)

    As the data source, what I have is a gRPC stream that delivers protobuf-encoded data. This is a fixed part of the overall system; there is no other way to extract the data. We plan to ingest this data into Delta Lake, but there are a few problems to solve before we can do that.

  • tonic

    A native gRPC client & server implementation with async/await support.

    If I’m not misunderstanding, you could both decode the gRPC protobuf AND write to delta lake in Rust. Tonic, Delta-rs.

  • delta-rs

    A native Rust library for Delta Lake, with bindings into Python

    If I’m not misunderstanding, you could both decode the gRPC protobuf AND write to delta lake in Rust. Tonic, Delta-rs.
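
To make the Tonic + delta-rs suggestion above concrete, here is a minimal sketch of the ingest side: a tonic client consuming a server-streaming gRPC endpoint, with prost decoding each protobuf frame into a generated Rust struct. The `events` proto package, the `EventFeed`/`Subscribe` service definition, and the endpoint address are hypothetical stand-ins for whatever the fixed gRPC source actually exposes.

```rust
// Assumed Cargo.toml deps (versions illustrative): tonic, prost, tokio (with the "full"
// feature), plus tonic-build in build-dependencies and a build.rs that calls
// tonic_build::compile_protos("proto/events.proto").

// Hypothetical proto package `events` defining:
//   service EventFeed { rpc Subscribe(SubscribeRequest) returns (stream Event); }
pub mod events {
    tonic::include_proto!("events");
}

use events::{event_feed_client::EventFeedClient, SubscribeRequest};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Connect to the fixed gRPC source (the address is a placeholder).
    let mut client = EventFeedClient::connect("http://source.example:50051").await?;

    // Open the server-streaming RPC and take the inner message stream.
    let mut stream = client
        .subscribe(SubscribeRequest::default())
        .await?
        .into_inner();

    // tonic + prost decode each protobuf frame into the generated `Event` struct.
    while let Some(event) = stream.message().await? {
        // In the real pipeline this is where events would be collected into
        // micro-batches and handed to the Delta Lake writer (see the next sketch).
        println!("received: {event:?}");
    }

    Ok(())
}
```

In practice the stream handler would batch decoded events by count or time window before writing, since Delta Lake works better with fewer, larger files than with per-message commits.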

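On the other end, here is a companion sketch of the delta-rs side: packing one micro-batch of decoded events into an Arrow `RecordBatch` and writing it to a Delta table via `DeltaOps`. The two-column schema, the local table path, and the crate features are assumptions, and the delta-rs API shifts between releases, so treat this as a starting point rather than a recipe.

```rust
// Assumed Cargo.toml deps (versions illustrative): deltalake with the "datafusion"
// feature (needed for the write operation), tokio with the "full" feature.
// delta-rs re-exports arrow, so the Arrow types below come from `deltalake::arrow`.

use std::sync::Arc;

use deltalake::arrow::array::{Int64Array, StringArray};
use deltalake::arrow::datatypes::{DataType, Field, Schema};
use deltalake::arrow::record_batch::RecordBatch;
use deltalake::operations::DeltaOps;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical flat schema for the decoded protobuf events; nested protobuf
    // fields would need to be flattened or mapped to Arrow struct columns first.
    let schema = Arc::new(Schema::new(vec![
        Field::new("event_id", DataType::Int64, false),
        Field::new("payload", DataType::Utf8, true),
    ]));

    // One micro-batch of decoded events. In the real pipeline this would be built
    // from the structs yielded by the tonic stream in the previous sketch.
    let batch = RecordBatch::try_new(
        schema,
        vec![
            Arc::new(Int64Array::from(vec![1, 2, 3])),
            Arc::new(StringArray::from(vec![Some("a"), Some("b"), None])),
        ],
    )?;

    // Write the batch to a Delta table; the local path stands in for an object
    // store URI (s3://..., abfss://..., and so on). Recent delta-rs releases are
    // expected to create the table on first write if it does not already exist.
    let table = DeltaOps::try_from_uri("./delta/events")
        .await?
        .write(vec![batch])
        .await?;

    println!("Delta table is now at version {}", table.version());
    Ok(())
}
```

The write builder also accepts options such as save mode and partition columns, and the loaded table's schema can be inspected before writing to keep the Arrow schema and the Delta table in sync as the protobuf definition evolves.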