bytewax vs timely-dataflow
| | bytewax | timely-dataflow |
|---|---|---|
| Mentions | 18 | 11 |
| Stars | 1,144 | 3,145 |
| Growth | 8.2% | 1.1% |
| Activity | 9.8 | 7.2 |
| Latest commit | 2 days ago | 21 days ago |
| Language | Python | Rust |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
bytewax
- Building a streaming SQL engine with Arrow and DataFusion
- Near Real Time Ingestion to DB using Python
You can probably use Python to solve your problem; there are many ways to speed up your deserialization/flattening. I work on Bytewax (https://github.com/bytewax/bytewax) and I wouldn't mention it if it wasn't a good fit, but I think it's worth looking at here. It is a stream processor that makes it easy to scale, maintain order, and track progress, and you just write native Python.
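As a sketch of the kind of plain-Python speedup the comment alludes to, nested records can be flattened in a single recursive pass instead of repeated dict copies (the record shape and key names here are illustrative, not from the original thread):

```python
def flatten(record, prefix="", sep="."):
    """Recursively flatten a nested dict into dotted keys in one pass."""
    out = {}
    for key, value in record.items():
        full_key = f"{prefix}{sep}{key}" if prefix else key
        if isinstance(value, dict):
            out.update(flatten(value, full_key, sep))
        else:
            out[full_key] = value
    return out

event = {"device": {"id": 7, "loc": {"lat": 59.3, "lon": 18.1}}, "temp": 21.5}
flat = flatten(event)
# flat == {"device.id": 7, "device.loc.lat": 59.3, "device.loc.lon": 18.1, "temp": 21.5}
```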
- Stream processing framework for a new project in Python
Disclaimer: I work on Bytewax, but it feels like this could be a good fit and would save you some time looking around. If you need to do stateful operations (reduce, window, etc.) then you can use bytewax - https://github.com/bytewax/bytewax with pub/sub, but you would need to build a custom connector. There are some guides on how to do that - https://www.bytewax.io/blog/custom-input-connector.
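The stateful reduce the comment mentions (accumulating per-key state as events arrive) can be sketched framework-agnostically in plain Python; the function and event names are illustrative:

```python
from collections import defaultdict

def stateful_reduce(stream, reducer, initial):
    """Fold each (key, value) event into per-key state, yielding every update."""
    state = defaultdict(lambda: initial)
    for key, value in stream:
        state[key] = reducer(state[key], value)
        yield key, state[key]

events = [("sensor-a", 2), ("sensor-b", 5), ("sensor-a", 3)]
updates = list(stateful_reduce(events, lambda acc, v: acc + v, 0))
# updates == [("sensor-a", 2), ("sensor-b", 5), ("sensor-a", 5)]
```

A framework like Bytewax provides the same pattern with recovery and parallelism handled for you.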
- What are your favorite tools or components in the Kafka ecosystem?
- A Python package for streaming synthetic data
This is great, definitely see the utility here. I have had to hack this together so many times while building streaming workflows with github.com/bytewax/bytewax and other tools.
- Snowflake - what are the streaming capabilities it provides?
When low latency matters you should always consider an ETL approach rather than ELT, e.g. collect data in Kafka and process using Kafka Streams/Flink in Java or Quix Streams/Bytewax in Python, then sink it to Snowflake where you can handle non-critical workloads (as is the case for 99% of BI/analytics). This way you can choose the right path for your data depending on how quickly it needs to be served.
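The ETL path described (transform in flight, then sink) can be sketched generically; `source` and `sink` here are stand-ins for a Kafka consumer and a Snowflake writer, both hypothetical:

```python
def run_pipeline(source, transform, sink, batch_size=100):
    """Consume events, transform each one, and flush to the sink in batches."""
    batch = []
    for event in source:
        batch.append(transform(event))
        if len(batch) >= batch_size:
            sink(batch)
            batch = []
    if batch:
        sink(batch)  # flush the final partial batch

sunk = []
run_pipeline(range(5), lambda x: x * 10, sunk.extend, batch_size=2)
# sunk == [0, 10, 20, 30, 40]
```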
- Sunday Daily Thread: What's everyone working on this week?
Working on how to use https://github.com/bytewax/bytewax to create embeddings in real-time for ML use cases. I want to make a small library for embedding pipelines, but still learning about vector dbs and the tradeoffs between the different solutions.
- Arroyo: A distributed stream processing engine written in Rust
Project looks cool! Glad you open sourced it. It could use some comments in the code base to help contributors ;). I also like the DataFusion usage; that is awesome. BTW, I work on github.com/bytewax/bytewax, which is based on https://github.com/TimelyDataflow/timely-dataflow, another Rust dataflow computation engine.
- Launch HN: BuildFlow (YC W23) – The FastAPI of data pipelines
Cool, nice idea. Can you sub in a different backend like Bytewax (https://github.com/bytewax/bytewax) for stateful processing?
- Kafka Stream Processing in Java or Scala
If you want to stay in your Python/SQL area of expertise (by no means do I want to discourage learning a new language; this is just an FYI): there are some non-Java/Scala tools, ranging from streaming databases like RisingWave and Materialize, to streaming platforms like Fluvio and Redpanda, to stream processors like Bytewax and Faust.
timely-dataflow
- Readyset: A MySQL and Postgres wire-compatible caching layer
They have a bit about their technical foundation here[0].
Given that Readyset was co-founded by Jon Gjengset, who authored the paper on Noria[1] (though he has apparently since departed the company), I would assume that Readyset is the continuation of that research.
So it shares some roots with Materialize. They have a common conceptual ancestry in Naiad[2], whereas Materialize evolved out of timely-dataflow[3].
[0]: https://docs.readyset.io/concepts/streaming-dataflow
[1]: https://jon.thesquareplanet.com/papers/osdi18-noria.pdf
[2]: https://dl.acm.org/doi/10.1145/2517349.2522738
[3]: https://github.com/TimelyDataflow/timely-dataflow
- Mandala: experiment data management as a built-in (Python) language feature
And systems like Timely Dataflow: https://github.com/TimelyDataflow/timely-dataflow
- Arroyo: A distributed stream processing engine written in Rust
Project looks cool! Glad you open sourced it. It could use some comments in the code base to help contributors ;). I also like the DataFusion usage; that is awesome. BTW, I work on github.com/bytewax/bytewax, which is based on https://github.com/TimelyDataflow/timely-dataflow, another Rust dataflow computation engine.
- Rust MPI -- Will there ever be a fully oxidized implementation?
Just found this https://github.com/TimelyDataflow/timely-dataflow and my heart skipped a beat.
- Streaming processing in Python using Timely Dataflow with Bytewax
Bytewax is a Python native binding to the Timely Dataflow library (written in Rust) for building highly scalable streaming (and batch) processing pipelines.
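The shape of such a pipeline (a chain of flat-map and stateful-reduce steps over a stream of inputs) can be illustrated without Bytewax itself; this is plain Python mimicking the dataflow style, not the Bytewax API:

```python
def word_count(lines):
    """A toy streaming word count in the input -> flat_map -> reduce-per-key style."""
    counts = {}
    for line in lines:                              # input: one record at a time
        for word in line.lower().split():           # flat_map: line -> words
            counts[word] = counts.get(word, 0) + 1  # stateful reduce, keyed by word
    return counts

word_count(["to be", "or not to be"])
# {"to": 2, "be": 2, "or": 1, "not": 1}
```

In a real dataflow engine each step can run in parallel across workers; the per-key state is what the engine partitions and recovers.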
- Alternative Kafka Integration Framework to Kafka Connect?
I am working on Bytewax, which is a Python stream processing framework built on Timely Dataflow. It is not exactly a Kafka integration framework, because it is more of a general stream processing framework, but it might be interesting for you. We are focused on enabling people to more easily debug, containerize, parallelize, and customize their pipelines, and less on providing a declarative integration framework. It is still early days for us, and we are looking for feedback and ideas from the community.
- [AskJS] JavaScript for data processing
We used to use a library called Pond.js, https://github.com/esnet/pond, but the reliance on Immutable.JS caused some performance pitfalls, so we wrote a system from scratch that deals with data in a batched streaming fashion. A lot of the concepts were borrowed from a Rust library called timely-dataflow, https://github.com/TimelyDataflow/timely-dataflow.
- Dataflow: An Efficient Data Processing Library for Machine Learning
Though the name "Dataflow" might be an unfortunate name conflict with another Rust project: https://github.com/TimelyDataflow/timely-dataflow
- Ask HN: Is there a way to subscribe to an SQL query for changes?
> In the simplest case, I'm talking about regular SQL non-materialized views which are essentially inlined.
I see that now -- makes sense!
> Wish we had some better database primitives to assemble rather than building everything on Postgres - its not ideal for a lot of things.
I'm curious to hear more about this! We agree that better primitives are required, and that's why Materialize is written in Rust using TimelyDataflow[1] and DifferentialDataflow[2] (both developed by Materialize co-founder Frank McSherry). The only relationship between Materialize and Postgres is that we are wire-compatible with Postgres; we don't share any code with Postgres, nor do we have a dependence on it.
[1] https://github.com/TimelyDataflow/timely-dataflow
[2] https://github.com/TimelyDataflow/differential-dataflow
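The differential-dataflow idea referenced here — updating a query result from row deltas rather than recomputing it — can be sketched for a simple grouped count; this is an illustrative toy, not Materialize's implementation:

```python
class IncrementalCount:
    """Maintain COUNT(*) GROUP BY key from a stream of (key, +1/-1) deltas."""

    def __init__(self):
        self.counts = {}

    def apply(self, key, delta):
        new = self.counts.get(key, 0) + delta
        if new:
            self.counts[key] = new
        else:
            self.counts.pop(key, None)  # a retraction that empties a group drops it
        return dict(self.counts)

view = IncrementalCount()
view.apply("eu", +1)
view.apply("us", +1)
view.apply("eu", -1)  # a row for "eu" is deleted
# view.counts == {"us": 1}
```

Each update is O(1) in the number of affected groups, which is the core win over re-running the query.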
- 7 Real-Time Data Streaming Tools You Should Consider On Your Next Project
Under the hood, Materialize uses Timely Dataflow (TDF) as its stream-processing engine, which lets it take advantage of a distributed, data-parallel compute engine. The great thing about TDF is that it has been in open-source development since 2014 and has been battle-tested in production at Fortune 1000-scale companies.
What are some alternatives?
arroyo - Distributed stream processing engine in Rust
noria - Fast web applications through dynamic, partially-stateful dataflow
2022-bytewax-redpanda-air-quality-monitoring
differential-datalog - DDlog is a programming language for incremental computation. It is well suited for writing programs that continuously update their output in response to input changes. A DDlog programmer does not write incremental algorithms; instead they specify the desired input-output mapping in a declarative manner.
django-unicorn - The magical reactive component framework for Django ✨
materialize - The data warehouse for operational workloads.
Django - The Web framework for perfectionists with deadlines.
realtime - Broadcast, Presence, and Postgres Changes via WebSockets
Pyramid - Pyramid - A Python web framework
differential-dataflow - An implementation of differential dataflow using timely dataflow on Rust.
Flask - The Python micro framework for building web applications.
flow - 🌊 Continuously synchronize the systems where your data lives, to the systems where you _want_ it to live, with Estuary Flow. 🌊