Airflow vs streamify

Compare Airflow vs streamify and see what their differences are.

streamify

A data engineering project with Kafka, Spark Streaming, dbt, Docker, Airflow, Terraform, GCP and much more! (by ankurchavda)
                Airflow              streamify
Mentions        169                  4
Stars           34,397               474
Stars growth    2.1%                 -
Activity        10.0                 0.0
Latest commit   7 days ago           about 2 years ago
Language        Python               Python
License         Apache License 2.0   -
The number of mentions indicates the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
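The site does not publish its activity formula, but the description above (recent commits weighted more heavily than older ones) suggests a recency-weighted sum. A minimal sketch of one such scheme, using an exponential decay with a hypothetical 90-day half-life (both the function and the parameter are assumptions for illustration, not the site's actual metric):

```python
import math
from datetime import date, timedelta

def activity_score(commit_dates, today, decay_days=90):
    """Recency-weighted commit activity: each commit contributes
    exp(-age / decay_days), so a commit from today counts ~1.0
    while one from two years ago counts almost nothing.
    Illustrative only - not the site's published formula."""
    return sum(
        math.exp(-max((today - d).days, 0) / decay_days)
        for d in commit_dates
    )

today = date(2024, 3, 1)
# A project with daily commits over the last month...
recent = [today - timedelta(days=i) for i in range(30)]
# ...versus one last touched about two years ago (like streamify).
stale = [today - timedelta(days=730)]
print(activity_score(recent, today) > activity_score(stale, today))  # True
```

Under any scheme of this shape, a repository with a commit 7 days ago (Airflow) scores far above one untouched for two years (streamify), matching the 10.0 vs 0.0 figures in the table.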

Airflow

Posts with mentions or reviews of Airflow. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-07.

streamify

Posts with mentions or reviews of streamify. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-03-21.

What are some alternatives?

When comparing Airflow and streamify you can also consider the following projects:

Kedro - Kedro is a toolbox for production-ready data science. It uses software engineering best practices to help you create data engineering and data science pipelines that are reproducible, maintainable, and modular.

eventsim - Event data simulator. Generates a stream of pseudo-random events from a set of users, designed to simulate web traffic.

dagster - An orchestration platform for the development, production, and observation of data assets.

terraform - Terraform enables you to safely and predictably create, change, and improve infrastructure. It is a source-available tool that codifies APIs into declarative configuration files that can be shared amongst team members, treated as code, edited, reviewed, and versioned.

n8n - Free and source-available fair-code licensed workflow automation tool. Easily automate tasks across different services.

luigi - Luigi is a Python module that helps you build complex pipelines of batch jobs. It handles dependency resolution, workflow management, visualization, etc. It also comes with Hadoop support built in.

finnhub-streaming-data-pipeline - Stream processing pipeline from Finnhub websocket using Spark, Kafka, Kubernetes and more

Apache Spark - A unified analytics engine for large-scale data processing.

tfl-bikes-data-pipeline - Processing TfL data on bike usage with Google Cloud Platform.

Dask - Parallel computing with task scheduling

spark-bigquery-connector - BigQuery data source for Apache Spark: Read data from BigQuery into DataFrames, write DataFrames into BigQuery tables.