Apache Spark VS Airflow

Compare Apache Spark and Airflow and see how they differ.

Apache Spark

Apache Spark - A unified analytics engine for large-scale data processing (by apache)

Airflow

Apache Airflow - A platform to programmatically author, schedule, and monitor workflows (by apache)
                Apache Spark        Airflow
Mentions        53                  107
Stars           33,221              26,419
Stars growth    1.5%                3.1%
Activity        10.0                10.0
Latest commit   3 days ago          5 days ago
Language        Scala               Python
License         Apache License 2.0  Apache License 2.0
Mentions - the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

Apache Spark

Posts with mentions or reviews of Apache Spark. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-06-21.

Airflow

Posts with mentions or reviews of Airflow. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-06-19.

What are some alternatives?

When comparing Apache Spark and Airflow you can also consider the following projects:

Trino - Official repository of Trino, the distributed SQL query engine for big data, formerly known as PrestoSQL (https://trino.io)

dagster - An orchestration platform for the development, production, and observation of data assets.

Kedro - A Python framework for creating reproducible, maintainable and modular data science code.

luigi - Luigi is a Python module that helps you build complex pipelines of batch jobs. It handles dependency resolution, workflow management, visualization etc. It also comes with Hadoop support built in.

Dask - Parallel computing with task scheduling

n8n - Free and open, fair-code-licensed, node-based workflow automation tool. Easily automate tasks across different services.

argo - Workflow engine for Kubernetes

Pandas - Flexible and powerful data analysis / manipulation library for Python, providing labeled data structures similar to R data.frame objects, statistical functions, and much more

Apache Camel - Apache Camel is an open source integration framework that empowers you to quickly and easily integrate various systems consuming or producing data.

Scalding - A Scala API for Cascading

PyTorch - Tensors and dynamic neural networks in Python with strong GPU acceleration

mrjob - Run MapReduce jobs on Hadoop or Amazon Web Services