dumbo VS Apache Spark

Compare dumbo vs Apache Spark and see how they differ.

dumbo

Python module that allows one to easily write and run Hadoop programs. (by klbostee)
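As a rough sketch of what a dumbo job looks like, here is the classic word-count pattern along the lines of the project's own examples (a minimal sketch, assuming dumbo is installed and configured against a Hadoop backend; input and output locations are supplied at launch time, not in the script):

```python
# Minimal dumbo-style word count sketch (assumes dumbo is installed and a
# Hadoop cluster or local backend is available).
def mapper(key, value):
    # 'value' is one line of input text; emit a (word, 1) pair per word
    for word in value.split():
        yield word, 1

def reducer(key, values):
    # sum the counts emitted for each word
    yield key, sum(values)

if __name__ == "__main__":
    import dumbo
    dumbo.run(mapper, reducer, combiner=reducer)
```

Such a script is normally launched through the dumbo command-line tool (e.g. dumbo start wordcount.py with -input, -output and -hadoop flags), with the exact flags depending on the Hadoop setup.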

Apache Spark

Apache Spark - A unified analytics engine for large-scale data processing (by apache)
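For comparison, a minimal PySpark sketch of the same word count (the application name, input path, and output path here are placeholders, and a working Spark installation is assumed):

```python
# Minimal PySpark word-count sketch (paths and app name are placeholders).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount").getOrCreate()

lines = spark.sparkContext.textFile("input.txt")      # RDD of text lines
counts = (lines.flatMap(lambda line: line.split())    # split lines into words
               .map(lambda word: (word, 1))           # pair each word with 1
               .reduceByKey(lambda a, b: a + b))      # sum counts per word
counts.saveAsTextFile("counts_out")

spark.stop()
```

For local experimentation the same script can be run with spark-submit against a local[*] master; on a cluster only the master configuration changes.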
                 dumbo                Apache Spark
Mentions         -                    101
Stars            1,034                38,320
Growth           -                    1.1%
Activity         0.0                  10.0
Latest commit    over 6 years ago     3 days ago
Language         Python               Scala
License          Apache License 2.0   Apache License 2.0
Mentions - the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

dumbo

Posts with mentions or reviews of dumbo. We have used some of these posts to build our list of alternatives and similar projects.

We haven't tracked posts mentioning dumbo yet.
Tracking mentions began in Dec 2020.

Apache Spark

Posts with mentions or reviews of Apache Spark. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-03-11.

What are some alternatives?

When comparing dumbo and Apache Spark, you can also consider the following projects:

mrjob - Run MapReduce jobs on Hadoop or Amazon Web Services

Trino - Official repository of Trino, the distributed SQL query engine for big data, formerly known as PrestoSQL (https://trino.io)

dpark - Python clone of Spark, a MapReduce-like framework in Python

Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration

streamparse - Run Python in Apache Storm topologies. Pythonic API, CLI tooling, and a topology DSL.

Airflow - Apache Airflow - A platform to programmatically author, schedule, and monitor workflows

luigi - Luigi is a Python module that helps you build complex pipelines of batch jobs. It handles dependency resolution, workflow management, visualization etc. It also comes with Hadoop support built in.

Scalding - A Scala API for Cascading

data-science-ipython-notebooks - Data science Python notebooks: Deep learning (TensorFlow, Theano, Caffe, Keras), scikit-learn, Kaggle, big data (Spark, Hadoop MapReduce, HDFS), matplotlib, pandas, NumPy, SciPy, Python essentials, AWS, and various command lines.

Apache Arrow - Apache Arrow is a multi-language toolbox for accelerated data interchange and in-memory processing