apache-spark-docker vs uber-expenses-tracking

| | apache-spark-docker | uber-expenses-tracking |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 40 | 94 |
| Growth | - | - |
| Activity | 0.0 | 2.6 |
| Last Commit | almost 2 years ago | almost 2 years ago |
| Language | VBA | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
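The exact activity formula is not published; the description above only says that recent commits are weighted more heavily than older ones. A minimal sketch of one plausible scheme, assuming exponential decay over commit age (the half-life and the `activity_score` helper are illustrative, not the site's actual method):

```python
from datetime import date, timedelta

def activity_score(commit_dates, today, half_life_days=30):
    """Toy recency-weighted activity score: each commit contributes
    0.5 ** (age_in_days / half_life_days), so recent commits count
    far more than old ones. The real formula is not published."""
    score = 0.0
    for d in commit_dates:
        age = (today - d).days
        score += 0.5 ** (age / half_life_days)
    return round(score, 1)

today = date(2024, 1, 1)
recent = [today - timedelta(days=n) for n in (1, 2, 3)]
old = [today - timedelta(days=n) for n in (300, 310, 320)]
# Three commits made this week score much higher than three
# commits made roughly ten months ago.
print(activity_score(recent, today), activity_score(old, today))
```

Under this toy scoring, a repository whose commits all happened long ago decays toward 0.0, which matches the shape of the Activity values in the comparison table above.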
apache-spark-docker
Data Engineering Projects for Beginners
Learn how to dockerize an Apache Spark Standalone Cluster
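The linked repository's exact setup is not reproduced here; a minimal Docker Compose sketch of a Spark standalone cluster (one master, one worker), assuming the `bitnami/spark` image and its `SPARK_MODE`/`SPARK_MASTER_URL` environment variables, could look like:

```yaml
# docker-compose.yml — illustrative only; the image name, tag, and
# environment variables assume bitnami/spark, not the repo's own files.
version: "3"
services:
  spark-master:
    image: bitnami/spark:3.5
    environment:
      - SPARK_MODE=master
    ports:
      - "8080:8080"   # master web UI
      - "7077:7077"   # cluster port workers connect to
  spark-worker:
    image: bitnami/spark:3.5
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://spark-master:7077
    depends_on:
      - spark-master
```

Running `docker compose up` with a file like this starts the master, then a worker that registers against `spark://spark-master:7077`; more workers can be added by scaling the `spark-worker` service.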
uber-expenses-tracking
Data Engineering Projects for Beginners
Tracking your Uber Rides and Uber Eats expenses through a data engineering process
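The repository's actual pipeline is not shown here; as a minimal standard-library sketch of the final aggregation step such a process might end with, summing spend per service from a CSV export (the field names `service`, `date`, and `amount` are made up for illustration, not the repo's schema):

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical expense export — the column names are assumptions.
raw = """service,date,amount
Uber Rides,2023-01-03,12.50
Uber Eats,2023-01-05,23.10
Uber Rides,2023-01-09,8.40
"""

def totals_by_service(fh):
    """Sum spend per service from a CSV file-like object."""
    out = defaultdict(float)
    for row in csv.DictReader(fh):
        out[row["service"]] = round(out[row["service"]] + float(row["amount"]), 2)
    return dict(out)

print(totals_by_service(StringIO(raw)))
# → {'Uber Rides': 20.9, 'Uber Eats': 23.1}
```

In a real pipeline the same aggregation would typically run against ingested, cleaned data rather than an inline string, but the grouping step is the same idea.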
What are some alternatives?
docker-livy - Dockerizing and Consuming an Apache Livy environment
docker-hadoop - Apache Hadoop docker image
airflow-docker - This is my Apache Airflow Local development setup on Windows 10 WSL2/Mac using docker-compose. It will also include some sample DAGs and workflows.
Dropout-Students-Prediction - The goal of this project is to identify students at risk of dropping out of school
text-analysis-speeches-amlo - Text analysis of the speeches, conferences and interviews of the current president of Mexico
recommendation-system - Build a Content-Based Movie Recommender System (TF-IDF, BM25, BERT)
AWS Data Wrangler - pandas on AWS - Easy integration with Athena, Glue, Redshift, Timestream, Neptune, OpenSearch, QuickSight, Chime, CloudWatchLogs, DynamoDB, EMR, SecretManager, PostgreSQL, MySQL, SQLServer and S3 (Parquet, CSV, JSON and EXCEL).
dados-censup - Automated ingestion of the data published by INEP on the Brazilian higher-education census.
pyspark-on-aws-emr - The goal of this project is to offer a ready-to-use AWS EMR template built on Spot Fleet and On-Demand Instances, so you can just focus on writing PySpark code.