| | patterns-devkit | SmartPipeline |
|---|---|---|
| Stars | 5 | 1 |
| | 106 | 22 |
| Growth | 0.0% | - |
| Activity | 2.9 | 7.5 |
| Last commit | about 1 year ago | 2 months ago |
| Language | Python | Python |
| License | BSD 3-clause "New" or "Revised" License | GNU Lesser General Public License v3.0 only |
- Stars: the number of stars a project has on GitHub.
- Growth: month-over-month growth in stars.
- Activity: a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects being tracked.
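The exact formula behind the activity score isn't published here, but a recency-weighted commit count captures the idea that "recent commits have higher weight than older ones". The sketch below is a hypothetical illustration, not the tracker's actual metric: each commit contributes exponentially less as it ages, with the `half_life_days` parameter being an assumed tuning knob.

```python
from datetime import datetime, timedelta

def activity_score(commit_dates, now=None, half_life_days=30.0):
    """Toy recency-weighted commit score (illustrative only).

    Each commit contributes a weight that halves every
    `half_life_days` days, so a burst of recent commits scores
    far higher than the same number of commits made months ago.
    """
    now = now or datetime.utcnow()
    score = 0.0
    for d in commit_dates:
        age_days = (now - d).total_seconds() / 86400.0
        score += 0.5 ** (age_days / half_life_days)
    return score

# Ten commits in the last ten days vs. ten commits ~10 months ago.
now = datetime(2024, 1, 1)
recent = [now - timedelta(days=i) for i in range(10)]
stale = [now - timedelta(days=300 + i) for i in range(10)]
print(activity_score(recent, now=now) > activity_score(stale, now=now))
```

Under this scheme the project with recent commits scores higher even if both have the same total commit count, which matches how SmartPipeline (last commit 2 months ago) can show a higher activity than patterns-devkit (last commit about a year ago) despite having fewer stars.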
What are some alternatives?

- pyspark-example-project - Implementing best practices for PySpark ETL jobs and applications.
- Mage - 🧙 The modern replacement for Airflow: an open-source data pipeline tool for transforming and integrating data. https://github.com/mage-ai/mage-ai
- Dataplane - A data platform that makes it easy to construct a data mesh with automated data pipelines and workflows.
- seq-pipeline - Workspace for data science projects and NGS pipelines. Contains RStudio, Jupyter Notebook, VSCode, and a file manager. Can connect to a Tailscale network to bypass firewalls.
- pipebird - Open-source infrastructure for securely sharing data with customers.
- dagster - An orchestration platform for the development, production, and observation of data assets.
- hamilton - Helps data scientists and engineers define testable, modular, self-documenting dataflows that encode lineage and metadata. Runs and scales everywhere Python does.
- versatile-data-kit - One framework to develop, deploy, and operate data workflows with Python and SQL.
- AWS Data Wrangler - pandas on AWS: easy integration with Athena, Glue, Redshift, Timestream, Neptune, OpenSearch, QuickSight, Chime, CloudWatchLogs, DynamoDB, EMR, SecretManager, PostgreSQL, MySQL, SQLServer, and S3 (Parquet, CSV, JSON, and Excel).
- flowrunner - A lightweight package to organize and represent data engineering/science workflows.
- prism - The easiest way to develop, orchestrate, and execute data pipelines in Python.