| | common-workflow-language | cgpipe |
|---|---|---|
| Mentions | 6 | 1 |
| Stars | 1,440 | 3 |
| Growth | 0.3% | - |
| Activity | 1.1 | 5.2 |
| Latest commit | 5 months ago | 3 months ago |
| Language | Common Workflow Language | Java |
| License | Apache License 2.0 | BSD 3-clause "New" or "Revised" License |
- Stars: the number of stars a project has on GitHub.
- Growth: month-over-month growth in stars.
- Activity: a relative number indicating how actively a project is being developed; recent commits are weighted more heavily than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
common-workflow-language
- Nextflow: Data-Driven Computational Pipelines
https://www.commonwl.org/
https://github.com/common-workflow-language/common-workflow-...
- Common Workflow Language (CWL)
- Are there any good resources for building data pipelines?
- Common Workflow Language
- Kestra - Open-Source Airflow Alternative
And we can add it to the ever-growing list of existing workflow frameworks: https://github.com/common-workflow-language/common-workflow-language/wiki/Existing-Workflow-systems
cgpipe
- Nextflow: Data-Driven Computational Pipelines
I do too, and have similar opinions. I wrote my own tool for pipelines years back because it was always frustrating (I started roughly around the same time as Nextflow).
Allowing files to be marked as transient (temp) and re-running from arbitrary points are definitely among the things I support, as is conditional logic within the pipeline for job definition and resource usage. For me, though, one of the biggest things is that I like having composable pipelines, so each part of the larger workflow can be developed independently. The parts can interact with each other (as a DAG) and use existing dependencies, but they don't have to exist in the same document/script. I work on large WGS datasets, so thousands of jobs per patient isn't uncommon.
Happy to talk more if you're interested.
https://github.com/compgen-io/cgpipe
(And yes, you can dry run the entire thing. It will write out a bash script if you want to see exactly what is going to run without submitting jobs.)
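The comment above names several concrete ideas: sub-pipelines composed into one DAG, outputs marked transient, re-running from an arbitrary point, and a dry run that emits a bash script. A minimal sketch of how those pieces fit together, in plain Python (this is not cgpipe's actual syntax; the `Task` model and the commands are invented for illustration):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    name: str
    cmd: str
    outputs: List[str]
    deps: List[str] = field(default_factory=list)
    transient: bool = False  # temp outputs; a fuller engine would clean these up
                             # once every downstream consumer has finished

def topo_order(tasks):
    """Depth-first topological sort so dependencies run before dependents."""
    by_name = {t.name: t for t in tasks}
    seen, order = set(), []
    def visit(t):
        if t.name in seen:
            return
        seen.add(t.name)
        for dep in t.deps:
            visit(by_name[dep])
        order.append(t)
    for t in tasks:
        visit(t)
    return order

def dry_run(tasks, existing=frozenset()):
    """Emit the bash script that would run, skipping tasks whose outputs
    already exist -- which is what lets you resume from an arbitrary point."""
    lines = ["#!/bin/bash", "set -euo pipefail"]
    for t in topo_order(tasks):
        if t.outputs and all(o in existing for o in t.outputs):
            lines.append(f"# {t.name}: outputs up to date, skipped")
        else:
            lines.append(t.cmd)
    return "\n".join(lines)

# Two stages defined independently and joined into one DAG by name.
align = Task("align", "aligner ref.fa reads.fq > aln.sam", ["aln.sam"],
             transient=True)
sort_bam = Task("sort", "sorter aln.sam -o aln.bam", ["aln.bam"],
                deps=["align"])

# aln.sam already exists, so only the sort step appears in the script.
print(dry_run([align, sort_bam], existing={"aln.sam"}))
```

Because each `Task` only refers to its dependencies by name, the stages can live in separate files and be assembled into a larger workflow at the end, which is the composability point the comment is making.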
What are some alternatives?
kestra - Infinitely scalable, event-driven, language-agnostic orchestration and scheduling platform to manage millions of workflows declaratively in code.
nextflow - A DSL for data-driven computational pipelines
redun - Yet another redundant workflow engine
infinitic - Infinitic is a scalable workflow engine for distributed services. It shines particularly by making complex orchestration simple. It can be used to reliably orchestrate microservices, manage distributed transactions, operate data pipelines, build user-facing automation, etc.
toil - A scalable, efficient, cross-platform (Linux/macOS) and easy-to-use workflow engine in pure Python.
huey - a little task queue for python
awesome-workflow-engines - A curated list of awesome open source workflow engines
Kedro - Kedro is a toolbox for production-ready data science. It uses software engineering best practices to help you create data engineering and data science pipelines that are reproducible, maintainable, and modular.
gh-action-pypi-publish - The blessed :octocat: GitHub Action, for publishing your :package: distribution files to PyPI: https://github.com/marketplace/actions/pypi-publish