Top 7 Groovy Docker Projects
-
devops-resources
DevOps resources - Linux, Jenkins, AWS, SRE, Prometheus, Docker, Python, Ansible, Git, Kubernetes, Terraform, OpenStack, SQL, NoSQL, Azure, GCP
-
gradle-docker-compose-plugin
Simplifies usage of Docker Compose for integration testing in Gradle environment.
> It's been a while since you can rerun/resume Nextflow pipelines
Yes, you can resume, but you need your whole upstream DAG to be present. Snakemake can rerun a job when only that job's direct dependencies are present, which lets you manage disk usage neatly, or archive an intermediate state of a project and rerun things from there.
> and yes, you can have dry runs in Nextflow
You have stubs, which really isn't the same thing.
> I have no idea what you're referring to with the 'arbitrary limit of 1000 parallel jobs' though
I was referring to this issue: https://github.com/nextflow-io/nextflow/issues/1871. The discussion there doesn't do the issue full justice, though. Nextflow spawns each job in a separate thread, and when it tries to spawn 1000+ Condor jobs it dies with a cryptic error message. The workaround of -Dnxf.pool.type=sync and -Dnxf.pool.maxThreads=N breaks resume and forces the pipeline to rerun from scratch.
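For reference, JVM system properties like the ones mentioned above are passed to Nextflow through the NXF_OPTS environment variable; this is a sketch, and the maxThreads value is just an example, not a recommendation:

```shell
# Pass JVM system properties to Nextflow via NXF_OPTS
# (property names as discussed above; the thread cap is an example value)
NXF_OPTS='-Dnxf.pool.type=sync -Dnxf.pool.maxThreads=100' nextflow run main.nf
```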
> As for deleting temporary files, there are features that allow you to do a few things related to that, and other features being implemented.
There are some hacks for this, but nothing I would feel safe integrating into a production tool. They are implementing something, you're right, and it's been the case for several years now, so we'll see.
Snakemake has all that out of the box.
Something that greatly improved developer experience and also sped up our builds was starting the container dependencies via docker-compose and connecting to them for integration testing. This allows reuse of containers, and you can connect to a container during or after an integration test to debug, without constantly hunting for ports.
With Testcontainers, I've found that repeatedly running integration tests (or a single test) locally is extremely slow, because the containers are shut down when the Java process is killed. The docker-compose approach avoids this while staying consistent: for example, just mount the migrations folder into the startup volume of your DB container and you have a like-for-like schema of your prod DB ready for integration tests.
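The migrations-mount idea above can be sketched in a compose file like this; the image tag, paths, port, and credentials are all assumptions for illustration:

```yaml
# docker-compose.yml -- hypothetical sketch, not the author's actual setup
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: test
    ports:
      - "5432:5432"   # fixed host port, so tests and debuggers always reconnect to the same place
    volumes:
      # mount the migrations folder so the container starts with a prod-like schema
      - ./db/migrations:/docker-entrypoint-initdb.d:ro
```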
I've found https://github.com/avast/gradle-docker-compose-plugin/ very useful for this.
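A minimal Gradle (Groovy DSL) sketch of wiring that plugin into integration tests; the plugin version and compose file name are assumptions, check the plugin's README for current values:

```groovy
// build.gradle -- minimal sketch of the avast docker-compose plugin
plugins {
    id 'com.avast.gradle.docker-compose' version '0.17.6' // version is an assumption
}

dockerCompose {
    useComposeFiles = ['docker-compose.yml']
    // leave containers running after the build, so repeated local test runs reuse them
    stopContainers = false
}

// start the compose services before the test task, and expose host/port info to it
dockerCompose.isRequiredBy(tasks.named('test'))
```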
Index
What are some of the best open-source Docker projects in Groovy? This list will help you:
# | Project | Stars
---|---|---
1 | devops-resources | 8,192 |
2 | nextflow | 2,538 |
3 | awesome-kubernetes | 581 |
4 | gradle-docker-compose-plugin | 402 |
5 | jooq-plugin | 69 |
6 | gradle-use-python-plugin | 67 |
7 | ci-cd_project | 0 |