Dagger: a new way to build CI/CD pipelines

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • dagger

    Application Delivery as Code that Runs Anywhere (by dagger)

  • > Seems to assume that all CI/CD workflows work in a single container at a time pattern.

    Dagger runs your workflows as a DAG, where each node is an action running in its own container. The dependency graph is detected automatically, and all containers that can be parallelized (based on their dependencies) will be parallelized. If you specify 10 actions to run, and they don't depend on each other, they will all run in parallel (see the rough sketch at the end of this comment).

    > How about testing when I need to spin up an associated database container for my e2e tests. Is it possible, and just omitted from the documentation?

    It is possible, but not yet convenient (you need to connect to an external docker engine, via a docker CLI wrapped in a container). We are working on a more pleasant API that will support long-running containers (like your test DB) and more advanced synchronization primitives (wait for an action, terminate it, etc.).

    This is discussed in the following issues:

    - https://github.com/dagger/dagger/issues/1337

    - https://github.com/dagger/dagger/issues/1249

    - https://github.com/dagger/dagger/issues/1248
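
    To make the parallelism concrete, here is a rough sketch of two independent actions in the 0.2 ("Europa") CUE API. The universe.dagger.io field names below are from memory of that release, and the image/action names are invented, so treat this as approximate rather than authoritative:

        package main

        import (
            "dagger.io/dagger"
            "universe.dagger.io/docker"
        )

        dagger.#Plan & {
            actions: {
                // Shared base image; both actions below depend only on this step.
                _base: docker.#Pull & {
                    source: "alpine:3.15"
                }

                // "lint" and "test" never reference each other's outputs,
                // so the engine is free to run them in parallel.
                lint: docker.#Run & {
                    input: _base.output
                    command: {name: "echo", args: ["linting..."]}
                }
                test: docker.#Run & {
                    input: _base.output
                    command: {name: "echo", args: ["testing..."]}
                }
            }
        }

    With the 0.2 CLI, each action would be invoked via `dagger do <action>` (or via a parent action that references both).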

  • Dagger2

    A fast dependency injector for Android and Java.

  • I support the effort to build a platform-agnostic CI/CD pipeline solution, but I don't want it in the form of yet another platform. Rather it needs to be a protocol that any platform can tie in to. I'm especially wary since this is another VC-backed effort that will eventually need to be monetized in some shape or form.

    Additionally, as someone else here has already mentioned, my mind first went to Dagger, the dependency injection tool (https://dagger.dev). That tool in particular was named as a play on DAG (directed acyclic graphs), whereas in this case I don't think it would apply since there may be instances where you'd want cycles in a pipeline.

    On a whim, I clicked on "Trademark Guidelines" (https://dagger.io/trademark) and from that page alone I would recommend avoiding this based on the aggressive language used to try and claim ownership of generic words. According to their own language, it seems I'm violating their guidelines by writing this comment.

    > Our Marks consist of the following registered, unregistered and/or pending trademarks, service marks and logos which are subject to change without notice: Dagger; Blocklayer; and other designs, logos or marks which may be referred to in your specific license agreement or otherwise.

    > Blocklayer does not permit using any of our Marks ... to identify non-Blocklayer products, services or technology

    Which would include Dagger, the dependency injection tool.

    Other sections of note:

    > Do Not Use As Nouns

    (This one just reads amusingly to me, for some reason.)

    > Do Not Create Composite Marks

    This section seems to suggest that you can't use "dagger" in any shape or form, even as a smaller part of some other word or body of text.

    > Websites And Domain Name Uses

  • buildkit

    concurrent, cache-efficient, and Dockerfile-agnostic builder toolkit

  • * Parallelism is the simultaneous execution of multiple things (possibly related, possibly not)

    * Dagger is designed to be highly concurrent with minimal development effort. Compared to an equivalent configuration in a traditional CI system, your Dagger pipelines will be more concurrent and require fewer lines of code.

    * Because it is highly concurrent, Dagger can be parallelized with relatively little effort. But as you pointed out, you still need to configure parallelization by setting up multiple nodes, etc. Dagger uses buildkit as an execution engine, so parallelizing Dagger boils down to parallelizing buildkit. There is a lot of work in this area (one benefit of building on an existing, mature ecosystem). For example, here is a Kubernetes deployment: https://github.com/moby/buildkit/tree/master/examples/kubern...

  • dagger-for-github

    Discontinued GitHub Action for Dagger

    Fun fact: Crazy Max is the author of the GitHub Action for Dagger :) https://github.com/dagger/dagger-for-github

  • Scoop

    A command-line installer for Windows.

    > C:\dagger.exe

    I'm glad you have a non-admin fallback, but also: yuck. I don't want this polluting my home folder (more importantly: I don't want hundreds of other things like this also polluting my home folder).

    The "Windows way" is to install system-wide to %ProgramFiles%\dagger\ (eg c:\Program files\dagger\dagger.exe), or to install to %LocalAppData%\dagger\ (eg: c:\Users\shykes\AppData\Local\dagger\dagger.exe). The latter is kind of the closest equivalent to $HOME/.dagger on linux. Add whatever folder to the user's PATH environment variable to make it easy to run.

    Honestly, providing just the .zip is better: then Windows users can muck up their own system however they like. Alternatively, package it with something like Scoop [2], a fairly popular dev tool that provides an easy way to get a sane install with updates, versioning and PATH handling all taken care of.

    [1] https://docs.dagger.io/

    [2] https://scoop.sh/

  • cuetorials.com

    Learn you some CUE for a great good!

    The language is CUE, which I think will see mass adoption in config / DevOps in the coming years. So regardless of what you think of the language today, it is likely to become important and part of your life in the not too distant future.

    https://cuelang.org | https://cuetorials.com

    Dagger builds on top of CUE and the (DAG) flow engine therein.
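
    For a flavour of the language, here is a minimal plain-CUE snippet (no Dagger involved) showing how types, constraints and defaults unify; the #Service schema is an invented example:

        // Types and values live in the same lattice; a definition is a set of constraints.
        #Service: {
            name:     string
            replicas: int & >=1 | *1      // must be >= 1, defaults to 1
            port:     int & >0 & <65536
        }

        // Unifying concrete values with the definition validates (and completes) them.
        web: #Service & {
            name: "web"
            port: 8080
        }

    Evaluating this fills in replicas: 1 from the default and would reject, say, port: 0.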

  • cue

    The home of the CUE language! Validate and define text-based and dynamic configuration

  • cuezel

  • I played with a similar idea a while ago: https://github.com/ecordell/cuezel/ (cuezel as in: "Bazel but with CUE"), but I was never sure that what I was doing was in the spirit of CUE.

    CUE pushes nondeterminism into "_tool.cue"[0] files that are allowed to do things like IO and run external processes (see the sketch at the end of this comment). Tool files scratch a similar itch to Makefiles, but they lack an integrated plugin system like Bazel (which is why I played with the idea of CUE + Bazel).

    With Dagger you seem to be restricted to the set of things that the dagger tool can interpret, just as with my Cuezel tool you are limited to what I happened to implement.

    In CUE `_tool` files you are also limited to the set of things that the tool builtins provide, but the difference is that you know that the rest of the CUE program is deterministic/pure (everything not in a _tool file).

    There's clearly value in tooling that reads CUE definitions, and dagger is the first commercial interest in CUE that I've seen, which is exciting.

    But I'm most interested in some CUE-interpreter meta-tool that would allow you to import cue definitions + their interpreters and version them together, but for use in `_tool` files to keep the delineation clear. Maybe this is where dagger is heading? (if so it wasn't clear from the docs)

    [0]: https://pkg.go.dev/cuelang.org/[email protected]/pkg/tool
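
    For readers who haven't used them, here is a minimal _tool.cue sketch built on the standard tool/* builtins; the file and package names are invented for illustration:

        // In a file named e.g. build_tool.cue, alongside the rest of the package.
        package mypkg

        import (
            "tool/cli"
            "tool/exec"
        )

        // Invoked with `cue cmd version`; all IO and process execution
        // stays confined to the _tool.cue file.
        command: version: {
            run: exec.Run & {
                cmd: ["go", "version"]
                stdout: string
            }
            print: cli.Print & {
                text: run.stdout
            }
        }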

  • dagster

    An orchestration platform for the development, production, and observation of data assets.

  • Also not to be confused with https://dagster.io/

  • yplatform

    Self-service bootstrap/build/CI/CD. Software and configuration that supports various cycles of software development.

  • earthly

    Super simple build framework with fast, repeatable builds and an instantly familiar syntax – like Dockerfile and Makefile had a baby.

  • Another *monster* difference is that Dagger is (at least currently) Apache 2: https://github.com/dagger/dagger/blob/v0.2.4/LICENSE but Earthly went with BSL: https://github.com/earthly/earthly/blob/v0.6.12/LICENSE

    That means I'm more likely to submit bugs and patches to Dagger, and I won't touch Earthly.

  • Docker

    Notary is a project that allows anyone to have trust over arbitrary collections of data

  • I'm not touching anything Docker anymore.

    Here's the scenario: you're the unfortunate soul who received the first M1 as a new employee, and nothing Docker-related works. Cue multi-arch builds; what a rotten mess. I spent more than a week figuring out the careful orchestration that any build involving `docker manifest` needs. If you aren't within the very fine line that buildx assumes, good luck, pal. How long has `docker manifest` been "experimental"? It's abandonware.

    Then I decided it would be smart to point out that we don't sign our images, and so I had to figure out how to combine the `docker manifest` mess with `docker trust`, another piece of abandonware. Eventually I figured out that the way to do it was with notary[1], another (poorly documented) piece of abandonware. The new shiny thing is notation[2], which does exactly the same thing, but is nowhere near complete.

    At least Google clearly signals when it is killing something; Docker just lets projects go quiet.

    How long before this project ends up like the rest of them? Coincidentally, we had just been talking about decoupling our CI from proprietary CI, so seeing this was a rollercoaster of emotions.

    [1]: https://github.com/notaryproject/notary

    [2]: https://github.com/notaryproject/notation

  • notation

    A CLI tool to sign and verify artifacts (by notaryproject)

  • cloudflared

    Cloudflare Tunnel client (formerly Argo Tunnel)

  • > This is similar to installing something under /usr/sbin/

    As someone who's trying to get to grips with the Linux filesystem conventions, would you mind elaborating on a) why that's wrong, and b) what you would suggest instead? This reference[0] suggests that `/usr/sbin` is for "general system-wide binaries with superuser (root) privileges required" (and `/usr/bin` for those that don't require root privileges). I've therefore been using them in my homelab for binaries like the Cloudflare Tunnel Client[1]. Where instead should I have installed it? If to a custom location of my choosing, how should I communicate its location to scripts/tools that _use_ that binary? I see later in your comment that you suggest "Add whatever folder to the user's PATH environment variable to make it easy to run.", but that doesn't seem like a scalable solution for a multi-user environment?

    [0] https://askubuntu.com/a/308048

    [1] https://github.com/cloudflare/cloudflared

  • pipeline

    A cloud-native Pipeline resource.

  • Thanks for answering Qs. Does this compete directly with Tekton ( https://tekton.dev/ ), or do you imagine a way the two could interoperate? Why choose Dagger over Tekton to power pipelines?

  • busybox-w32

    WIN32 native port of BusyBox.

    I love unix tools (grep, sed, cut, etc.), and while there are some good sub-systems (msys2, cygwin), they can be a bit heavy. For that I use the Windows version of busybox - https://frippery.org/busybox/ - and then I make sure my scripts don't rely on the more powerful features of those tools (grep especially), so that the busybox versions work. Great, and it's also possible to port some of that back to Linux (but I mostly use it to build something or extract some data when I want to share the .bat file with others - one day, when I get better at PowerShell, I'll try more there).

  • constructs

    Define composable configuration models through code

  • Have you heard of or explored https://github.com/aws/constructs (related: https://github.com/aws/jsii and https://github.com/aws/aws-cdk)?

    This is what CDK uses for declarative modeling, and it gives you the opportunity to use languages/tooling that most devs are already familiar with. CDK8s already uses it as a replacement for yaml (technically, the yaml becomes an implementation detail rather than actually being replaced).

  • jsii

    jsii allows code in any language to naturally interact with JavaScript classes. It is the technology that enables the AWS Cloud Development Kit to deliver polyglot libraries from a single codebase!

  • aws-cdk

    The AWS Cloud Development Kit is a framework for defining cloud infrastructure in code

  • cloak

    Secrets automation for developers (by purton-tech)

  • I've been using Earthly for about 6 months.

    Earthly uses Dockerfile-style syntax, so I don't have to learn a new language; I can leverage my existing knowledge.

    Another advantage is that in Earthly I can bring up a docker compose stack within my pipeline, so that I have selenium, envoy and postgres running for integration testing.

    You can see my integration tests here https://github.com/purton-tech/cloak/blob/main/Earthfile#L14...

    Is that possible in dagger?
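
    Per the earlier answer in this thread, the current workaround in Dagger is to drive an external Docker engine from inside an action. A very rough sketch of what that looked like with the 0.2 CUE API follows; the client: network and docker/cli field names are my recollection of the docs of that era and may well be inexact, so treat this as illustrative only:

        package main

        import (
            "dagger.io/dagger"
            "universe.dagger.io/docker/cli"
        )

        dagger.#Plan & {
            // Expose the host's Docker socket to the plan.
            client: network: "unix:///var/run/docker.sock": connect: dagger.#Socket

            actions: {
                // Run docker compose against the host engine, e.g. to bring up
                // selenium/envoy/postgres before an integration-test action.
                // (Assumes the CLI image used by cli.#Run ships the compose plugin.)
                services: cli.#Run & {
                    host: client.network."unix:///var/run/docker.sock".connect
                    command: {
                        name: "docker"
                        args: ["compose", "up", "-d"]
                    }
                }
            }
        }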

  • Dagger.jl

    A framework for out-of-core and parallel execution

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.

Related posts