| | auth | pipeline |
|---|---|---|
| Mentions | 13 | 51 |
| Stars | 826 | 8,289 |
| Growth | 2.9% | 0.3% |
| Activity | 7.6 | 9.7 |
| Last commit | 17 days ago | 2 days ago |
| Language | TypeScript | Go |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
auth
-
Push code with GitHub Actions to Google Cloud’s Artifact Registry
This workflow will authenticate with Google Cloud using the Google Cloud auth GitHub Action and use Docker to authenticate and push to the registry. To make this workflow work (or flow?) we need to set up some Google Cloud resources and add in those values for our environment variables. Make sure to add in the value for PROJECT_ID where you have permission to create resources. The value for IMAGE_NAME can be anything — it’ll be created the first time this workflow runs:
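A minimal sketch of such a workflow, assuming a service-account key stored in a `GCP_SA_KEY` secret; the region, repository, and image names below are illustrative placeholders, not values from the original post:

```yaml
name: push-to-artifact-registry
on:
  push:
    branches: [main]

env:
  PROJECT_ID: my-project      # replace with a project where you can create resources
  REGION: us-central1         # assumption: your Artifact Registry location
  REPOSITORY: my-repo         # assumption: your Artifact Registry repository
  IMAGE_NAME: my-image        # created on the first push

jobs:
  push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Authenticate to Google Cloud with the auth action
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}

      # Let Docker use gcloud credentials for the registry host
      - run: gcloud auth configure-docker ${{ env.REGION }}-docker.pkg.dev --quiet

      # Build and push the image
      - run: |
          IMAGE="${REGION}-docker.pkg.dev/${PROJECT_ID}/${REPOSITORY}/${IMAGE_NAME}:${GITHUB_SHA}"
          docker build -t "$IMAGE" .
          docker push "$IMAGE"
```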
-
GitHub Actions could be so much better
The issue of integration with other tools is also quite strange. Of course, this is not directly related to GitHub Actions itself. For example, here is what needs to be done just to use Cloud Run: https://github.com/google-github-actions/auth#setting-up-wor...
- you must have the "bigquery.datasets.create" permission on the selected project
-
IAM Best Practices [cheat sheet included]
While it is commonly associated with AWS and their AWS IAM service, IAM is not limited to that platform. All major cloud providers, including Google Cloud and Microsoft Azure, offer IAM solutions for controlling how users access resources and systems. If you are looking for specific AWS IAM best practices, look no further than our AWS IAM Security Best Practices article. For the rest of this article, we will look at the generic best practices that have evolved over the last decade around each part of the basic question we started with: "who can access what?"
-
How would I use Github Actions to run a Python Script to make changes to a Google Sheets Spreadsheet?
I found this but I don't quite get how it works. I haven't done all the steps yet but I get how to set it up. I just don't understand how this just magically authenticates future steps since my code still needs a token. Should I use this to authenticate the script? If so, how do I do it and what would I need in my code? If not what should I use instead?
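What makes later steps "magically" authenticated is that the auth action writes a credentials file on the runner and exports `GOOGLE_APPLICATION_CREDENTIALS` pointing to it; any Google client library that supports Application Default Credentials then picks those up without a hard-coded token. A hedged sketch, assuming a `GCP_SA_KEY` secret and a script that builds its Sheets client from ADC (the secret and file names are illustrative):

```yaml
jobs:
  update-sheet:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Writes a credentials file and exports GOOGLE_APPLICATION_CREDENTIALS
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}   # assumption: your secret name

      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install gspread google-auth

      # The script needs no embedded token if it obtains credentials via
      # google.auth.default() with the Sheets/Drive scopes.
      - run: python update_sheet.py
```

The one change usually needed in the script itself is to replace the hard-coded token with `google.auth.default(scopes=[...])`, and to share the target spreadsheet with the service account's email address.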
-
Cloud Incident Response
Cloud Identity and Access Management: This service provides fine-grained control over who has access to what resources within an organization's Google Cloud environment. It can be used to quickly revoke access to compromised accounts or limit access to sensitive resources. https://cloud.google.com/iam
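As an illustration of the "quickly revoke access" point, removing a compromised account's role binding is a single gcloud command; the project, member, and role below are placeholders:

```shell
# Revoke a role binding from a compromised account on a project
gcloud projects remove-iam-policy-binding my-project \
  --member="user:compromised@example.com" \
  --role="roles/editor"
```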
-
Advanced GitHub Actions - Conditional Workflow
I use google-github-actions/auth in the first step in my job to authenticate to GCP. At this point, I have 6 different GitHub secrets to test out the concept. Each branch has two secrets with the format BRANCH_WIP and BRANCH_SA.
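That per-branch scheme can be wired up with a dynamic lookup into the secrets context. A sketch, assuming the `_WIP`/`_SA` suffixes hold the workload identity provider and service account, and that branch names line up with the secret prefixes:

```yaml
jobs:
  deploy:
    runs-on: ubuntu-latest
    permissions:
      id-token: write   # required to mint the OIDC token for federation
      contents: read
    steps:
      # Look up the secrets for the current branch, e.g. MAIN_WIP / MAIN_SA
      - uses: google-github-actions/auth@v2
        with:
          workload_identity_provider: ${{ secrets[format('{0}_WIP', github.ref_name)] }}
          service_account: ${{ secrets[format('{0}_SA', github.ref_name)] }}
```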
-
Learning Journal 3: Brainstorm a deployment process from GitHub to Google App Engine and Cloud SQL (Part 2)
There are 2 core parts: authentication to GCP and App Engine deployment. Authentication is performed using auth, while deployment uses deploy-appengine.
-
CI/CD from GitHub to Google Cloud Platform(GAE)
You should have a look at using workload identity federation and OIDC tokens. There's a guide at https://github.com/google-github-actions/auth It means you no longer need to hardcode service account credentials in GitHub secrets.
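The one-time GCP setup for federation follows the pattern in that guide; a condensed sketch, with every name, project number, and repo below a placeholder to substitute:

```shell
# Create a workload identity pool and an OIDC provider for GitHub
gcloud iam workload-identity-pools create "github" \
  --project="my-project" --location="global"

gcloud iam workload-identity-pools providers create-oidc "my-provider" \
  --project="my-project" --location="global" \
  --workload-identity-pool="github" \
  --issuer-uri="https://token.actions.githubusercontent.com" \
  --attribute-mapping="google.subject=assertion.sub,attribute.repository=assertion.repository"

# Allow workflows from one repository to impersonate the service account
gcloud iam service-accounts add-iam-policy-binding "deployer@my-project.iam.gserviceaccount.com" \
  --role="roles/iam.workloadIdentityUser" \
  --member="principalSet://iam.googleapis.com/projects/123456789/locations/global/workloadIdentityPools/github/attribute.repository/my-org/my-repo"
```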
-
Learning Journal 2: Brainstorm a deployment process from GitHub to Google App Engine and Cloud SQL (Part 1)
Yes, there is a deploy-appengine action that automates the whole App Engine deployment process. Indeed, it uses gcloud commands underneath too. Either way, both approaches need an auth action to authenticate to GCP before any task can be performed.
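Chaining the two actions looks roughly like this; the secret and project names are illustrative, not from the original post:

```yaml
    steps:
      - uses: actions/checkout@v4

      # Authenticate first; every later step depends on this
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}   # assumption: your secret name

      # Deploys the app.yaml in the repository root by default
      - uses: google-github-actions/deploy-appengine@v2
        with:
          project_id: my-project   # assumption: your GCP project
```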
pipeline
-
14 DevOps and SRE Tools for 2024: Your Ultimate Guide to Stay Ahead
Tekton
- GitHub Actions could be so much better
-
Distributed Traces for Testing with Tekton Pipelines and Tracetest
Tekton is an open-source framework for creating efficient CI/CD systems. It empowers developers to build, test, and deploy applications across various cloud environments and on-premises setups.
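Tekton's basic unit is a Task, a sequence of container steps defined as a Kubernetes resource. A minimal sketch (the task name, image, and script are illustrative):

```yaml
apiVersion: tekton.dev/v1
kind: Task
metadata:
  name: run-tests            # assumption: illustrative name
spec:
  steps:
    - name: test
      image: golang:1.22     # each step runs in its own container image
      script: |
        go test ./...
```

Tasks are then composed into a Pipeline and executed via a PipelineRun, which is what gives Tekton its flexibility across environments.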
-
Practical Tips for Refactoring Release CI using GitHub Actions
Despite other alternatives like Circle CI, Travis CI, GitLab CI or even self-hosted options using open-source projects like Tekton or Argo Workflow, the reason for choosing GitHub Actions was straightforward: GitHub Actions, in conjunction with the GitHub ecosystem, offers a user-friendly experience and access to a rich software marketplace.
-
Wolfi: A community Linux OS designed for the container and cloud-native era
[2]: https://github.com/tektoncd/pipeline/issues/5507#issuecommen...
- I don't know what to do, any advice is welcome
-
What are some good self-hosted CI/CD tools where pipeline steps run in docker containers?
Drone, or Tekton, Argo Workflows if you’re on k8s
-
Is Jenkins still the king?
If you want a step up, I would recommend trying out Tekton Pipelines. It's a very popular CI tool, and it runs on Kubernetes. Yes, this would involve setting up a Kubernetes cluster, but please don't run for the hills! You can set up a Kubernetes cluster and install Tekton on top of it with minimal setup using minikube (see here). This would be a great joint exercise as it will give you a bit of Kubernetes understanding alongside it, and the mechanisms of Tekton are a little trickier than GitHub Actions imo. It's all much the same though.
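The "minimal setup" really is short; a sketch of the local install path, following the standard Tekton release manifest:

```shell
# Start a local cluster, then install Tekton Pipelines into it
minikube start
kubectl apply --filename https://storage.googleapis.com/tekton-releases/pipeline/latest/release.yaml

# Wait until the controller and webhook pods are Running
kubectl get pods --namespace tekton-pipelines --watch
```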
- Is there a way to run a one-off pod that would work as a command line tool?
-
K8s powered Git push deployments
I've recently found this quote by Kelsey Hightower:
"I'm convinced the majority of people managing infrastructure just want a PaaS. The only requirement: it has to be built by them."
Source: https://twitter.com/kelseyhightower/status/85193508753294540...
In the last few weeks, I've experimented a bit with Flux (https://fluxcd.io/), Tekton (https://tekton.dev/) and Cloud Native Buildpacks (https://buildpacks.io/) on how to provide K8s powered git push deployments without using a dedicated CI/CD server.
My project is still in early alpha stage and just a proof of concept :-) My vision is to expand it into an Open Source PaaS in the future.
Do you think the above quote is true? What does an open source PaaS need to be like in order to be accepted by software developers?
Some other projects have been discontinued in the past (like Flynn or Deis) or were created before the Kubernetes era.
Is it the right direction to provide a Heroku like solution based on K8s or is it better to provide an Open Source Infrastructure as Code library with building blocks to avoid everything from scratch?
What are some alternatives?
Aegis - A free, secure and open source app for Android to manage your 2-step verification tokens.
dagger - Application Delivery as Code that Runs Anywhere
angular-auth-oidc-client - npm package for OpenID Connect, OAuth Code Flow with PKCE, Refresh tokens, Implicit Flow
argo-cd - Declarative Continuous Deployment for Kubernetes
google-auth-library-nodejs - 🔑 Google Auth Library for Node.js
kubevela - The Modern Application Platform.
act - Run your GitHub Actions locally 🚀
tekton-argocd-poc - This a PoC using Tekton (for CI) and ArgoCD (CD). It uses a local k8s cluster (K3D)
azure-pipelines-agent - Azure Pipelines Agent 🚀
NUKE - 🏗 The AKEless Build System for C#/.NET
harden-runner - Network egress filtering and runtime security for GitHub-hosted and self-hosted runners
skaffold - Easy and Repeatable Kubernetes Development