db-operator vs nifikop

| | db-operator | nifikop |
|---|---|---|
| Mentions | 3 | 1 |
| Stars | 154 | 118 |
| Growth | 0.0% | - |
| Activity | 0.0 | 7.7 |
| Latest commit | 8 days ago | about 2 years ago |
| Language | Go | Go |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
db-operator
- Database for homelab - in or out of cluster?
For my homelab, I'm using in-cluster databases, and I'm doing full volume backups instead of database dumps. I have Funkwhale and Gitea, which both require Postgres, and I'm using the Postgres that is provided as a chart dependency. But my plan is to set up one Postgres database in a separate namespace, and then use https://github.com/kloeckner-i/db-operator for managing it.
- MySQL operators without the cluster
Look at https://github.com/kloeckner-i/db-operator , but external secret support is missing.
- Operators for out-of-cluster databases
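The setup described in the first mention (one shared Postgres in its own namespace, managed by db-operator) could be expressed with db-operator's custom resources. The sketch below is illustrative only: the `apiVersion`, kinds, and field names are assumptions and should be checked against the project's CRD documentation before use.

```yaml
# Hypothetical sketch: a DbInstance pointing at a Postgres running in a
# dedicated "databases" namespace, plus a Database claim for one app.
# apiVersion and field names are assumptions - verify against db-operator's docs.
apiVersion: kci.rocks/v1alpha1
kind: DbInstance
metadata:
  name: shared-postgres
spec:
  engine: postgres
  adminSecretRef:
    Name: postgres-admin-credentials   # Secret holding the admin user/password
    Namespace: databases
  generic:
    host: postgres.databases.svc.cluster.local
    port: 5432
---
apiVersion: kci.rocks/v1alpha1
kind: Database
metadata:
  name: gitea-db
  namespace: gitea
spec:
  instance: shared-postgres
  secretName: gitea-db-credentials   # operator writes generated credentials here
  deletionProtected: true
```

The appeal of this pattern is that each application only declares a `Database` object in its own namespace and consumes the generated credentials Secret, while the actual Postgres instance lives (and is backed up) separately.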
nifikop
- NiFi on EKS cluster
I don't have experience with NiFi, but I'd say your best option is either the operator: https://github.com/Orange-OpenSource/nifikop or the Helm chart: https://github.com/cetic/helm-nifi
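For the Helm-chart route mentioned above, an install might look like the following. The repository URL, release name, namespace, and value names here are assumptions based on the cetic/helm-nifi project and should be verified against its README.

```shell
# Add the cetic chart repository (URL assumed - check the helm-nifi README)
helm repo add cetic https://cetic.github.io/helm-charts
helm repo update

# Install NiFi into its own namespace; replicaCount is an assumed chart value
helm install nifi cetic/nifi \
  --namespace nifi \
  --create-namespace \
  --set replicaCount=3
```

The operator route (nifikop) is the heavier option but manages the cluster lifecycle (scaling, rolling upgrades) through a `NifiCluster` custom resource, whereas the chart is a simpler one-shot deployment.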
What are some alternatives?
varnish-operator - Run and manage Varnish clusters on Kubernetes
helmify - Creates Helm chart from Kubernetes yaml
egressip-ipam-operator - egressip-ipam-operator
argocd-operator - A Kubernetes operator for managing Argo CD clusters.
tor-controller - Tor toolkit for Kubernetes (Tor instances, onion services and more)
helm-nifi - Helm Chart for Apache Nifi
cloud-on-k8s - Elastic Cloud on Kubernetes
cluster-example - Terraform cluster example with Druid, Kafka and NiFi
keepalived-operator - An operator to manage VIPs backed by keepalived
kubernetes-operator-roiergasias - 'Roiergasias' is a Go-based Kubernetes operator that addresses a common need of data science and machine learning projects running pipelines on Kubernetes: quickly provisioning a declarative, on-demand data pipeline with simple kubectl commands, in the spirit of NoOps. It combines Docker, Kubernetes, and language features to run workflows with minimal workflow-definition syntax, either on the command line or on Kubernetes via the custom operator, giving a quick, automated data pipeline for machine learning projects (a flavor of MLOps).
mysql-operator - Kubernetes operator to manage mysql databases and users
rbacsync - Automatically sync groups into Kubernetes RBAC