How Sendoso is using Kafka for Event-Driven Architecture

This page summarizes the projects mentioned and recommended in the original post on dev.to

  • kafka-connect-jdbc

    Kafka Connect connector for JDBC-compatible databases

  • Event sourcing is an effective architectural pattern for recording changes to application state. Event order matters: we need the changes exactly as they were originally applied. In a fully event-sourced design, incoming events are first persisted into Kafka and then processed by each service independently, so Kafka becomes the source of truth (SOT), the data source that gives a complete picture of a data object as a whole. Adopting that wholesale, however, would have meant dramatic changes to our core application. Instead, our source of truth remains the core database, and we generate events in Kafka whenever data is persisted. To keep this transactional, we employed the Transactional Outbox pattern: we a) created a new events table in the database, and b) wrote the event data in the same transaction that updates our SOT table. Kafka Connect then reads this table and inserts records into the relevant Kafka topics. This guarantees we never end up in an inconsistent state where data was inserted into the database but the event was not added to the Kafka topic, or vice versa. We evaluated a few connectors for sourcing data from MySQL (JdbcSourceConnector and Debezium). Our scenario was supported out of the box by JdbcSourceConnector, which let us keep a single events table in MySQL and route each row to the relevant topic based on its topic field. (A sketch of the outbox write appears after the project list below.)

  • kafdrop

    Kafka Web UI

  • nri-prometheus

    Fetch metrics in the Prometheus metrics format, inside or outside Kubernetes, and send them to the New Relic Metrics platform.

  • Kafdrop is a web UI for viewing Kafka topics and browsing consumer groups. It displays information such as brokers, topics, partitions, and consumers, and it supports topic creation and deletion. It also lets us observe consumer lag and inspect messages at any partition and offset. While this is not a replacement for monitoring and observability, it has been valuable for visualizing what is happening on the Kafka brokers. We have configured separate Kafdrop pods for each MSK cluster. For monitoring and alerting on Kafka metrics, we rely on the New Relic Prometheus integration (nri-prometheus) to forward MSK metrics to New Relic; it is likewise deployed as a Kubernetes pod. (A sketch of the lag calculation also follows the list below.)
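
The Transactional Outbox write described above boils down to two statements in one database transaction. The sketch below illustrates the idea in plain JDBC; the table and column names (gifts, events, topic, payload) and the connection details are illustrative assumptions, not Sendoso's actual schema.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class OutboxWriter {
        // Update the source-of-truth row and append the outbox event atomically.
        public static void recordGiftSent(String giftId, String eventJson) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost:3306/app", "app", "secret")) {
                conn.setAutoCommit(false); // both writes commit or roll back together
                try (PreparedStatement updateSot = conn.prepareStatement(
                         "UPDATE gifts SET status = 'SENT' WHERE id = ?");
                     PreparedStatement insertEvent = conn.prepareStatement(
                         "INSERT INTO events (topic, payload) VALUES (?, ?)")) {
                    updateSot.setString(1, giftId);
                    updateSot.executeUpdate();

                    insertEvent.setString(1, "gift.sent"); // Kafka Connect routes the row to this topic
                    insertEvent.setString(2, eventJson);
                    insertEvent.executeUpdate();

                    conn.commit(); // state change and outbox event become visible together
                } catch (Exception e) {
                    conn.rollback();
                    throw e;
                }
            }
        }
    }

The JdbcSourceConnector can then poll the events table (for example, in incrementing mode on its primary key), and a topic-routing transform that reads the topic column can direct each row to its destination topic.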

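Consumer lag, which Kafdrop surfaces in its UI, is simply the gap between a group's committed offset and the partition's latest offset, and it can also be computed with the Kafka AdminClient. The sketch below uses placeholder bootstrap servers and group id; it is not part of Sendoso's setup, just an illustration of the metric.

    import java.util.Map;
    import java.util.Properties;
    import java.util.stream.Collectors;

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.ListOffsetsResult;
    import org.apache.kafka.clients.admin.OffsetSpec;
    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;

    public class ConsumerLag {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker-1:9092"); // MSK bootstrap brokers (placeholder)

            try (AdminClient admin = AdminClient.create(props)) {
                // Offsets the consumer group has committed, per partition
                Map<TopicPartition, OffsetAndMetadata> committed =
                    admin.listConsumerGroupOffsets("example-consumer-group")
                         .partitionsToOffsetAndMetadata().get();

                // Latest (end) offsets for the same partitions
                Map<TopicPartition, OffsetSpec> latest = committed.keySet().stream()
                    .collect(Collectors.toMap(tp -> tp, tp -> OffsetSpec.latest()));
                Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> ends =
                    admin.listOffsets(latest).all().get();

                // Lag = end offset - committed offset
                committed.forEach((tp, meta) ->
                    System.out.printf("%s lag=%d%n", tp, ends.get(tp).offset() - meta.offset()));
            }
        }
    }
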
NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.

Related posts

  • Knative switchboard series, part 1. Setup Knative Eventing with Kafka from scratch, scale based on events volume, and monitor

    4 projects | dev.to | 4 Jan 2024
  • How to self host Apache Kafka?

    1 project | /r/selfhosted | 22 Feb 2023
  • Kafka visualization tool

    1 project | /r/apachekafka | 15 Feb 2023
  • Local app to debug pub/sub?

    1 project | /r/googlecloud | 6 Feb 2023
  • Prometheus Additional Scrape Config node metrics limitation

    1 project | /r/PrometheusMonitoring | 21 Jul 2022