Getting started with Kafka Connector for Azure Cosmos DB using Docker

This page summarizes the projects mentioned and recommended in the original post on dev.to

  • kafka-connect-cosmosdb

    Kafka Connect connectors for Azure Cosmos DB

    The Azure Cosmos DB connector allows you to move data between Azure Cosmos DB and Kafka. It is available as both a source and a sink: the sink connector writes data from a Kafka topic to an Azure Cosmos DB container, and the source connector publishes changes from an Azure Cosmos DB container to a Kafka topic. At the time of writing, the connector is in pre-production mode. You can read more about it on the GitHub repo, or install/download it from Confluent Hub.
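
    As an illustration, here is roughly what registering the sink connector against the Kafka Connect REST API could look like. Treat it as a sketch: the endpoint, key, topic, and database/container names are placeholders, and the connect.cosmos.* property names should be double-checked against the repo since the connector is still pre-production.

      # Sketch: register the Cosmos DB sink connector via the Kafka Connect REST API.
      # Endpoint, key, topic, database, and container names below are placeholders.
      curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
        "name": "cosmosdb-sink",
        "config": {
          "connector.class": "com.azure.cosmos.kafka.connect.sink.CosmosDBSinkConnector",
          "topics": "orders",
          "connect.cosmos.connection.endpoint": "https://<cosmos-endpoint>:443/",
          "connect.cosmos.master.key": "<cosmos-key>",
          "connect.cosmos.databasename": "testdb",
          "connect.cosmos.containers.topicmap": "orders#orders",
          "key.converter": "org.apache.kafka.connect.storage.StringConverter",
          "value.converter": "org.apache.kafka.connect.json.JsonConverter",
          "value.converter.schemas.enable": "false"
        }
      }'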

  • cosmosdb-kafka-connect-docker

    Getting started with Kafka Connector for Azure Cosmos DB using Docker

      git clone https://github.com/Azure-Samples/cosmosdb-kafka-connect-docker
      cd cosmosdb-kafka-connect-docker
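
    With the repository cloned, the stack can then typically be brought up with Docker Compose. A minimal sketch follows; the repo's README and docker-compose.yaml are the reference, and the Kafka Connect service name used here is an assumption:

      # Start all services in the background
      docker-compose up -d

      # Follow the Kafka Connect worker's logs until it is ready
      # (the service name "connect" is assumed; check docker-compose.yaml)
      docker-compose logs -f connect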

  • kafka-connect-datagen

    Connector that generates data for demos

    For the remaining scenarios, we will use a producer component to generate records. The Kafka Connect Datagen connector is our friend. It is meant for generating mock data for testing, so let’s put it to good use!
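
    For example, a hypothetical Datagen source connector that pumps mock order records into an orders topic could be registered as follows (the connector name, topic name, and pacing settings are illustrative):

      # Sketch: a Datagen source connector producing mock "orders" records.
      # "max.interval" is the maximum gap between records in ms;
      # "iterations": -1 means keep generating indefinitely.
      curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
        "name": "datagen-orders",
        "config": {
          "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
          "kafka.topic": "orders",
          "quickstart": "orders",
          "max.interval": "1000",
          "iterations": "-1",
          "key.converter": "org.apache.kafka.connect.storage.StringConverter",
          "value.converter": "org.apache.kafka.connect.json.JsonConverter",
          "value.converter.schemas.enable": "false"
        }
      }'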

  • Docker Compose

    Define and run multi-container applications with Docker

    Finally, there is Docker Compose, a tool for defining and running multi-container Docker applications. It will orchestrate all the components required by our setup, including the Azure Cosmos DB emulator, Kafka, ZooKeeper, the Kafka connectors, etc.
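
    To give a feel for it, here is a trimmed-down sketch of what such a Compose file might look like. Image tags, service names, and settings are illustrative, and Schema Registry and other services are omitted for brevity; the sample repo's docker-compose.yaml is the authoritative version.

      # docker-compose.yaml (sketch only: illustrative images and settings)
      version: "3"
      services:
        zookeeper:
          image: confluentinc/cp-zookeeper:6.1.0
          environment:
            ZOOKEEPER_CLIENT_PORT: 2181
        kafka:
          image: confluentinc/cp-kafka:6.1.0
          depends_on: [zookeeper]
          environment:
            KAFKA_BROKER_ID: 1
            KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
            KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
            KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
        cosmosdb:
          image: mcr.microsoft.com/cosmosdb/linux/azure-cosmos-emulator
        connect:
          image: confluentinc/cp-kafka-connect:6.1.0
          depends_on: [kafka]
          ports: ["8083:8083"]
          environment:
            CONNECT_BOOTSTRAP_SERVERS: kafka:9092
            CONNECT_REST_ADVERTISED_HOST_NAME: connect
            CONNECT_GROUP_ID: connect-cluster
            CONNECT_CONFIG_STORAGE_TOPIC: _connect-configs
            CONNECT_OFFSET_STORAGE_TOPIC: _connect-offsets
            CONNECT_STATUS_STORAGE_TOPIC: _connect-status
            CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
            CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
            CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
            CONNECT_KEY_CONVERTER: org.apache.kafka.connect.storage.StringConverter
            CONNECT_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter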

  • Docker

    Source repo for Docker's Documentation (by docker)

    Having a local development environment is quite handy when trying out a new service or technology, and Docker has emerged as the de facto choice in such cases. It is especially useful in scenarios where you’re trying to integrate multiple services, and it gives you the ability to start fresh before each run.
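
    In practice, that fresh start is a single command away:

      # Tear everything down, including volumes, for a clean slate on the next run
      docker-compose down -v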

  • Apache Avro

    Apache Avro is a data serialization system.

    So far, we have dealt with JSON, a commonly used data format. But Avro is heavily used in production because its compact binary format leads to better performance and cost savings. To make it easier to deal with Avro schemas, there is the Confluent Schema Registry, which provides a serving layer for your metadata along with a RESTful interface for storing and retrieving Avro (as well as JSON and Protobuf) schemas. We will use the Docker version for the purposes of this blog post.
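
    As a rough sketch, switching a pipeline from JSON to Avro largely comes down to the converter settings on the connector. The example below reuses the hypothetical Datagen connector from earlier and assumes Schema Registry is reachable as schema-registry:8081 inside the Compose network (and on localhost:8081 from the host):

      # Sketch: an Avro-producing Datagen connector; values are serialized with the
      # Avro converter and the schema is auto-registered in Schema Registry.
      curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
        "name": "datagen-orders-avro",
        "config": {
          "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
          "kafka.topic": "orders-avro",
          "quickstart": "orders",
          "key.converter": "org.apache.kafka.connect.storage.StringConverter",
          "value.converter": "io.confluent.connect.avro.AvroConverter",
          "value.converter.schema.registry.url": "http://schema-registry:8081"
        }
      }'

      # Fetch the registered value schema over Schema Registry's REST API
      # (subjects follow the default <topic>-value naming convention):
      curl http://localhost:8081/subjects/orders-avro-value/versions/latest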
