spring-cloud-stream-samples
cp-all-in-one
| | spring-cloud-stream-samples | cp-all-in-one |
|---|---|---|
| Mentions | 1 | 9 |
| Stars | 934 | 870 |
| Growth | 1.1% | 4.0% |
| Activity | 6.7 | 8.3 |
| Latest commit | 2 months ago | 5 days ago |
| Language | Java | Python |
| License | Apache License 2.0 | - |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
spring-cloud-stream-samples
-
Spring Cloud Stream & Kafka Confluent Avro Schema Registry
In this demo, based on the schema-registry-confluent-avro-serializer sample, we will create three Spring Cloud Stream applications (one consumer and two producers), all using the Confluent Schema Registry server and the Confluent Avro serializers.
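One detail worth knowing when debugging apps that use the Confluent Avro serializers: every serialized message carries a 5-byte header (a magic byte plus the registry schema id) in front of the Avro payload, which is why a plain consumer sees "garbage" at the start of each record. A minimal Python sketch of that wire format (the header layout is documented Confluent behavior; the function names here are illustrative):

```python
import struct

MAGIC_BYTE = 0  # Confluent wire format always starts with 0x00

def encode_header(schema_id: int) -> bytes:
    # magic byte + 4-byte big-endian schema id; the Avro-encoded
    # record bytes follow immediately after this header
    return struct.pack(">bI", MAGIC_BYTE, schema_id)

def decode_header(payload: bytes) -> int:
    # returns the registry schema id, or raises if the message was
    # not produced with a Confluent serializer
    magic, schema_id = struct.unpack(">bI", payload[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("payload is not in Confluent wire format")
    return schema_id
```

A consumer that is not registry-aware must strip these five bytes before handing the rest to a plain Avro decoder.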
cp-all-in-one
-
Apache Kafka Using Docker
Hi everyone, I'm using Kafka on Docker (https://github.com/confluentinc/cp-all-in-one/blob/7.3.3-post/cp-all-in-one/docker-compose.yml). When I run producer.py it runs very smoothly, and consumer.py does as well. However, when I check the Schema Registry at localhost:8081 it is null, and so is the Confluent UI (localhost:9021). Is there anything missing? Thanks for your help!
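One likely cause (an assumption about this setup, not something visible in the question): the registry only shows a subject after the first produce goes through an Avro/JSON-Schema/Protobuf serializer, so a producer.py that sends plain strings or JSON never registers anything and `/subjects` stays empty. A stdlib-only sketch for checking what the registry actually holds (registry URL taken from the compose file):

```python
import json
from urllib.request import urlopen

def subjects_url(registry_url: str) -> str:
    # REST endpoint that lists every registered subject
    return registry_url.rstrip("/") + "/subjects"

def list_subjects(registry_url: str = "http://localhost:8081") -> list:
    # an empty list means nothing has registered a schema yet, which
    # is exactly what a plain (non-Avro) producer looks like
    with urlopen(subjects_url(registry_url)) as resp:
        return json.loads(resp.read())
```

If `list_subjects()` returns `[]`, switch the producer to a registry-aware serializer (e.g. the Avro serializer from confluent-kafka-python) and the subject will appear after the first message.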
-
OpenID Connect authentication with Apache Kafka 3.1
To make it more fun, I'm using Kafka in KRaft mode (i.e. without ZooKeeper), based on this Docker example provided by Confluent.
-
Spring Cloud Stream & Kafka Confluent Avro Schema Registry
We will use a docker-compose.yml based on the one from confluentinc/cp-all-in-one, both to run the stack locally and to execute the integration tests. From that configuration we will keep only these containers: zookeeper, broker, schema-registry, and control-center.
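Since the trimmed compose stack needs to be fully up before the integration tests run, a small readiness probe avoids flaky test startups. A stdlib-only sketch (the port numbers are an assumption based on the usual cp-all-in-one mappings; adjust to your compose file):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 60.0) -> bool:
    # poll until something accepts TCP connections on host:port,
    # or give up after `timeout` seconds
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2.0):
                return True
        except OSError:
            time.sleep(1.0)
    return False

# typical cp-all-in-one ports (assumption): broker 9092,
# schema-registry 8081, control-center 9021 -- e.g.:
#   assert wait_for_port("localhost", 9092), "broker not ready"
```

A TCP connect only proves the listener is up, not that the service is healthy; for stricter checks, Docker Compose healthchecks on each container are the more robust option.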
-
Kafka Streams application doesn't start up
There are a lot of extraneous services here, and the CP version is very old; the current version is 7.1, with 7.2 on the way. Maybe look at using `confluent local services start` with the Confluent CLI to run services locally, or use https://github.com/confluentinc/cp-all-in-one as a good reference Docker Compose setup.
-
I love Kafka, but I really can’t stand:
You can even just run the preview without Zookeeper in docker-compose https://github.com/confluentinc/cp-all-in-one/tree/7.0.1-post/cp-all-in-one-kraft
What are some alternatives?
docker-kafka-kraft - Apache Kafka Docker image using Kafka Raft metadata mode (KRaft). https://hub.docker.com/r/moeenz/docker-kafka-kraft
bitnami-docker-kafka - Bitnami Docker Image for Kafka
examples - Apache Kafka and Confluent Platform examples and demos
demo-scene - 👾Scripts and samples to support Confluent Demos and Talks. ⚠️Might be rough around the edges ;-) 👉For automated tutorials and QA'd code, see https://github.com/confluentinc/examples/
debezium - Change data capture for a variety of databases. Please log issues at https://issues.redhat.com/browse/DBZ.
NiFItoKafkaConnect - NiFi -> Kafka Connect -> HDFS
fake-data-producer-for-apache-kafka-docker - Fake Data Producer for Aiven for Apache Kafka® in a Docker Image
gradle-avro-plugin - A Gradle plugin to allow easily performing Java code generation for Apache Avro. It supports JSON schema declaration files, JSON protocol declaration files, and Avro IDL files.
cp-docker-images - [DEPRECATED] Docker images for Confluent Platform.
fast-data-dev - Kafka Docker for development. Kafka, Zookeeper, Schema Registry, Kafka-Connect, Landoop Tools, 20+ connectors
spring-cloud-stream-kafka-confluent-avro-schema-registry - 🍀 Spring Cloud Stream Kafka & Confluent Avro Schema Registry