A curated resources list for awesome Apache Kafka (by jitendra3109)

ApacheKafka Alternatives

Similar projects and alternatives to ApacheKafka

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a better ApacheKafka alternative or a higher degree of similarity.

ApacheKafka reviews and mentions

Posts with mentions or reviews of ApacheKafka. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-06-01.
  • JR, quality Random Data from the Command line, part II
    3 projects | dev.to | 1 Jun 2023
    In the first part of this series, we have seen how to use JR in simple use cases to stream random data from predefined templates to standard out and Apache Kafka on Confluent Cloud.
  • Exploring Async PHP
    6 projects | dev.to | 31 May 2023
    The use of queues such as Amazon SQS, RabbitMQ or Apache Kafka has been a widely accepted solution for some time.
  • Best way to schedule events and handle them in the future?
    9 projects | /r/golang | 25 May 2023
    The second approach is to use a message queue, as some others have suggested. The most powerful of these is probably Kafka, but it's almost certainly overkill. (Technically, Kafka is an event log, not a message queue, but that's semantics at this point)
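The event-log-vs-message-queue distinction mentioned above is worth making concrete. In queue semantics, a message disappears once a consumer takes it; in log semantics (Kafka's model), events are appended and retained, and each consumer tracks its own offset and can replay from any point. A minimal in-memory sketch of the two models (illustrative only, not Kafka's actual implementation):

```python
from collections import deque

class MessageQueue:
    """Queue semantics: a message is removed once any consumer takes it."""
    def __init__(self):
        self._messages = deque()

    def publish(self, msg):
        self._messages.append(msg)

    def consume(self):
        return self._messages.popleft() if self._messages else None

class EventLog:
    """Log semantics (Kafka-style): events are appended and retained;
    each consumer keeps its own offset and can re-read from any point."""
    def __init__(self):
        self._events = []

    def append(self, event):
        self._events.append(event)

    def read(self, offset):
        return self._events[offset:]

queue = MessageQueue()
queue.publish("job-1")
queue.consume()              # returns "job-1"; it is now gone for everyone

log = EventLog()
log.append("signup")
log.append("purchase")
first_reader = log.read(0)   # ["signup", "purchase"]
second_reader = log.read(0)  # the same events again: the log retains them
```

This retention is why Kafka can feel like overkill for simple scheduling: you pay for durability and replay even when you only need fire-and-forget delivery.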
  • Top 6 message queues for distributed architectures
    4 projects | dev.to | 18 May 2023
Apache Kafka is an open-source, distributed event streaming platform with message communication and storage capabilities. Although Kafka is not technically a message queue, it provides message-queue functionality through topic partitions.
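The queue-like behavior comes from key-based partitioning: all messages with the same key are routed to the same partition and are consumed there in order. A toy sketch of the routing idea (Kafka's default partitioner uses murmur2 on the key; `crc32` here is a stand-in, and the partition count is arbitrary):

```python
import zlib

NUM_PARTITIONS = 3

def partition_for(key, num_partitions=NUM_PARTITIONS):
    # Hash the key so every message with the same key lands on the
    # same partition and keeps its relative order. (Kafka's default
    # partitioner uses murmur2; crc32 is just an illustrative stand-in.)
    return zlib.crc32(key.encode()) % num_partitions

partitions = {i: [] for i in range(NUM_PARTITIONS)}
records = [("user-42", "created"), ("user-7", "created"), ("user-42", "updated")]
for key, value in records:
    partitions[partition_for(key)].append((key, value))

# Within its partition, user-42's "created" always precedes its "updated".
```

Ordering is only guaranteed per partition, not across the whole topic, which is the key design trade-off between throughput (more partitions) and global ordering.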
  • Amazon Ditches Microservices for Monolith: Decoding Prime Video's Architectural Shift
    5 projects | dev.to | 18 May 2023
    When it comes to the limitations of AWS Step Functions, let us look at what it was doing. Step Functions handled communication between the different steps of their stream quality architecture, as well as error handling. For communication between services, tools like Kafka exist and can be used to transfer data (or state) between services. Kafka uses a pub/sub (publish and subscribe) messaging model that allows producers to publish messages to topics; consumers can then act on the topics they are subscribed to. Kafka's pub/sub model allows for efficient and reliable data streaming, making it well suited to building event-driven systems, such as one that monitors video quality.
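The pub/sub model described above decouples producers from consumers: a producer publishes to a named topic without knowing who is listening. A minimal in-memory sketch of the pattern (the broker class, topic name, and message fields are all illustrative, not a Kafka API):

```python
from collections import defaultdict

class PubSubBroker:
    """Toy pub/sub broker: producers publish to a topic by name,
    and every subscriber to that topic receives each message."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Fan the message out to every callback subscribed to the topic.
        for callback in self._subscribers[topic]:
            callback(message)

broker = PubSubBroker()
alerts = []
# A monitoring consumer subscribes to a (hypothetical) quality topic.
broker.subscribe("video-quality", alerts.append)
# A producer publishes a defect event without knowing the consumers.
broker.publish("video-quality", {"stream": "prime-1", "defect": "block corruption"})
```

In real Kafka the broker also persists every message, so consumers can fall behind or restart without losing events, which this sketch omits.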
  • HRV-Mart
    16 projects | dev.to | 8 May 2023
    In order to create a scalable back-end, I use a micro-service architecture. The current version of the HRV-Mart back-end consists of Product-Microservice, User-Microservice, Auth-Microservice, Order-Microservice, Cart-Microservice, Like-Microservice and API-Gateway. The above micro-services are loosely coupled, and communication between them happens via Apache Kafka. In order to make them more robust, I added unit tests. The master branch is protected via branch protection rules.
  • JR, quality Random Data from the Command line, part I
    8 projects | dev.to | 7 May 2023
    So, is JR yet another faking library written in Go? Yes and no. JR indeed implements most of the APIs in fakerjs and gofakeit, but it's also able to stream data directly to stdout, Kafka, Redis and more (Elastic and MongoDB coming). JR can talk directly to Confluent Schema Registry, manage JSON Schema and Avro schemas, and easily maintain coherence and referential integrity. If you need more than what is available OOTB in JR, you can also easily pipe your data streams to other CLI tools like kcat thanks to its flexibility.
  • Querying microservices in real-time with materialized views
    4 projects | dev.to | 30 Apr 2023
    RisingWave is an open-source streaming database that has built-in, fully-managed CDC source connectors for various databases. It can also collect data from other sources such as Kafka, Pulsar, Kinesis, or Redpanda, and it allows you to query real-time streams using SQL. You can get a materialized view that is always up-to-date.
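What makes a streaming database's materialized view "always up-to-date" is incremental maintenance: each incoming event updates the stored result in place instead of recomputing it from scratch. A toy model of a count-per-key view maintained over a stream (the class and field names are illustrative, not RisingWave's API, which is SQL-based):

```python
class MaterializedView:
    """Incrementally maintained count-per-key view over an event stream —
    a toy model of what a streaming database does for a GROUP BY count."""
    def __init__(self):
        self.counts = {}

    def on_event(self, key):
        # Each event updates the view in place, so reads are always
        # current without rescanning the whole stream.
        self.counts[key] = self.counts.get(key, 0) + 1

view = MaterializedView()
for source_service in ["service-a", "service-b", "service-a"]:
    view.on_event(source_service)
# view.counts == {"service-a": 2, "service-b": 1}
```

Queries against the view read precomputed state, which is why materialized views over microservice event streams can be served in real time.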
  • Modern stack to build a real-time event-driven app
    3 projects | dev.to | 18 Apr 2023
    The first component is a database that acts as the data source, such as PostgreSQL (other popular options include MongoDB and MySQL). As data changes in the database, the change is detected using the database's log-based CDC (Change Data Capture) capabilities: the change is captured and recorded in a transaction log. The captured changes are then transformed into change events that can be consumed in real time by downstream systems (via a message broker such as Kafka).
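The "transformed into a change event" step in the pipeline above typically means wrapping the captured row change in an envelope that records the operation type plus the before and after images of the row. A hedged sketch of such an envelope (the field names loosely follow Debezium's conventions but are illustrative, not an exact schema):

```python
import json
from datetime import datetime, timezone

def to_change_event(table, op, before, after):
    """Wrap a row change captured from the transaction log in a
    Debezium-style envelope (field names are illustrative)."""
    event = {
        "source": {"table": table},
        "op": op,          # "c" = create, "u" = update, "d" = delete
        "before": before,  # row image before the change (None for creates)
        "after": after,    # row image after the change (None for deletes)
        "ts_ms": int(datetime.now(timezone.utc).timestamp() * 1000),
    }
    return json.dumps(event)

# An UPDATE captured from the log becomes one serialized event that a
# downstream consumer (e.g. a Kafka topic reader) can act on:
event = to_change_event(
    "orders", "u",
    before={"id": 1, "status": "pending"},
    after={"id": 1, "status": "shipped"},
)
```

Because the envelope carries both row images, downstream systems can rebuild the table's current state or react only to specific transitions, without ever querying the source database.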
  • How Change Data Capture (CDC) Works with Streaming Database
    5 projects | dev.to | 7 Apr 2023
    A streaming database is a type of database designed to handle continuous data streams in real time while making it possible to query that data. You can read more about how a streaming database differs from a traditional database, and how to choose the right streaming database, in my other blog posts. CDC is particularly useful when working with streaming databases: you can ingest CDC data directly from databases (see an example in the next section) without setting up additional services like Kafka.