librdkafka vs kafka-python

| | librdkafka | kafka-python |
|---|---|---|
| Mentions | 18 | 8 |
| Stars | 7,292 | 5,484 |
| Growth | 1.2% | - |
| Activity | 8.3 | 6.4 |
| Latest commit | 4 days ago | 11 days ago |
| Language | C | Python |
| License | GNU General Public License v3.0 or later | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed, with recent commits weighted more heavily than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
librdkafka
-
Do you use Rust in your professional career?
recent PR: https://github.com/confluentinc/librdkafka/pull/4275
-
JR, quality Random Data from the Command line, part I
# Kafka configuration
# https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md
bootstrap.servers=
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=
sasl.password=
compression.type=gzip
compression.level=9
statistics.interval.ms=1000
-
A Critical Detail about Kafka Partitioners
But what about Kafka producer clients in other languages? The excellent librdkafka project is a C/C++ implementation of the Kafka clients and is widely used for non-JVM Kafka applications. Additionally, Kafka clients in other languages (Python, C#) build on top of it. The default partitioner in librdkafka uses the CRC32 hash function to map a key to a partition.
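To see why this matters, librdkafka's default ("consistent") partitioning can be approximated in a couple of lines of Python. The function name and the modulo step below are illustrative, not the library's exact internals:

```python
import zlib

def crc32_partition(key: bytes, num_partitions: int) -> int:
    # Approximation of librdkafka's default partitioner:
    # CRC32 of the key, taken modulo the partition count.
    return zlib.crc32(key) % num_partitions

print(crc32_partition(b"user-42", 6))
```

A Java client hashing the same key with murmur2 will generally land on a different partition, which is exactly the mismatch the article warns about.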
-
Horizontally scaling Kafka consumers with rendezvous hashing
We could have made some changes at the librdkafka level (see this), but we didn’t really want to pursue this (at least not yet).
-
Events with same key going to different partitions
You want records with the same key to always land on the same partition, so you need all the clients to use the same hashing algorithm. The easiest way to do that is to make sure the librdkafka client uses the Java-compatible murmur2_random hash algorithm. See the "Partitioner" section here: https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md
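For illustration, here is a pure-Python sketch of the Java-compatible murmur2 key hashing (MurmurHash2, 32-bit, with the seed 0x9747b28c that the Kafka clients use), followed by the non-negative-modulo step the Java partitioner applies. Treat this as a sketch of the algorithm, not a drop-in replacement for the library's implementation:

```python
def murmur2(data: bytes) -> int:
    # MurmurHash2 (32-bit) with Kafka's seed, kept within 32 bits.
    m = 0x5bd1e995
    h = (0x9747b28c ^ len(data)) & 0xffffffff
    n4 = len(data) & ~3
    for i in range(0, n4, 4):
        k = int.from_bytes(data[i:i + 4], "little")
        k = (k * m) & 0xffffffff
        k ^= k >> 24
        k = (k * m) & 0xffffffff
        h = ((h * m) & 0xffffffff) ^ k
    # Mix in the remaining 1-3 bytes, if any.
    tail = data[n4:]
    if len(tail) == 3:
        h ^= tail[2] << 16
    if len(tail) >= 2:
        h ^= tail[1] << 8
    if len(tail) >= 1:
        h ^= tail[0]
        h = (h * m) & 0xffffffff
    # Final avalanche.
    h ^= h >> 13
    h = (h * m) & 0xffffffff
    h ^= h >> 15
    return h

def murmur2_partition(key: bytes, num_partitions: int) -> int:
    # Mask to a non-negative value (Java's toPositive), then modulo.
    return (murmur2(key) & 0x7fffffff) % num_partitions
```

If every client (JVM or librdkafka-based) hashes keys this way, `b"user-42"` always maps to the same partition regardless of the producing language.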
-
Getting sum type values from a map
As my first "real world" (ish) project in Vlang, I'm trying to copy https://github.com/confluentinc/confluent-kafka-go, which is a Go wrapper for the Kafka C client library, https://github.com/edenhill/librdkafka
-
Installing node-rdkafka on M1 for use with SASL
If you're using Kafka in a Node.js app, it's likely that you'll need node-rdkafka. This is a library that wraps the librdkafka library and makes it available in Node.js. According to the project's README, "All the complexity of balancing writes across partitions and managing (possibly ever-changing) brokers should be encapsulated in the library."
-
Introduction to Key Apache Kafka® Concepts
from configparser import ConfigParser
from confluent_kafka import Producer

# Parse the configuration.
# See https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md
config_parser = ConfigParser()
config_parser.read_file(args.config_file)
config = dict(config_parser['default'])

# Create Producer instance
producer = Producer(config)
-
video analytics on edge
• git clone https://github.com/edenhill/librdkafka.git
- librdkafka - the Apache Kafka C/C++ client library
kafka-python
-
kafka-python VS quix-streams - a user suggested alternative
2 projects | 7 Dec 2023
-
quix-streams VS kafka-python - a user suggested alternative
2 projects | 7 Dec 2023
Kafka-python is a producer-consumer library for one-message-at-a-time applications. Quix Streams is a Python stream-processing library for ML and AI applications. Use them together in your event streaming architecture.
-
Improving Kafka interfaces
kafka-python - https://github.com/dpkp/kafka-python
-
Monitor Kafka Producer and Consumer Metrics using Prometheus
If you're using kafka-python, take a look at its source code.
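kafka-python's `KafkaProducer` and `KafkaConsumer` expose a `metrics()` method that returns a nested dict of metric groups. Assuming that shape, here is a minimal stdlib-only sketch that renders such a dict into the Prometheus text exposition format (the function name and sample data are illustrative):

```python
def to_prometheus(metrics: dict) -> str:
    # Flatten {group: {metric: value}} into Prometheus text lines,
    # replacing characters that are invalid in metric names.
    lines = []
    for group, values in metrics.items():
        for name, value in values.items():
            if not isinstance(value, (int, float)):
                continue
            metric = f"{group}_{name}".replace("-", "_").replace(".", "_")
            lines.append(f"kafka_{metric} {value}")
    return "\n".join(lines)

sample = {"producer-metrics": {"record-send-rate": 12.5,
                               "request-latency-avg": 3.0}}
print(to_prometheus(sample))
```

In a real exporter you would call `producer.metrics()` on a timer and serve the rendered text on the `/metrics` endpoint that Prometheus scrapes.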
-
Transition from RPA to "traditional" programming role
Contribute to open source. Companies care about distributed systems at the moment, so if you can contribute to something like Kafka https://kafka.apache.org/project, a popular wrapper https://github.com/dpkp/kafka-python, or a distributed-systems platform https://github.com/redpanda-data/redpanda, that would be a strong signal that you can provide value and work with others as you deliver code.
-
New to kafka..
Eg this https://kafka-python.readthedocs.io/
-
Using Kafka with Python... is Confluent the only option?
Related issue thread - https://github.com/dpkp/kafka-python/issues/2290
-
Librdkafka – the Apache Kafka C/C++ client library
I've been working on a pure-Racket Kafka client[1] off and on (for fun) since February and it's a good amount of work. When the official protocol docs and Wireshark fail me, I usually look at librdkafka and kafka-python[2] to figure out how things are supposed to fit together. Kudos to the authors of both libraries for writing code that's easy to follow!
[1]: https://defn.io/2022/03/12/ann-racket-kafka/
[2]: https://github.com/dpkp/kafka-python
What are some alternatives?
CVE-2022-27254 - PoC for vulnerability in Honda's Remote Keyless System(CVE-2022-27254)
redis-py - Redis Python client
sarama - Sarama is a Go library for Apache Kafka. [Moved to: https://github.com/IBM/sarama]
PyMongo - The official Python driver for MongoDB
Karafka - Ruby and Rails efficient multithreaded Kafka processing framework
py2neo - Py2neo is a comprehensive toolkit for working with Neo4j from within Python applications or from the command line.
kafka-go - Kafka library in Go
kcat - Generic command line non-JVM Apache Kafka producer and consumer
rsyslog - a Rocket-fast SYStem for LOG processing
HappyBase - A developer-friendly Python library to interact with Apache HBase
rust-kafka-101 - Getting started with Rust and Kafka
Plyvel - Plyvel, a fast and feature-rich Python interface to LevelDB