librdkafka
gofakeit
| | librdkafka | gofakeit |
|---|---|---|
| Mentions | 18 | 10 |
| Stars | 7,292 | 4,206 |
| Growth | 1.2% | - |
| Activity | 8.3 | 9.5 |
| Latest commit | 4 days ago | about 1 month ago |
| Language | C | Go |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
librdkafka
-
Do you use Rust in your professional career?
recent PR: https://github.com/confluentinc/librdkafka/pull/4275
-
JR, quality Random Data from the Command line, part I
# Kafka configuration
# https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md
bootstrap.servers=
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=
sasl.password=
compression.type=gzip
compression.level=9
statistics.interval.ms=1000
-
A Critical Detail about Kafka Partitioners
But what about Kafka producer clients in other languages? The excellent librdkafka project is a C/C++ implementation of Kafka clients and is widely used for non-JVM Kafka applications. Additionally, Kafka clients in other languages (Python, C#) build on top of it. The default partitioner for librdkafka uses the CRC32 hash function to get the correct partition for a key.
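The key-to-partition step described above can be sketched in a few lines of stdlib Go. This is an illustration, not librdkafka's actual code: it assumes the standard IEEE CRC-32 polynomial (the default of Go's hash/crc32), and it skips the fallback that librdkafka's "consistent_random" partitioner applies to NULL/empty keys (those get a random partition).

```go
package main

import (
	"fmt"
	"hash/crc32"
)

// crc32Partition mimics the key-to-partition mapping of librdkafka's
// default consistent partitioning: CRC32 of the key, modulo the
// partition count. Same key in, same partition out.
func crc32Partition(key []byte, numPartitions uint32) uint32 {
	return crc32.ChecksumIEEE(key) % numPartitions
}

func main() {
	for _, key := range []string{"user-42", "user-43", "user-42"} {
		fmt.Printf("%s -> partition %d\n", key, crc32Partition([]byte(key), 6))
	}
}
```

Because CRC32 differs from the Java client's murmur2, the same key can land on a different partition depending on which client produced it, which is exactly the pitfall the article is pointing at.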
-
Horizontally scaling Kafka consumers with rendezvous hashing
We could have made some changes at the librdkafka level (see this), but we didn’t really want to pursue this (at least not yet).
-
Events with same key going to different partitions
You want records with the same key to always land on the same partition, so you need all the clients to use the same hashing algorithm. The easiest way to do that is to make sure the librdkafka client uses the java compatible murmur2_random hash algorithm. See “Partitioner” section here: https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md
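The murmur2 that `murmur2_random` matches is the well-known 32-bit MurmurHash2 variant the Apache Kafka Java client uses (seed 0x9747b28c), with the partition chosen as the hash masked positive, modulo the partition count. A stdlib Go sketch of that algorithm, for illustration only (in practice you just set the `partitioner` property in librdkafka rather than hashing yourself):

```go
package main

import "fmt"

// murmur2 is the 32-bit MurmurHash2 variant used by the Kafka Java
// client's default partitioner (seed 0x9747b28c, multiplier 0x5bd1e995).
func murmur2(data []byte) int32 {
	const (
		seed = uint32(0x9747b28c)
		m    = uint32(0x5bd1e995)
		r    = 24
	)
	h := seed ^ uint32(len(data))
	i := 0
	// Mix the body four bytes (little-endian) at a time.
	for ; i+4 <= len(data); i += 4 {
		k := uint32(data[i]) | uint32(data[i+1])<<8 |
			uint32(data[i+2])<<16 | uint32(data[i+3])<<24
		k *= m
		k ^= k >> r
		k *= m
		h *= m
		h ^= k
	}
	// Mix the remaining 1-3 tail bytes.
	switch len(data) - i {
	case 3:
		h ^= uint32(data[i+2]) << 16
		fallthrough
	case 2:
		h ^= uint32(data[i+1]) << 8
		fallthrough
	case 1:
		h ^= uint32(data[i])
		h *= m
	}
	// Final avalanche.
	h ^= h >> 13
	h *= m
	h ^= h >> 15
	return int32(h)
}

// partitionForKey applies the Java client's rule:
// positive(murmur2(key)) % numPartitions.
func partitionForKey(key []byte, numPartitions int32) int32 {
	return (murmur2(key) & 0x7fffffff) % numPartitions
}

func main() {
	fmt.Println(partitionForKey([]byte("order-123"), 12))
}
```

If every producer, whatever its language, computes partitions this way, records with the same key always co-locate on one partition.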
-
Getting sum type values from a map
As my first "real world" (ish) project in Vlang, I'm trying to copy https://github.com/confluentinc/confluent-kafka-go, which is a Go wrapper for the Kafka C client library, https://github.com/edenhill/librdkafka
-
Installing node-rdkafka on M1 for use with SASL
If you're using Kafka in a Node.js app, it's likely that you'll need node-rdkafka. This is a library that wraps the librdkafka library and makes it available in Node.js. According to the project's README, "All the complexity of balancing writes across partitions and managing (possibly ever-changing) brokers should be encapsulated in the library."
-
Introduction to Key Apache KafkaⓇ Concepts
from configparser import ConfigParser
from confluent_kafka import Producer

# Parse the configuration.
# See https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md
config_parser = ConfigParser()
config_parser.read_file(args.config_file)
config = dict(config_parser['default'])

# Create Producer instance
producer = Producer(config)
-
video analytics on edge
• git clone https://github.com/edenhill/librdkafka.git
- librdkafka - the Apache Kafka C/C++ client library
gofakeit
- I've made my first PR.
-
Show HN: Buyidentities.com
I have to admit that I fell into a rabbit hole, and I noticed that popular tools like fakerjs or gofakeit[0] did not meet my needs.
I needed to generate realistic-looking identities; the person's photo must match the gender, same for the age, the skin color of the person must correspond with the origin of the surname, the first name should be common in the targeted country, and the residential address must be real, among other things.
You would not use this for test data, btw; a common use case for this would be marketing or spamming operations where you need realistic data. My conscience does not accept the latter, however ;-)
[0] https://github.com/brianvoe/gofakeit
- Gofakeit New Functions!
-
dg - a fast relational data generator
Thank you! No, it’s just random data (here’s a link: https://github.com/brianvoe/gofakeit/blob/master/data/address.go)
-
JR, quality Random Data from the Command line, part I
So, is JR yet another faking library written in Go? Yes and no. JR does implement most of the APIs in fakerjs and gofakeit, but it can also stream data directly to stdout, Kafka, Redis and more (Elastic and MongoDB coming). JR talks directly to Confluent Schema Registry, manages json-schema and Avro schemas, and makes it easy to maintain coherence and referential integrity. If you need more than what JR offers out of the box, you can also pipe your data streams to other CLI tools like kcat.
-
TIL: panic(spew.Sdump(myVar))
Tangentially related, but there is a package out there called gofakeit (github.com/brianvoe/gofakeit) for generating random data, which doesn't sound like it entirely maps to what you're doing, but there may be some overlap.
-
Ask HN: What is the most impactful thing you've ever built?
It's not much, but I have had success with a random data generator package for Go called https://github.com/brianvoe/gofakeit. It's not life-changing, but hopefully it helps out enough developers.
- LGPD and faking sensitive data in the database - part 2
-
Creating a PDF With Go, Maroto & Gofakeit
Using mock data is a great way to speed up the prototyping process. We will use the GoFakeIt package to create a little dummy data generator to insert into our PDF.
-
Gofakeit v6. Now supports concurrency and crypto/rand
The new v6 release supports localized rand instances, which allow random data to be generated concurrently. crypto/rand has been integrated as well, and if you have a custom rand that fulfills the math/rand Source64 interface you can use it with gofakeit. Let me know your thoughts.
https://github.com/brianvoe/gofakeit
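The Source64 interface mentioned in the announcement is the stdlib's own `math/rand` abstraction (what `rand.NewSource` returns also implements it). A dependency-free sketch of the concurrency idea — one seeded generator per goroutine instead of a shared, locked one; the gofakeit API itself is not shown here:

```go
package main

import (
	"fmt"
	"math/rand"
	"sync"
)

// perGoroutineDraws spins up n goroutines, each with its own seeded
// *rand.Rand (rand.NewSource's result also implements rand.Source64),
// so no locking is needed around the generators themselves.
func perGoroutineDraws(n int) []int {
	var wg sync.WaitGroup
	out := make([]int, n)
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			r := rand.New(rand.NewSource(int64(id))) // independent generator
			out[id] = r.Intn(1000)                   // each writes its own slot
		}(i)
	}
	wg.Wait()
	return out
}

func main() {
	fmt.Println(perGoroutineDraws(4))
}
```

Because each goroutine seeds its own source, runs are both race-free and reproducible per seed.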
What are some alternatives?
CVE-2022-27254 - PoC for vulnerability in Honda's Remote Keyless System (CVE-2022-27254)
bitio - Optimized bit-level Reader and Writer for Go.
sarama - Sarama is a Go library for Apache Kafka. [Moved to: https://github.com/IBM/sarama]
conv - Fast conversions across various Go types with a simple API.
Karafka - Ruby and Rails efficient multithreaded Kafka processing framework
uuid - Generate, encode, and decode UUIDs v1 with fast or cryptographic-quality random node identifier.
kafka-go - Kafka library in Go
browscap_go - GoLang Library for Browser Capabilities Project
rsyslog - a Rocket-fast SYStem for LOG processing
autoflags - Populate go command line app flags from config struct
rust-kafka-101 - Getting started with Rust and Kafka
base64Captcha - captcha of base64 image string