gofakeit vs schema-registry
| | gofakeit | schema-registry |
|---|---|---|
| Mentions | 10 | 7 |
| Stars | 4,159 | 2,121 |
| Growth | - | 1.4% |
| Activity | 9.5 | 10.0 |
| Latest commit | about 20 hours ago | 6 days ago |
| Language | Go | Java |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
gofakeit
-
Show HN: Buyidentities.com
I have to admit that I fell into a rabbit hole, and I noticed that popular tools like fakerjs or gofakeit[0] did not meet my needs.
I needed to generate realistic-looking identities; the person's photo must match the gender, same for the age, the skin color of the person must correspond with the origin of the surname, the first name should be common in the targeted country, and the residential address must be real, among other things.
You would not use this for test data, by the way; a common use case for this would be marketing or spamming operations where you need realistic data. My conscience does not accept the latter, however ;-)
-
dg - a fast relational data generator
Thank you! No, it’s just random data (here’s a link: https://github.com/brianvoe/gofakeit/blob/master/data/address.go)
-
JR, quality Random Data from the Command line, part I
So, is JR yet another faking library written in Go? Yes and no. JR indeed implements most of the APIs in fakerjs and gofakeit, but it's also able to stream data directly to stdout, Kafka, Redis and more (Elastic and MongoDB coming). JR can talk directly to Confluent Schema Registry, manage JSON Schema and Avro schemas, and easily maintain coherence and referential integrity. If you need more than what is OOTB in JR, you can also easily pipe your data streams to other CLI tools like kcat thanks to its flexibility.
-
TIL: panic(spew.Sdump(myVar))
Tangentially related, but there is a package out there called gofakeit (github.com/brianvoe/gofakeit) for generating random data, which doesn't sound like it entirely maps with what you're doing, but there may be some overlap.
-
Ask HN: What is the most impactful thing you've ever built?
It's not much, but I have had success with a random data generator package for Go called https://github.com/brianvoe/gofakeit. It's not life-changing, but hopefully it helps out enough developers.
-
LGPD and faking sensitive data in the database - part 2
-
Creating a PDF With Go, Maroto & Gofakeit
Using mock data is a great way to speed up the prototyping process. We will use the GoFakeIt package to create a little dummy data generator to insert into our PDF.
schema-registry
-
Testing a Kafka consumer with Avro schema messages in your Spring Boot application with Testcontainers
So that means we can configure the Kafka producer and consumer with an imaginary schema registry URL that only needs to start with "mock://", and you automatically get to work with the MockSchemaRegistryClient. This way you don't need to explicitly initiate the MockSchemaRegistryClient and configure everything accordingly. That also eliminates the need for the Confluent Schema Registry container. By running the Kafka Testcontainer with the embedded Zookeeper, we no longer need an extra Zookeeper container, and we are down to one Testcontainer for the messaging. This way I ended up with only two Testcontainers: Kafka and the database.
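Concretely, the trick is just a URL scheme in the test configuration: Confluent's serializers recognize a `schema.registry.url` beginning with `mock://` and back it with an in-memory mock registry. A sketch of what that might look like in a Spring Boot test properties file (the scope name `test-registry` and the exact property layout are illustrative assumptions, not taken from the article):

```properties
# Test-only configuration: the "mock://" scheme is recognized by
# Confluent's Avro serializer/deserializer, which then uses an
# in-memory mock schema registry instead of a real HTTP endpoint.
spring.kafka.producer.properties.schema.registry.url=mock://test-registry
spring.kafka.consumer.properties.schema.registry.url=mock://test-registry
```

Producer and consumer must use the same scope name after `mock://` so they share the same in-memory registry instance.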
-
Confluent Schema Registry and Rust
Confluent is a company founded by the creators of Apache Kafka. They provide the Confluent Platform, which consists of several components, all based on Kafka. The license for these components varies. The Schema Registry has the Confluent Community License, which basically means it's free to use as long as you don't offer the Schema Registry itself as a SaaS solution. The source code can be found on GitHub.
-
An Overview About the Different Kafka Connect Plugins
Schema Registry from Confluent (GitHub) => http://localhost:8081/
What are some alternatives?
kafka-ui - Open-Source Web UI for Apache Kafka Management
kafdrop - Kafka Web UI
schema-registry-gitops - Manage Confluent Schema Registry subjects through Infrastructure as code
rust-rdkafka - A fully asynchronous, futures-based Kafka client library for Rust based on librdkafka
kafka-avro-without-registry - Test Spring Kafka application (using Avro as a serialization mechanism) without the need for Confluent Schema Registry
bitio - Optimized bit-level Reader and Writer for Go.
Protobuf - Protocol Buffers - Google's data interchange format
conv - Fast conversions across various Go types with a simple API.
uuid - Generate, encode, and decode UUIDs v1 with fast or cryptographic-quality random node identifier.
browscap_go - GoLang Library for Browser Capabilities Project
autoflags - Populate go command line app flags from config struct
base64Captcha - captcha of base64 image string