kafkacat
logstash-logback-encoder
| | kafkacat | logstash-logback-encoder |
|---|---|---|
| Mentions | 8 | 8 |
| Stars | 3,573 | 2,386 |
| Growth | - | 1.2% |
| Activity | 7.3 | 5.2 |
| Latest commit | over 2 years ago | 20 days ago |
| Language | C | Java |
| License | GNU General Public License v3.0 or later | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
kafkacat
- Build a data ingestion pipeline using Kafka, Flink, and CrateDB
  To communicate with Kafka, you can use kafkacat, a command-line tool that lets you produce and consume Kafka messages with a very simple syntax. It also lets you view topic metadata.
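  That syntax can be sketched as follows; the broker address and topic name here are placeholders:

  ```shell
  # List brokers, topics, and partitions (metadata mode)
  kafkacat -L -b localhost:9092

  # Produce a message to a topic (producer mode reads stdin)
  echo 'hello' | kafkacat -P -b localhost:9092 -t my.topic

  # Consume messages from the beginning of a topic
  kafkacat -C -b localhost:9092 -t my.topic -o beginning
  ```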
- Event Streaming Like it's 1978
  Feels like you could get pretty far with kafkacat and a SQLite database.
- ZooKeeper-free Kafka is out. First Demo
- Kafcat 0.1.1 release -- a cat for kafka
  This is the second release of Kafcat, a fully async Rust rewrite of kafkacat.
- Getting started with Kafka - Part 2
- Spring Cloud Sleuth in action
  Consume from the Kafka topic my.topic with kafkacat:
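  The command itself is not included in the excerpt; with kafkacat it would look something like this (the broker address is an assumption):

  ```shell
  # Consume from the topic my.topic (consumer mode)
  kafkacat -C -b localhost:9092 -t my.topic
  ```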
- 5 Things Every Apache Kafka Developer Should Know
  From the code above, you can see that to process the headers you simply call the ConsumerRecord.headers() method. In our example we print the headers to the console for demonstration purposes; once you have access to them, you can process them as needed. For reading headers from the command line, KIP-431 adds support for optionally printing headers from the ConsoleConsumer, which will be available in the Apache Kafka 2.7.0 release. You can also use kafkacat to view headers from the command line. Here's an example command:
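  The article's command is not reproduced in the excerpt; kafkacat can render headers through its -f format string, where %h expands to the record's headers and %s to its value (broker and topic are placeholders):

  ```shell
  # Print each record's headers alongside its value
  kafkacat -C -b localhost:9092 -t my.topic -f 'Headers: %h  Value: %s\n'
  ```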
- Streaming data into Kafka S01/E04 — Loading Log files using Grok Expression
  Note: in the example above, we used kafkacat to consume the topics. The option -o -1 is used to consume only the latest message.
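  For reference, -o -1 sets the starting offset to one before the end of the partition, so combined with -c 1 (message count) and -e (exit at end of input) it reads just the last message; broker and topic here are placeholders:

  ```shell
  # Read only the most recent message from the topic
  kafkacat -C -b localhost:9092 -t my.topic -o -1 -c 1 -e
  ```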
logstash-logback-encoder
- Tracing: Structured Logging, but better in every way
- Do you have a guideline on logging
  I use the logstash JSON format.
- How to do JSON logging in Scala?
  We're using https://github.com/logfellow/logstash-logback-encoder with logback (on Play Framework, but it should work fine on Lambda as well).
- JSON logging for JSON REST services vs performance
  For those interested in the details, I've created an example implementation based on Spring-flavoured REST and Logbook + logstash-logback-encoder within my own json-log-filter project as a PoC / reference.
- Echopraxia, a better Java Logging API
  What's the difference to https://github.com/logfellow/logstash-logback-encoder?
- Is it reasonable to transform log4j logs via a configuration file?
  Don't use Filebeat; it is for systems whose logging you cannot change. Push logs directly to Logstash via a Logstash appender. Since I'm mainly a logback user, there's one by logstash directly at https://github.com/logstash/logstash-logback-encoder. A quick search indicates there's https://github.com/viskan/logstash-appender/ for log4j as well, and it also seems to support MDC abuse, as indicated by https://github.com/viskan/logstash-appender/blob/master/src/main/java/com/viskan/log4j/logstash/appender/LogstashAppender.java#L256. By abusing the MDC you won't need to write a processing pattern in Logstash to extract metadata from one giant blob of a line, since each key in the MDC becomes its own field, making your records in Elasticsearch more useful.
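  As a sketch, shipping JSON logs (MDC keys included) straight to Logstash with logstash-logback-encoder's TCP appender looks roughly like this logback.xml fragment; the host, port, and appender name are assumptions:

  ```xml
  <configuration>
    <!-- Ship JSON events, including MDC keys, directly to Logstash over TCP -->
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
      <destination>logstash.example.com:5000</destination>
      <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <root level="INFO">
      <appender-ref ref="LOGSTASH"/>
    </root>
  </configuration>
  ```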
- Java Spring Application logging to WS endpoint
  You can use the Logstash Logback encoder. You mentioned ELK, so there must be a Logstash instance running somewhere that you can connect to with this appender.
- Spring Cloud Sleuth in action
  We need to add traceId and spanId values to the application log. In production we would use the logstash-logback-encoder to generate logs in JSON format and send them to an ELK stack, but for the demo we use this plain-text logback layout:
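  The article's layout is not included in the excerpt; a plain-text pattern that pulls Sleuth's traceId and spanId out of the MDC might look like this (the exact pattern is an assumption):

  ```xml
  <configuration>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
      <encoder>
        <!-- traceId and spanId are MDC keys populated by Spring Cloud Sleuth -->
        <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level [%X{traceId},%X{spanId}] %logger{36} - %msg%n</pattern>
      </encoder>
    </appender>
    <root level="INFO">
      <appender-ref ref="CONSOLE"/>
    </root>
  </configuration>
  ```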
What are some alternatives?
Docker Compose - Define and run multi-container applications with Docker
spring-cloud-sleuth-in-action - 🍀 Spring Cloud Sleuth in Action
redpanda - Redpanda is a streaming data platform for developers. Kafka API compatible. 10x faster. No ZooKeeper. No JVM!
logback-gelf - Logback appender for sending GELF messages with zero additional dependencies.
kafcat - a rust port of kafkacat
zipkin - Zipkin is a distributed tracing system
Apache Kafka - Mirror of Apache Kafka
logstash-appender - A log4j appender that sends raw JSON directly to Logstash
jetstream - JetStream Utilities
logback-android - 📄The reliable, generic, fast and flexible logging framework for Android
java-pubsublite-kafka
json-log-filter - World's fastest JSON filter for the JVM