danielwegener / logback-kafka-appender

Logback appender for Apache Kafka
Apache License 2.0

Using LogstashEncoder, why is all data included in the message field? #54

Closed: syusuke closed this issue 6 years ago

syusuke commented 7 years ago

Hi. I am using the tag issue-51-logback12-encoders.

<appender name="myKafkaAppender" class="com.github.danielwegener.logback.kafka.KafkaAppender">
        <topic>gateway</topic>
        <keyingStrategy class="com.github.danielwegener.logback.kafka.keying.RoundRobinKeyingStrategy"/>
        <deliveryStrategy class="com.github.danielwegener.logback.kafka.delivery.AsynchronousDeliveryStrategy"/>
        <producerConfig>bootstrap.servers=192.168.10.126:9092</producerConfig>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder">
                <mdc/>
                <context/>
                <message/>
        </encoder>
    </appender>
{
      "message" => "{\"@timestamp\":\"2017-11-14T19:25:28.700+08:00\",\"@version\":1,\"message\":\"test\",\"logger_name\":\"com.lvkerry.test.TestLogBack\",\"thread_name\":\"main\",\"level\":\"INFO\",\"level_value\":20000,\"name1\":\"value1\"}\r\n",
     "@version" => "1",
    "@timestamp" => 2017-11-14T11:24:46.147Z
}

Why is all the data inside the message field? Thanks.

danielwegener commented 7 years ago

What are you using to pump the messages from Kafka to Elasticsearch? It looks like the "importer" (or whatever it is called) reads the Kafka message as a plain string (and therefore as the message payload) rather than interpreting it as a JSON-encoded log event.

syusuke commented 7 years ago

logback -> kafka -> logstash. The JSON above comes from the Logstash console log, printed with:

stdout {
    codec => rubydebug
}
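
If the Kafka input plugin is left at its default plain codec, the whole record value stays a single string, which would match the output above. A hypothetical sketch of such a pipeline (not the actual config from this thread; broker and topic reused from the appender configuration):

input {
    kafka {
        bootstrap_servers => "192.168.10.126:9092"
        topics => ["gateway"]
        # no codec configured: the default "plain" codec keeps the encoded
        # JSON as one raw string, so everything lands in the message field
    }
}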

My Java code:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import static net.logstash.logback.marker.Markers.*;

public class TestLogBack {
    private static final Logger logger = LoggerFactory.getLogger(TestLogBack.class);

    public static void main(String[] args) {
        // append() attaches "name1" => "value1" as a structured field on the log event
        logger.info(append("name1", "value1"), "test");
    }
}

A similar configuration, but using LogstashTcpSocketAppender:

<appender name="TCP" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>192.168.10.126:9250</destination>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder">
        <mdc/>
        <context/>
        <message/>
    </encoder>
</appender>
{
     "@timestamp" => 2017-11-14T22:50:23.903Z,
          "level" => "INFO",
           "port" => 34822,
    "thread_name" => "main",
    "level_value" => 20000,
       "@version" => 1,
           "host" => "192.168.10.174",
    "logger_name" => "com.lvkerry.test.TestLogBack",
        "message" => "test",
          "name1" => "value1"
}

This JSON format is what I expected.
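
The TCP path yields separate fields because the Logstash tcp input is typically paired with a JSON codec. A minimal sketch, assuming codec => json_lines to decode each newline-delimited event:

input {
    tcp {
        port => 9250
        # decode each newline-delimited JSON log event into separate fields
        codec => json_lines
    }
}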

danielwegener commented 6 years ago

I assume you use Logstash's logstash-input-kafka plugin.

It would help if you provided the relevant parts of your Logstash config. logback-kafka-appender does exactly what it is supposed to do: it encodes the LogEvent using the given LogstashEncoder and writes the result as the payload of a Kafka message.

You must configure Logstash so that it interprets the payload of the message as JSON containing the log entry's fields. Have you tried setting the plugin's codec property to codec => "json"?
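
For example, a sketch assuming the logstash-input-kafka plugin with Kafka 0.9+ style options (adjust broker and topic to your setup):

input {
    kafka {
        bootstrap_servers => "192.168.10.126:9092"
        topics => ["gateway"]
        # parse each Kafka record value as JSON instead of a plain string
        codec => "json"
    }
}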

syusuke commented 6 years ago

It was my mistake: a configuration problem on my side. Thank you very much.