What are you using to pump the messages from Kafka to Elasticsearch? It looks like the "importer" (or whatever it is called) reads the Kafka message payload as a plain string rather than interpreting it as a JSON-encoded logback event.
logback -> kafka -> logstash.
This JSON comes from the logstash console log, printed with:
stdout {
  codec => rubydebug
}
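For context, the kafka-to-logstash side of such a pipeline usually looks roughly like the sketch below; the broker address and topic name are made up for illustration, not taken from my actual setup.

input {
  kafka {
    bootstrap_servers => "localhost:9092"   # made-up broker address
    topics => ["app-logs"]                  # made-up topic name
    # no codec is set here, so the default plain codec treats the
    # encoded log event as one opaque string
  }
}

output {
  stdout {
    codec => rubydebug
  }
}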
My Java code:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import static net.logstash.logback.marker.Markers.*;
......
Logger logger = LoggerFactory.getLogger(TestLogBack.class);
logger.info(append("name1", "value1"), "test");
For comparison, with a similar configuration using LogstashTcpSocketAppender:
<appender name="TCP" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
<destination>192.168.10.126:9250</destination>
<encoder class="net.logstash.logback.encoder.LogstashEncoder">
<mdc/>
<context/>
<message/>
</encoder>
</appender>
{
  "@timestamp" => 2017-11-14T22:50:23.903Z,
  "level" => "INFO",
  "port" => 34822,
  "thread_name" => "main",
  "level_value" => 20000,
  "@version" => 1,
  "host" => "192.168.10.174",
  "logger_name" => "com.lvkerry.test.TestLogBack",
  "message" => "test",
  "name1" => "value1"
}
This JSON format is what I expected.
I assume you use logstash's kafka input plugin.
It would help if you provided the relevant parts of your logstash config. logback-kafka-appender does exactly what it is supposed to do: it encodes the LogEvent using the given LogstashEncoder and writes the result as the payload of a Kafka message. You must configure logstash so that it interprets that payload as JSON containing the log entry's fields. Have you tried setting the input plugin's codec property to "json"?
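A minimal kafka input with that codec might look like the sketch below; the broker address and topic name are placeholders, not values from this issue.

input {
  kafka {
    bootstrap_servers => "localhost:9092"   # placeholder broker address
    topics => ["app-logs"]                  # placeholder topic name
    codec => "json"   # parse the LogstashEncoder payload into separate event fields
  }
}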
It was my mistake, a problem in my configuration. Thank you very much.
Hi. I use tag issue-51-logback12-encoders. Why is all the data in the message field? Thanks.