zp245491220 opened 1 year ago
Did you add the settings by referring to the guide below?
Please also share the log library name and version you use.
Me too.
version: 2.3.5
<appender name="KAFKA-EVENTS" class="com.github.danielwegener.logback.kafka.KafkaAppender">
  <encoder charset="UTF-8" class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
    <providers>
      <pattern>
        <pattern>
          {
            <!-- log timestamp -->
            "logtime": "%date{yyyy-MM-dd HH:mm:ss.SSS}",
            <!-- service name -->
            "app_name": "${applicationName}",
            <!-- deployment environment -->
            "env": "${kafkaEnv}",
            <!-- thread name -->
            "thread": "%thread",
            <!-- log level -->
            "level": "%level",
            <!-- fully qualified class name -->
            "class": "%logger",
            <!-- method name -->
            "method": "%method",
            <!-- line number -->
            "line": "%line",
            <!-- trace id -->
            "PtxId": "[TxId : %X{PtxId} , SpanId : %X{PspanId}]",
            <!-- log message -->
            "message": "%message",
            <!-- exception stack trace -->
            "stack_trace": "%xEx"
          }
        </pattern>
      </pattern>
    </providers>
  </encoder>
  <topic>${kafkaTopic}</topic>
  <keyingStrategy class="com.github.danielwegener.logback.kafka.keying.NoKeyKeyingStrategy"/>
  <deliveryStrategy class="com.github.danielwegener.logback.kafka.delivery.AsynchronousDeliveryStrategy"/>
  <producerConfig>bootstrap.servers=${kafkaBootstrap}</producerConfig>
  <producerConfig>acks=0</producerConfig>
  <producerConfig>linger.ms=100</producerConfig>
  <producerConfig>max.block.ms=100</producerConfig>
  <producerConfig>client.id=${HOSTNAME}-${CONTEXT_NAME}-logback-relaxed</producerConfig>
  <producerConfig>retries=1</producerConfig>
  <producerConfig>batch.size=16384</producerConfig>
  <producerConfig>buffer.memory=33554432</producerConfig>
  <producerConfig>max.request.size=2097152</producerConfig>
  <producerConfig>sasl.mechanism=${kafkaSaslMechanism}</producerConfig>
  <producerConfig>security.protocol=SASL_PLAINTEXT</producerConfig>
  <producerConfig>sasl.jaas.config=${kafkaJaasConfig}</producerConfig>
</appender>
profiler.logback.logging.transactioninfo=true
@CKReliancd The appender (com.github.danielwegener.logback.kafka.KafkaAppender) you are using is not an appender provided by logback, but a KafkaAppender created by a third party. That is why profiling doesn't work.
https://github.com/danielwegener/logback-kafka-appender/tree/master
For reference, the Pinpoint logback plugin modifies the ch.qos.logback.classic.spi.LoggingEvent object. To enable profiling, I think you would need to change the KafkaAppender so that it consumes the LoggingEvent object the same way the appenders provided by logback do.
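For illustration only, here is a minimal sketch of a custom appender that consumes logback's own ILoggingEvent, which is the object the Pinpoint plugin modifies. This is an assumption about the approach, not a drop-in replacement for KafkaAppender:

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.AppenderBase;

// Sketch: an appender built on logback's own AppenderBase receives
// the ILoggingEvent that the Pinpoint plugin has already touched, so
// plugin-injected MDC entries such as PtxId are visible here.
public class TraceAwareAppender extends AppenderBase<ILoggingEvent> {

    @Override
    protected void append(ILoggingEvent event) {
        // Read the transaction id from the event's MDC map; this is the
        // same value that %X{PtxId} resolves in an encoder pattern.
        String ptxId = event.getMDCPropertyMap().get("PtxId");
        System.out.println("[PtxId=" + ptxId + "] " + event.getFormattedMessage());
    }
}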
Alternatively, you could develop your own plugin, or find a suitable place to store the transactionId in the MDC yourself; a sketch of the MDC approach follows below. For reference, see this interceptor: https://github.com/pinpoint-apm/pinpoint/blob/master/agent-module/plugins/logback/src/main/java/com/navercorp/pinpoint/plugin/logback/interceptor/LoggingEventOfLogbackInterceptor.java
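A minimal sketch of the MDC option, assuming a servlet environment (Servlet 4.0+, where Filter's init/destroy have default implementations). currentTransactionId() is a hypothetical placeholder for however you obtain the id; Pinpoint normally injects it via an interceptor like the one linked above:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import org.slf4j.MDC;

// Sketch: put a trace id into the MDC around each request so logback
// patterns like %X{PtxId} can resolve it in any appender.
public class TraceIdMdcFilter implements Filter {

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        MDC.put("PtxId", currentTransactionId()); // hypothetical lookup
        try {
            chain.doFilter(request, response);
        } finally {
            MDC.remove("PtxId"); // avoid leaking the id across pooled threads
        }
    }

    // Hypothetical helper; replace with your own way of obtaining the id.
    private String currentTransactionId() {
        return "TX-unknown";
    }
}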
Pinpoint version: 2.3.4
If I don't log anything in my project, my aspect reads a null value for PtxId; however, once something has been logged, I can get the value. What could be the reason for this?