pinpoint-apm / pinpoint

APM (Application Performance Management) tool for large-scale distributed systems.
https://pinpoint-apm.gitbook.io/
Apache License 2.0

logback log ptxId #10142

Open zp245491220 opened 1 year ago

zp245491220 commented 1 year ago

pinpoint version: 2.3.4

If I don't log anything in my project, the PtxId value I read from the aspect is null. However, once I have logged something, I can read the value. What could be the reason for this?
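
For context, a minimal sketch of the kind of aspect being described, assuming the value is read from the SLF4J MDC keys PtxId / PspanId (the pointcut and class names here are made up for illustration):

    import org.aspectj.lang.ProceedingJoinPoint;
    import org.aspectj.lang.annotation.Around;
    import org.aspectj.lang.annotation.Aspect;
    import org.slf4j.MDC;
    import org.springframework.stereotype.Component;

    // Hypothetical aspect reading the Pinpoint transaction id from the MDC.
    @Aspect
    @Component
    public class TxIdAspect {

        @Around("execution(* com.example..*Service.*(..))") // example pointcut only
        public Object readTxId(ProceedingJoinPoint pjp) throws Throwable {
            String ptxId = MDC.get("PtxId");     // observed to be null until something has been logged
            String pspanId = MDC.get("PspanId");
            // ... use ptxId/pspanId here
            return pjp.proceed();
        }
    }

If that matches the setup, seeing null until something is logged would be consistent with the transaction info only being written when a logging event is actually created and intercepted (see the interceptor linked at the end of this thread).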

minwoo-jung commented 1 year ago

Did you add the settings by referring to the guide below?

Please also share the log library name and version you use.
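
For reference, the guide's logback setup comes down to two things: enabling the option in pinpoint.config and referencing the MDC keys PtxId / PspanId (e.g. %X{PtxId}) in the log pattern. A minimal sketch of the pinpoint.config side, with the property name as it appears in the guide:

    # pinpoint.config
    profiler.logback.logging.transactioninfo=true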

CKReliancd commented 1 week ago

me too

CKReliancd commented 1 week ago

version: 2.3.5

CKReliancd commented 1 week ago
        <appender name="KAFKA-EVENTS" class="com.github.danielwegener.logback.kafka.KafkaAppender">
            <encoder charset="UTF-8" class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
                <providers>
                    <pattern>
                        <pattern>
                            {
                            <!-- timestamp -->
                            "logtime": "%date{yyyy-MM-dd HH:mm:ss.SSS}",
                            <!-- service name -->
                            "app_name": "${applicationName}",
                            <!-- deployment environment -->
                            "env": "${kafkaEnv}",
                            <!-- thread name -->
                            "thread": "%thread",
                            <!-- log level -->
                            "level": "%level",
                            <!-- fully qualified class name -->
                            "class": "%logger",
                            <!-- method name -->
                            "method": "%method",
                            <!-- line number -->
                            "line": "%line",
                            <!-- trace id -->
                            "PtxId": "[TxId : %X{PtxId} , SpanId : %X{PspanId}]",
                            <!-- log message -->
                            "message": "%message",
                            <!-- exception stack trace -->
                            "stack_trace": "%xEx"
                            }
                        </pattern>
                    </pattern>
                </providers>
            </encoder>
            <topic>${kafkaTopic}</topic>
            <keyingStrategy class="com.github.danielwegener.logback.kafka.keying.NoKeyKeyingStrategy"/>
            <deliveryStrategy class="com.github.danielwegener.logback.kafka.delivery.AsynchronousDeliveryStrategy"/>
            <producerConfig>bootstrap.servers=${kafkaBootstrap}</producerConfig>
            <producerConfig>acks=0</producerConfig>
            <producerConfig>linger.ms=100</producerConfig>
            <producerConfig>max.block.ms=100</producerConfig>
            <producerConfig>client.id=${HOSTNAME}-${CONTEXT_NAME}-logback-relaxed</producerConfig>
            <producerConfig>retries=1</producerConfig>
            <producerConfig>batch.size=16384</producerConfig>
            <producerConfig>buffer.memory=33554432</producerConfig>
            <producerConfig>max.request.size=2097152</producerConfig>
            <producerConfig>sasl.mechanism=${kafkaSaslMechanism}</producerConfig>
            <producerConfig>security.protocol=SASL_PLAINTEXT</producerConfig>
            <producerConfig>sasl.jaas.config=${kafkaJaasConfig}</producerConfig>
        </appender>
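
A quick way to check whether the agent is filling in the MDC at all, independently of the Kafka appender, is a plain logback console appender whose pattern references the same keys. A minimal sketch (the appender name and pattern are just examples):

    <!-- logback.xml: verify that %X{PtxId} / %X{PspanId} get populated -->
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level [TxId:%X{PtxId} SpanId:%X{PspanId}] %logger - %msg%n</pattern>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="STDOUT"/>
        <appender-ref ref="KAFKA-EVENTS"/>
    </root>
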
CKReliancd commented 1 week ago

logback (guide url: https://github.com/naver/pinpoint/blob/master/doc/per-request_feature_guide.md)

    profiler.logback.logging.transactioninfo=true

    # variables/aliases: https://github.com/qos-ch/logback/blob/master/logback-classic/src/main/java/ch/qos/logback/classic/PatternLayout.java#L54-L149
    profiler.logback.logging.pattern.replace.enable=false
    profiler.logback.logging.pattern.replace.search=%message,%msg,%m
    profiler.logback.logging.pattern.replace.with="TxId:%X{PtxId} %msg"
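
If the intent is to have the agent rewrite the message pattern itself (rather than referencing %X{PtxId} directly in the encoder pattern, as the Kafka appender above already does), the pattern.replace properties would presumably need to be enabled; with enable=false they do nothing. A sketch using the same values as above:

    profiler.logback.logging.pattern.replace.enable=true
    profiler.logback.logging.pattern.replace.search=%message,%msg,%m
    profiler.logback.logging.pattern.replace.with="TxId:%X{PtxId} %msg"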

minwoo-jung commented 6 days ago

@CKReliancd The appender (com.github.danielwegener.logback.kafka.KafkaAppender) you are using is not an appender provided by logback, but a KafkaAppender created by a third party. That's why profiling doesn't work.

https://github.com/danielwegener/logback-kafka-appender/tree/master

For reference, the logback plugin modifies the ch.qos.logback.classic.spi.LoggingEvent object. I think you need to change the Kafka appender so that it uses the LoggingEvent object the same way the appenders provided by logback do, in order to enable profiling.
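
A minimal sketch of that shape, assuming the agent has already populated the event's MDC (whether a particular appender class is actually instrumented depends on the plugin); the class name here is made up:

    import java.util.Map;

    import ch.qos.logback.classic.spi.ILoggingEvent;
    import ch.qos.logback.core.AppenderBase;

    // Sketch of an appender that consumes logback's own ILoggingEvent and reads
    // the transaction info from the event's MDC property map.
    public class TxIdAwareAppender extends AppenderBase<ILoggingEvent> {

        @Override
        protected void append(ILoggingEvent event) {
            Map<String, String> mdc = event.getMDCPropertyMap();
            String ptxId = mdc.get("PtxId");      // null when instrumentation did not run
            String pspanId = mdc.get("PspanId");
            // Forward the event plus ptxId/pspanId to your own sink (e.g. a Kafka producer).
            System.out.printf("[TxId:%s SpanId:%s] %s%n", ptxId, pspanId, event.getFormattedMessage());
        }
    }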

Alternatively, you could develop your own plugin. Another option is to find a suitable place to store the transactionId in the MDC yourself. See the interceptor below:
https://github.com/pinpoint-apm/pinpoint/blob/master/agent-module/plugins/logback/src/main/java/com/navercorp/pinpoint/plugin/logback/interceptor/LoggingEventOfLogbackInterceptor.java
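
For the "store it in the MDC yourself" route, the general plumbing looks like the sketch below; note that the placeholder id is not the agent's real TxId, which only your own plugin/interceptor can supply (the filter class and helper method are made up):

    import java.io.IOException;

    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;

    import org.slf4j.MDC;

    // Sketch of per-request MDC plumbing so %X{PtxId} is visible to any appender.
    public class TxIdMdcFilter implements Filter {

        @Override
        public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
                throws IOException, ServletException {
            String txId = resolveTransactionId(request); // placeholder: supply the id from your own plugin
            MDC.put("PtxId", txId);
            try {
                chain.doFilter(request, response);
            } finally {
                MDC.remove("PtxId"); // MDC is thread-bound, always clean up
            }
        }

        private String resolveTransactionId(ServletRequest request) {
            return "your-transaction-id"; // hypothetical stand-in
        }
    }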