Open andreaturli opened 5 years ago
I would just like to add some context to illustrate why this issue is important.
Logback is the default logging implementation for the popular Spring Boot framework, yet ecs-logging-java for Logback does not appear to have much support for structured logging of arbitrary fields. Indeed, the entire encoder has very limited configurability.
Log statements such as
log.debug("List of files: {}.", localFiles.toString());
log.info("Checking {} prefixes in S3.", prefixesToCheck.size());
simply result in a formatted message property, as illustrated below. The extra information must be parsed back out of message to be of any use, which defeats the purpose of using a JSON-formatted log event in the first place.
{
  "message": "List of files: [file1.txt]"
}
{
  "message": "Checking 5 prefixes in S3."
}
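To make the pain concrete: recovering a value from the rendered message requires brittle string parsing. A minimal stand-alone sketch (the class name and regex are purely illustrative, not from any library):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ParseFromMessage {

    // Illustrative only: pull the count back out of the rendered message text
    static int extractPrefixCount(String message) {
        Matcher m = Pattern.compile("Checking (\\d+) prefixes").matcher(message);
        return m.find() ? Integer.parseInt(m.group(1)) : -1;
    }

    public static void main(String[] args) {
        // Any wording change in the log statement silently breaks this
        System.out.println(extractPrefixCount("Checking 5 prefixes in S3."));
    }
}
```

Every consumer of the logs has to maintain patterns like this, and a harmless rewording of the log statement breaks them silently.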
If you look at how this encoder serializes the JSON, you will see that the structure is quite rigid. The only variable aspect is additionalFields, which you have to statically pre-configure:
<additionalField>
  <key>key1</key>
  <value>value1</value>
</additionalField>
<additionalField>
  <key>key2</key>
  <value>value2</value>
</additionalField>
It is possible to try to leverage some of the underlying SLF4J components, such as LoggingEventBuilder (javadoc).
Example
int newT = 15;
int oldT = 16;
// using the classical API (the argument order must match the placeholders)
logger.debug("oldT={} newT={} Temperature changed.", oldT, newT);
// using fluent API
logger.atDebug()
.addKeyValue("oldT", oldT)
.addKeyValue("newT", newT)
.log("Temperature changed.");
But it is believed that this similarly just modifies the message property.
With logback-logstash-encoder
one could write structured log statements like this
// requires: import static net.logstash.logback.argument.StructuredArguments.kv;
log.debug("List of files: {}.", kv("namespace.file_list", localFiles));
log.info("Checking {} prefixes in S3.", kv("namespace.s3_prefixes_to_check", prefixesToCheck.size()));
Which results in JSON like this
{
  "message": "List of files: namespace.file_list=file1.txt",
  "namespace.file_list": ["file1.txt"]
}
{
  "message": "Checking namespace.s3_prefixes_to_check=5 prefixes in S3.",
  "namespace.s3_prefixes_to_check": 5
}
So now we can easily monitor these logged fields such as namespace.s3_prefixes_to_check
and search/filter for it, aggregate it, visualize it, alert on it, run anomaly detection for it, and so on.
ecs-logging-java for Log4j2, on the other hand, does appear to have something similar, described here:
By leveraging log4j2's MapMessage or even by implementing your own MultiformatMessage with JSON support, you can add additional fields to the resulting JSON
So it should be possible to do something like
logger.info(ParameterizedStructuredMessage.of("hello {}", "world").with("foo", "bar"));
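For reference, the MapMessage route in plain Log4j2 looks roughly like this. This is a sketch assuming the log4j-api dependency is on the classpath; the field names are illustrative:

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.message.StringMapMessage;

public class MapMessageExample {
    private static final Logger logger = LogManager.getLogger(MapMessageExample.class);

    public static void main(String[] args) {
        // With log4j2-ecs-layout, each map entry becomes a top-level field
        // in the resulting JSON rather than being flattened into the message
        logger.info(new StringMapMessage()
                .with("message", "hello world")
                .with("foo", "bar"));
    }
}
```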
But it is clearly not as convenient, and no one will want to switch to Log4j2 just for this, nor give up logback-logstash-encoder just for some limited ECS standardization and automatic APM integration. The logstash approach works once logback-spring.xml is set up correctly, and its syntax lets developers write their log statements fairly naturally.
NOTE: When using structured logging for arbitrary JSON fields, be mindful of Mapping Explosion.
I'd be happy to support LoggingEventBuilder once slf4j 2.x is released, but so far there has only been an alpha release, around 1.5 years ago.
In version 1.x, Logback just doesn't provide good out-of-the-box support for structured logging, and I'm a bit hesitant to re-implement the workaround logback-logstash-encoder created. But maybe we could add support for their kv markers. It would mean you need a dependency on both logback-logstash-encoder and ecs-logging-java, though.
> But it is believed that this similarly just modifies the message property
http://www.slf4j.org/api/org/slf4j/event/LoggingEvent.html has a method List<KeyValuePair> getKeyValuePairs(), so any implementing library will have information about the structured items.
The master branch of Logback already has it: https://github.com/qos-ch/logback/commit/910bf99ddf9bf32a34a330fbfa07d6492aa74f6d. It is in 1.3.0-alpha7.
Hello! Just wanted to ask if this topic is on any roadmap. Logback has supported this since 1.3.0, released on August 28, 2022, but this project compiles against Logback 1.2.8. Could the dependency be updated to 1.3.0 and support for ch.qos.logback.classic.spi.ILoggingEvent#getKeyValuePairs added?
My current workaround is this custom subclass:
public class KeyValuePairEcsEncoder extends EcsEncoder {

    @Override
    protected void addCustomFields(ILoggingEvent event, StringBuilder builder) {
        List<KeyValuePair> keyValuePairs = event.getKeyValuePairs();
        if (keyValuePairs == null || keyValuePairs.isEmpty()) {
            return;
        }
        for (KeyValuePair keyValuePair : keyValuePairs) {
            builder.append('"');
            JsonUtils.quoteAsString(keyValuePair.key, builder);
            builder.append("\":\"");
            JsonUtils.quoteAsString(toNullSafeString(String.valueOf(keyValuePair.value)), builder);
            builder.append("\",");
        }
    }
}
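For illustration, the loop above effectively appends a "key":"value", fragment per pair to the JSON being built. A dependency-free sketch of that logic (the class and method names are mine, not from the encoder; the real code escapes key and value via JsonUtils.quoteAsString, which is elided here):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class KeyValueSerializationSketch {

    // Mimics the addCustomFields loop: emit "key":"value", for each pair.
    // Note that every value is stringified, so numbers lose their JSON type.
    static String serialize(Map<String, Object> pairs) {
        StringBuilder builder = new StringBuilder();
        for (Map.Entry<String, Object> e : pairs.entrySet()) {
            builder.append('"')
                   .append(e.getKey())
                   .append("\":\"")
                   .append(String.valueOf(e.getValue()))
                   .append("\",");
        }
        return builder.toString();
    }

    public static void main(String[] args) {
        Map<String, Object> pairs = new LinkedHashMap<>();
        pairs.put("namespace.s3_prefixes_to_check", 5);
        System.out.println(serialize(pairs));
    }
}
```

One consequence of this approach is visible in the sketch: all values come out as JSON strings, so a numeric field like a count is serialized as "5" rather than 5.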
Relatedly, unlike the classic API, where the Throwable is passed as the last argument, using the structured format for logging hides the error from APM. This is a bit more annoying than the key/value issue because the Throwable is wrapped by the API, so you have to dig it out of the ThrowableProxy implementation, which is naturally undesirable.
log.atError().setCause(new RuntimeException("Test 1")).log("failed ...");
Has no error reported in the APM UI.
log.error("failed ...", new RuntimeException("Test 2"));
This error will be reported in the APM UI.
I was able to work around this by extending the addCustomFields override above to capture the Throwable as it encodes it:
// instanceof pattern matching requires Java 16+:
if (event.getThrowableProxy() instanceof ThrowableProxy throwableProxy) {
    var span = ElasticApm.currentSpan();
    if (span.isSampled()) {
        span.captureException(throwableProxy.getThrowable());
    } else {
        ElasticApm.currentTransaction().captureException(throwableProxy.getThrowable());
    }
}
I'd like to see support for structured logging as well. One of the requirements in the project I'm working on is to mask sensitive PII. This forces me to go back to logback-logstash-encoder for its masking feature and to manage ECS standardization of fields myself. This workaround is not ideal in terms of maintainability.
As a Logback user, I'd like the ability to create structured logs as in https://github.com/elastic/ecs-logging-java/tree/master/log4j2-ecs-layout#structured-logging
Currently the community relies on net.logstash.logback.encoder.LogstashEncoder to generate structured log statements, which leverages Markers. Although not ideal, since Markers are intended for filtering (http://logback.qos.ch/manual/filters.html), I think it would be useful if the EcsEncoder could support this scenario without requiring an additional dependency library.
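For context, the Marker-based style with logstash-logback-encoder looks roughly like this. This is a sketch assuming the logstash-logback-encoder dependency and a LogstashEncoder-based appender are configured; the field name is illustrative:

```java
import static net.logstash.logback.marker.Markers.append;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MarkerExample {
    private static final Logger log = LoggerFactory.getLogger(MarkerExample.class);

    public static void main(String[] args) {
        // The marker carries the structured field as its own JSON property,
        // so the human-readable message stays clean
        log.info(append("namespace.s3_prefixes_to_check", 5), "Checking prefixes in S3.");
    }
}
```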