overthetop opened this issue 4 years ago
I was able to use Logback with the logstash encoder just by adding the dependencies and removing the dependency on slf4j-jboss-logging.
However this breaks in dev mode: I can't remove the dependency there, and the application starts with 2 slf4j bindings.
The freedom that Logback + logstash encoder gives is amazing. I was able to have custom JSON logging, but I had to hook into MDC to push a JSON object string and pass it around, which is very ugly.
@RogerGMartins you are right, this is an option. Unfortunately, after migrating some of the code to native images this is not possible anymore (compilation fails). There are a lot of fields that we want to remove from the logging message (hostName, processName, processId, etc.).
+1
@RogerGMartins could you please provide your work-around solution?
We are having a similar problem right now trying to inject some Datadog fields into our logs.
This is really important for me as well. At a minimum, Quarkus could support custom fields, even without full Logstash support. That said, full support would be amazing.
/cc @loicmathieu
If I understand it correctly, you want to send logs to ELK, but not directly using our logging-gelf library; instead, you want to generate JSON logs on the console that follow the Logstash format.
And you are saying that the logging-json library doesn't offer as much flexibility as needed. Maybe @dmlloyd can shed some light on these issues?
I think the idea of customizing the JSON output is perfectly reasonable. I believe the formatter code has some support for this already.
Any news about that?
We are also using Quarkus in a Kubernetes environment and stream the logs to STDOUT. Currently we use quarkus-json-logging, but it is very restricting.
Structured logging is much better supported with the logstash encoder (mentioned above). Nice usage examples can be found here: https://www.innoq.com/en/blog/structured-logging/#structuredarguments
EDIT:
Is there a workaround to use slf4j directly? I tried but I could not make it work.
I removed quarkus-logging-json and disabled console logging completely via quarkus.log.console.enable=false.
I added the Logback and logstash encoder dependencies. This caused a conflict because the slf4j bindings were loaded twice. I excluded the bindings from quarkus-junit5. Now everything compiles and the server runs, but no logs are shown. Did I miss something?
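For reference, the relevant parts of the setup I tried look roughly like this (a sketch only; versions are illustrative, and the quarkus-junit5 exclusion is the one mentioned above — other dependencies pulling in slf4j-jboss-logging may need the same exclusion):
<!-- Sketch of the attempted setup; versions are illustrative. -->
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.2.3</version>
</dependency>
<dependency>
  <groupId>net.logstash.logback</groupId>
  <artifactId>logstash-logback-encoder</artifactId>
  <version>6.4</version>
</dependency>
<dependency>
  <groupId>io.quarkus</groupId>
  <artifactId>quarkus-junit5</artifactId>
  <scope>test</scope>
  <exclusions>
    <!-- Avoid a second slf4j binding coming in transitively. -->
    <exclusion>
      <groupId>org.jboss.slf4j</groupId>
      <artifactId>slf4j-jboss-logging</artifactId>
    </exclusion>
  </exclusions>
</dependency>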
+1
Structured logging is pretty much the standard now; any update on this issue?
+1
+1 Adding structured arguments to our JSON logs is mandatory for us too. If Quarkus native JSON logging can do that, it would be nice; otherwise, let us use slf4j and Logback...
+1 we have to comply with some log specs in my project to enable common monitoring in Kibana. So please add this feature.
@gsmet looks like Quarkus should give the people what they want :) could this find its way onto the roadmap? /cc @loicmathieu
I'm exactly at the same point as described above:
The slf4j bindings are loaded twice and I can't get a single logging dependency to work on its own.
+1
One way to work around it is to replace slf4j-jboss-logging with your own bindings and bridges.
Gradle example (most interesting stuff is in build.gradle and of course logback.xml): https://github.com/asodja/quarkus-logstash-logging-example
You can start it and play with it with:
./gradlew quarkusDev
or
./gradlew build && java -jar build/quarkus-logstash-logging-example-my-version-runner.jar
For Maven you have to exclude slf4j-jboss-logging from every dependency that pulls it in (check the dependency tree).
Note1: For quarkusDev I had to set logback.configurationFile manually
Note2: This probably won't work in native images
Hi guys,
I am currently evaluating the Quarkus framework for reactive routes. The framework looks promising, except for the missing support for custom logging fields, which is quite easy to configure in Spring Boot using the Logback logstash encoder. Could the Quarkus team help us understand when custom field logging will be available in Quarkus?
This feature is critical for us, as our existing microservices' logs are shipped to CloudWatch/Splunk and we use a whole lot of custom fields for dashboard visualizations.
Our expectation would be that slf4j logging is stable, with Logback/logstash support for this feature.
We are looking for the planned roadmap version where this feature will be available. We are currently evaluating Quarkus 1.6.
Regards, Harish
@harry9111985 if you previously used the logstash encoder, maybe the logging-gelf extension would work?
It can send logs in the GELF format to a central logging solution; it has been tested with Logstash, Graylog and Fluentd.
It allows you to add custom fields.
@loicmathieu: Is there an example I can refer to for populating custom fields in log statements? We are looking at using the structured arguments provided by logback/logstash to append custom fields to log statements.
Currently we use the kv or keyValue static methods in our logging code to populate custom fields in our logs. Once we use these static methods in a log statement, Spring Boot (in combination with the logstash encoder configured in logback.xml) does the rest in populating the logs with custom fields.
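For reference, the kind of statement we write with Spring Boot looks roughly like this (a sketch; the class and field names are just illustrative):
// Sketch of the logstash-logback-encoder usage described above; assumes Spring Boot with
// Logback and logstash-logback-encoder on the classpath. Names are illustrative.
import static net.logstash.logback.argument.StructuredArguments.kv;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class OrderResource {

    private static final Logger log = LoggerFactory.getLogger(OrderResource.class);

    void create(String orderId) {
        // kv(...) adds "orderId" and "path" as top-level fields of the encoded JSON log event.
        log.info("Request received for creating order", kv("orderId", orderId), kv("path", "/orders"));
    }
}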
@harry9111985 with the logging-gelf extension you can add hardcoded additional fields via
quarkus.log.handler.gelf.additional-field."field-name".value
quarkus.log.handler.gelf.additional-field."field-name".type
Or you can use the standard MDC mechanism to add fields to your log event, by adding the following configuration property:
quarkus.log.handler.gelf.include-full-mdc=true
Please see the configuration reference for more information: https://quarkus.io/guides/centralized-log-management#configuration-reference
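For example, an application.properties along these lines (a sketch; the field name and its value/type are placeholders, see the configuration reference above for the exact options):
# Sketch based on the properties above; "app-name" and its value/type are placeholders.
quarkus.log.handler.gelf.enabled=true
quarkus.log.handler.gelf.additional-field."app-name".value=my-service
quarkus.log.handler.gelf.additional-field."app-name".type=String
# Forward all MDC entries as additional fields:
quarkus.log.handler.gelf.include-full-mdc=true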
Thanks for the suggestion, I tried it @loicmathieu.
A couple of things:
I am using org/slf4j/slf4j-api/1.7.30/slf4j-api-1.7.30.jar!/org/slf4j/MDC.class for MDC logging.
a) The property quarkus.log.handler.gelf.include-full-mdc=true indeed works, but our expectation is that the custom fields should be present at the root level, not nested under mdc.
b) The MDC class doesn't support injecting objects that would later be converted to JSON payloads when viewed in the centralized logging platform.
I tried using an ObjectMapper to convert the object to a string and then append it to the MDC, but it gets appended as a string instead of a JSON payload.
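For context, the workaround I tried looks roughly like this (the helper class name is made up):
// Sketch of the MDC workaround described above; Jackson is assumed to be on the classpath.
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.slf4j.MDC;

public class MdcJsonWorkaround {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void putAsJson(String key, Object value) throws JsonProcessingException {
        // MDC only accepts String values, so the object has to be serialized first.
        // The JSON formatter then emits it as an escaped string rather than a nested
        // JSON object, which is exactly the limitation described above.
        MDC.put(key, MAPPER.writeValueAsString(value));
    }
}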
We wouldn't be using the static approach, as it's meant for logging fields of incoming HTTP requests.
The structured logging in logstash satisfies both of the things we need: first, we can have all the fields at the root level of the log message, and second, we can append objects to the logs, which are naturally converted to JSON payloads.
Could you help us understand when we can get logback-logstash support coupled with StructuredArguments in any of your future roadmap releases?
Attaching a sample of what we are looking for:
https://www.innoq.com/en/blog/structured-logging/#structuredarguments
Sample Log Message with Structured Arguments:
{
"@timestamp": "2020-08-21T15:44:27.645+10:00",
"@version": "1",
"message": "Request received for creating blah",
"logger_name": "some logger name",
"thread_name": "reactor-http-nio-3",
"level": "INFO",
"level_value": 20000,
"springAppName": "svc-sb-blah-blah-v2",
"traceId": "5f87ed85b82a627f",
"spanId": "5f87ed85b82a627f",
"spanExportable": "true",
"X-Span-Export": "true",
"X-B3-SpanId": "5f87ed85b82a627f",
"X-B3-TraceId": "5f87ed85b82a627f",
"path": "/blah",
"method": "POST",
"applicationLabel": "XYZ",
"sub_context": "Some subcontext",
"correlation_id": "5f87ed85b82a627f",
"request": {
"blah": "Spring-boot-blah-id"
}
}
"applicationLabel": "XYZ",
"sub_context": "Some subcontext",
"correlation_id": "5f87ed85b82a627f",
"request": {
"blah": "Spring-boot-blah-id"
}
The second snippet is a subset of the log and is custom. We would like the logging to happen this way.
+1 we are also working on streamlining our structured logging where I work.
@loicmathieu: Is there a planned roadmap for structured logging with logstash support to be released in the next 1, 2 or 3 months?
I have looked at the code, and it does not look to be an easy task.
It looks like the org.jboss.slf4j:slf4j-jboss-logging dependency is included in the quarkus-maven-plugin.
The JBoss slf4j implementation also throws away the Markers and Parameters/Args, so it's not possible to send extra information to the JsonFormatter and do some custom tagging there.
I see two ways forward; one of them involves the quarkus-logging-json extension.
I have created an extension inspired by the logstash-logback-encoder; it is just a POC: https://github.com/SlyngDK/quarkus-logging-json-structured. I have implemented some of the basic logging data; it uses Jackson for the JSON part. It's also possible to extend it with your own custom JSON. Be aware that the slf4j JBoss bridge does not forward parameters/arguments in version 1.2.0; there is a PR to fix this.
Is this something people are interested in? I think it will help solve our issues.
Hi @SlyngDK,
Firstly, I would like to thank you for the effort being put in; I really like the fact that the JBossLoggerAdapter can now pass the arguments down to the framework. This gives us a logging event with parameters and lets us extend it to record them the way we want in the logs (in our case, a JsonProvider for arguments).
I wrote an extension for the StructuredArgument as described in your README, and I could get the log to be written like this:
{
"@timestamp":"2020-09-03T11:44:30.423+10:00",
"level":"INFO",
"logger_name":"Blah",
"logger_class_name":"org.jboss.slf4j.JBossLoggerAdapter",
"sequence":2509,
"message":"Request Received successfully",
"host_name":"cmm-blah",
"process_name":"svc-qks-blah-v2-dev.jar",
"process_id":13966,
"thread_name":"vert.x-eventloop-thread-27",
"thread_id":122,
"arguments":{
"REQUEST_OBJ":{
"blah":"abc"
},
"Blah":"abc"
}
}
A couple of things:
a) "arguments": kindly help in modifying the framework code to not wrap the fields in an "arguments" object, as the original logback-logstash encoder puts arguments at the root level.
b) Kindly provide a default implementation in your quarkus-logging-json-structured of a KeyValueStructuredArgument, which would let us do something like
log.info("Some log", kv("REQUEST_OBJ", <<POJO>>)) which gets transformed to
{
//all other log parameters
"REQUEST_OBJ" : {
"blah_id" : "abc"
}
}
If it's a Jackson implementation of the JSON key/value logging, then the Jackson annotations on the objects can be honoured automatically :)
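For example, a hypothetical request POJO like the one below would then be logged under the field name chosen via the annotation (illustration only; class and field names are made up):
// Hypothetical POJO, for illustration only; if the extension serializes arguments with
// Jackson, annotations such as @JsonProperty are honoured in the logged JSON as well.
import com.fasterxml.jackson.annotation.JsonProperty;

public class CreateRequest {

    @JsonProperty("blah_id")
    private final String blahId;

    public CreateRequest(String blahId) {
        this.blahId = blahId;
    }

    public String getBlahId() {
        return blahId;
    }
}
So kv("REQUEST_OBJ", new CreateRequest("abc")) would come out as "REQUEST_OBJ": {"blah_id": "abc"}, as in the example above.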
Again great work !!
Regards, Harish
@harry9111985 thanks for the feedback, I will take a look at it next week.
@loicmathieu
Do you think an improved version of the POC is an extension we could get into the project?
Maybe replacing quarkus-logging-json and changing it to use the same default format?
POC: https://github.com/SlyngDK/quarkus-logging-json-structured
@gsmet Will you take a look ^^^
@harry9111985 I have updated the POC, as you suggested. You are welcome to contribute to it.
https://github.com/jboss-logging/slf4j-jboss-logging/pull/10 has been merged.
@SlyngDK: I have been busy with a few things. Will take a look today and get back. Thanks for the effort in advance.
slf4j-jboss-logging 1.2.1.Final is also on master.
How do we get this issue rolling?
I have a little update on this.
I have today released a version of quarkiverse-logging-json in Quarkiverse, the new future home for community extensions: https://github.com/quarkiverse/quarkiverse-logging-json
It supports using both JSON-B and Jackson; I am also working on support for configuring each field.
Feedback is welcome.
The next Quarkus release will also include slf4j-jboss-logging 1.2.1.Final.
I want to change predefined fields such as "traceId" and "spanId" to other names; is there any way to do it? When I use the JSON format, there doesn't seem to be a way. Thx
@lferna As I remember, these are added to the MDC context by the tracing library. Right now it is not possible. In the POC (https://github.com/SlyngDK/quarkus-logging-json-structured) it was possible to add your own custom JsonProvider, and with that you can do it. One of the next tasks I am looking into is adding this to quarkiverse-logging-json as well.
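A rough sketch of what such a provider could look like, assuming the io.quarkiverse.loggingjson JsonProvider interface used in the Kotlin example further down this thread (the class name and target field names are made up, and how the provider gets registered may differ between the POC and the quarkiverse extension):
// Sketch only; assumes the JsonProvider API shown later in this thread.
import io.quarkiverse.loggingjson.JsonGenerator;
import io.quarkiverse.loggingjson.JsonProvider;
import io.quarkiverse.loggingjson.JsonWritingUtils;
import org.jboss.logmanager.ExtLogRecord;

import javax.inject.Singleton;
import java.io.IOException;

@Singleton
public class TracingFieldsProvider implements JsonProvider {

    @Override
    public void writeTo(JsonGenerator generator, ExtLogRecord event) throws IOException {
        // Read the tracing values from the MDC and emit them under custom field names.
        String traceId = event.getMdc("traceId");
        if (traceId != null) {
            JsonWritingUtils.writeStringField(generator, "trace.id", traceId);
        }
        String spanId = event.getMdc("spanId");
        if (spanId != null) {
            JsonWritingUtils.writeStringField(generator, "span.id", spanId);
        }
    }
}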
Regarding the Logback + logstash encoder setup described in the original post: how did you manage to pull it off? Do you have the details?
@SlyngDK Could you please help me with how to do this.. any sample please?
log.info("Some log", kv("REQUEST_OBJ", <<POJO>>)) which gets transformed to
{
  //all other log parameters
  "REQUEST_OBJ": {
    "blah_id": "abc"
  }
}
@harry9111985 how did you manage to generate it, any sample please?
I tried this but no luck @harry9111985 @SlyngDK: log.error("Test {}", "message", KeyValueStructuredArgument.kv("structuredKey", "structuredValue"), new RuntimeException("Testing stackTrace"));
@reachlakstar
I could get this working. It's 2 simple dependencies:
<dependency>
  <groupId>io.quarkiverse.loggingjson</groupId>
  <artifactId>quarkiverse-logging-json</artifactId>
  <version>0.1.7</version>
</dependency>
<dependency>
  <groupId>org.jboss.slf4j</groupId>
  <artifactId>slf4j-jboss-logging</artifactId>
  <version>1.2.1.Final</version>
</dependency>
And a log statement like this: logger.info("Some log message", kv("abc", "blah"), kv("def", "blah1"));
The result looks like this:
{"timestamp":"2020-11-22T23:02:15.922Z","sequence":39,"loggerClassName":"org.jboss.slf4j.JBossLoggerAdapter","loggerName":"SomeController","level":"INFO","message":"Post Request Received successfully","threadName":"vert.x-eventloop-thread-2","threadId":18,"mdc":{},"hostName":"4073bd5bae35","processName":"app.jar","processId":1,"abc":"blah","def":"blah1"}
On Quarkus 1.7.1
@SlyngDK: Amazing work, man! Sorry, I was in a hibernation period with another project. I tried your improvements and the new library; it works like a charm.
@harry9111985 Thanks for the feedback
@reachlakstar This dependency is not required with Quarkus 1.10.0.Final:
<dependency>
  <groupId>org.jboss.slf4j</groupId>
  <artifactId>slf4j-jboss-logging</artifactId>
  <version>1.2.1.Final</version>
</dependency>
Hey guys, for everyone who is here because of better logging to GCP: you can use the following library, as stated before:
implementation("io.quarkiverse.loggingjson:quarkiverse-logging-json:0.1.7") {
    exclude("io.quarkus", "quarkus-core")
    exclude("io.quarkus", "quarkus-jackson")
    exclude("io.quarkus", "quarkus-jsonb")
}
And configure it by writing your own JsonLoggingProvider to match GCP requirements:
package de.blume2000.kaufen.adapter.passive.operations

import io.quarkiverse.loggingjson.JsonGenerator
import io.quarkiverse.loggingjson.JsonProvider
import io.quarkiverse.loggingjson.JsonWritingUtils
import io.quarkiverse.loggingjson.StringBuilderWriter
import io.quarkus.arc.AlternativePriority
import org.eclipse.microprofile.config.inject.ConfigProperty
import org.jboss.logmanager.ExtFormatter
import org.jboss.logmanager.ExtLogRecord
import java.io.IOException
import java.io.PrintWriter
import javax.inject.Singleton
import javax.interceptor.Interceptor.Priority.PLATFORM_AFTER

@Singleton
@AlternativePriority(PLATFORM_AFTER)
class JsonLoggingProvider(
    @ConfigProperty(name = "b2k.gcp-project.environment")
    private val environment: String,
    @ConfigProperty(name = "b2k.team.key")
    private val teamKey: String,
    @ConfigProperty(name = "quarkus.application.name")
    private val applicationName: String
) : JsonProvider, ExtFormatter() {

    @Throws(IOException::class)
    override fun writeTo(generator: JsonGenerator, event: ExtLogRecord) {
        writeMessage(generator, event)
        writeSourceLocation(generator, event)
        writeLabels(generator)
    }

    private fun writeMessage(generator: JsonGenerator, event: ExtLogRecord) {
        val stringBuilderWriter = StringBuilderWriter().append(formatMessage(event))
        event.thrown?.let {
            stringBuilderWriter.append('\n')
            it.printStackTrace(PrintWriter(stringBuilderWriter))
            writeLevel(generator, "ERROR")
        } ?: writeLevel(generator, severityFor(event.level.intValue()))
        JsonWritingUtils.writeStringField(generator, "message", stringBuilderWriter.toString().trim())
    }

    private fun writeLevel(generator: JsonGenerator, severity: String) {
        JsonWritingUtils.writeStringField(generator, "severity", severity)
        if (severity == "ERROR") {
            JsonWritingUtils.writeStringField(generator, "@type",
                "type.googleapis.com/google.devtools.clouderrorreporting.v1beta1.ReportedErrorEvent")
        }
    }

    private fun severityFor(levelValue: Int) =
        when (levelValue) {
            1000 -> "ERROR"
            900 -> "WARNING"
            800 -> "INFO"
            700 -> "INFO"
            500 -> "DEBUG"
            400 -> "DEBUG"
            300 -> "DEBUG"
            else -> "DEFAULT"
        }

    private fun writeSourceLocation(generator: JsonGenerator, event: ExtLogRecord) {
        generator.writeObjectFieldStart("sourceLocation")
        generator.writeStringField("file", event.loggerClassName)
        event.sourceMethodName?.let {
            generator.writeStringField("function", it)
            generator.writeStringField("line", event.sourceLineNumber.toString())
        }
        generator.writeEndObject()
    }

    private fun writeLabels(generator: JsonGenerator) {
        generator.writeObjectFieldStart("labels")
        generator.writeStringField("environment", environment)
        generator.writeStringField("team", teamKey)
        generator.writeStringField("application", applicationName)
        generator.writeEndObject()
    }

    override fun format(extLogRecord: ExtLogRecord?) = null
}
quarkus:
  log:
    console:
      json:
        ~: false
        fields:
          level:
            enabled: false
          message:
            enabled: false
          stack-trace:
            enabled: false
          logger-class-name:
            enabled: false
          logger-name:
            enabled: false
If you know of any other way, please reach out to me. The default Google libraries don't work together with Quarkus as I understand it, because they use either JUL or Logback.
@BeneStem this is indeed interesting.
A new "Deploying to Google Cloud" guide has been created for 1.10 (soon to be released).
GCP-specific libraries reside in https://github.com/quarkiverse/quarkiverse-google-cloud-services; there is nothing related to logging and monitoring there yet, but there may be some Stackdriver-related support some day.
Maybe this information and piece of code could reside in one of these two areas for better discoverability.
Is it wrong to use the default console logging and configure console.format like this?
quarkus.log.level=INFO
quarkus.log.console.format={"level": "%p", "at": "%d{yyyy-MM-dd'T'HH:mm:ssZ}", "class": "%C", "message": "%s", "error": "%e"}%n
Logs will show up like this:
{"level": "INFO", "at": "2020-12-03T15:43:35-0300", "class": "io.quarkus.bootstrap.runner.Timing", "message": "quarkus 1.0.0 on JVM (powered by Quarkus 1.9.2.Final) started in 2.787s. Listening on: http://0.0.0.0:8080", "error": ""}
{"level": "INFO", "at": "2020-12-03T15:43:35-0300", "class": "io.quarkus.bootstrap.runner.Timing", "message": "Profile dev activated. Live Coding activated.", "error": ""}
{"level": "INFO", "at": "2020-12-03T15:43:35-0300", "class": "io.quarkus.bootstrap.runner.Timing", "message": "Installed features: [agroal, cdi, hibernate-orm, hibernate-orm-panache, jdbc-postgresql, liquibase, mutiny, narayana-jta, resteasy, resteasy-jsonb, smallrye-context-propagation]", "error": ""}
This will not properly format the stack trace of exceptions. The last thing I want to do is copy-paste exceptions into a text editor to prettify them.
This is a much needed feature in quarkus
+1
I'm using the quarkus-logging-json artifact and I'm looking for a way to customize the JSON logging format. Our case is that we are using the ELK stack for logging, and all of our services (Spring Boot) write log entries to stdout formatted with the logstash-logback-encoder:
https://github.com/logstash/logstash-logback-encoder
I want to output logs in the same format from Quarkus.
For example, in my case these fields:
@timestamp | Time of the log event (yyyy-MM-dd'T'HH:mm:ss.SSSZZ). See customizing timestamp.
@version | Logstash format version (e.g. 1). See customizing version.
message | Formatted log message of the event
logger_name | Name of the logger that logged the event
thread_name | Name of the thread that logged the event
level | String name of the level of the event
level_value | Integer value of the level of the event
stack_trace | (Only if a throwable was logged) The stacktrace of the throwable. Stackframes are separated by line endings.
If you refer to the https://github.com/logstash/logstash-logback-encoder README, you'll see how flexible it is. I believe that many organizations will need to be able to control the log format and add custom fields as well.