opensearch-project / data-prepper

Data Prepper is a component of the OpenSearch project that accepts, filters, transforms, enriches, and routes data at scale.
https://opensearch.org/docs/latest/clients/data-prepper/index/
Apache License 2.0

[BUG] STDOUT not respecting Log4J configuration #2116

Open iquirino opened 1 year ago

iquirino commented 1 year ago

Describe the bug I've updated my Log4j configuration so that it stopped saving logs to the file system, and that part worked well. I've set all log levels to error, but Data Prepper keeps printing the content of the received logs to the console.

To Reproduce 1: Updated the Log4j configuration at '/usr/share/data-prepper/config/log4j2-rolling.properties' with the following (see the note on the console appender threshold after these steps):

status = error
dest = err
name = PropertiesConfig

property.filename = log/data-prepper/data-prepper.log

appender.console.type = Console
appender.console.level = error
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{ISO8601} [%t] %-5p %40C - %m%n

rootLogger.level = error
rootLogger.appenderRef.stdout.ref = STDOUT

logger.pipeline.name = org.opensearch.dataprepper.pipeline
logger.pipeline.level = error

logger.parser.name = org.opensearch.dataprepper.parser
logger.parser.level = error

logger.plugins.name = org.opensearch.dataprepper.plugins
logger.plugins.level = error
2: I am currently using the default pipeline for the OpenTelemetry use case (/usr/share/data-prepper/pipelines/pipelines.yaml):
entry-pipeline:
  delay: "100"
  source:
    otel_trace_source:
      ssl: false
  sink:
    - pipeline:
        name: "raw-pipeline"
    - pipeline:
        name: "service-map-pipeline"
raw-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  processor:
    - otel_trace_raw:
  sink:
    - opensearch:
        hosts: [ "https://opensearch.stag.bolttechbroker.net" ]
        username: "edirect"
        password: "123@PenguinsCanFly;)"
        index_type: "trace-analytics-raw"
service-map-pipeline:
  delay: "100"
  source:
    pipeline:
      name: "entry-pipeline"
  processor:
    - service_map_stateful:
  sink:
    - opensearch:
        hosts: [ "https://opensearch.stag.bolttechbroker.net" ]
        username: "edirect"
        password: "123@PenguinsCanFly;)"
        index_type: "trace-analytics-service-map"

Expected behavior I don't want all requests logged to my console; I only want errors logged there.

Screenshots: (screenshot of the console output attached)

Environment (please complete the following information):

Additional context I am currently using the latest version of Data Prepper.

JannikBrand commented 1 year ago

I have seen this type of log before. In my case it was not logging a normal span being processed; I think it belonged to a "Buffer has not enough capacity" error log. In your screenshot there is also no sign that this is a log below the error level (e.g., an INFO-level log).

iquirino commented 1 year ago

You are right: (screenshot attached)

Is there any way to decrease the size of these logs? Say, skip writing the traces to stdout.

Thank you @JannikBrand

dlvenable commented 1 year ago

@iquirino, what is the log line that leads up to this message? You could possibly turn logging off for that particular logger.

toby181 commented 1 year ago

Hi guys, I'm having basically the same issue. Data Prepper is sending failed traces to stdout and completely messing up my OpenSearch logs, since each line becomes one message in OpenSearch.

@dlvenable The preceding line should be "WARN org.opensearch.dataprepper.plugins.source.oteltrace.OTelTraceGrpcService - Failed to parse request with error 'name cannot be an empty string'. Request body: resource_spans {". I assume that @iquirino has a similar issue.

toby181 commented 1 year ago

I guess you mean something like this?

logger.parser.name = org.opensearch.dataprepper.parser
logger.parser.level = error

Is there another way to handle failing traces instead of sending them to stdout, e.g. sending them to a separate index or something like that? Or just dropping them?
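
For what it's worth, the WARN line quoted above comes from org.opensearch.dataprepper.plugins.source.oteltrace.OTelTraceGrpcService, which is not covered by the org.opensearch.dataprepper.parser logger. Assuming the logger name follows the class name (the usual SLF4J convention), a more targeted override in log4j2-rolling.properties might look like this (a rough sketch, not verified; 'oteltracegrpc' is an arbitrary key name):

# Hypothetical per-logger override for the gRPC trace source
logger.oteltracegrpc.name = org.opensearch.dataprepper.plugins.source.oteltrace.OTelTraceGrpcService
logger.oteltracegrpc.level = error

Setting the level to 'off' would suppress these messages entirely, while 'error' keeps genuine errors but drops the WARN-level request dumps.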