DataDog / datadog-lambda-go

The Datadog AWS Lambda package for Go
Apache License 2.0

Error values are logged as INFO level logs instead of ERRORs in datadog #93

Closed arashout closed 2 years ago

arashout commented 3 years ago

Expected Behavior

Expected the error to have a level:error field on the log message so Datadog correctly interprets the log.

Actual Behavior

The Datadog Log Forwarder function doesn't seem to add a level:error field to log messages that are actually errors that bubbled up to the final Lambda handler.

So it's displayed as an INFO log in Datadog instead of an ERROR log.

Steps to Reproduce the Problem

Have a simple handler like this with an error:

package main

import (
    "context"

    ddlambda "github.com/DataDog/datadog-lambda-go"
    "github.com/aws/aws-lambda-go/events"
    "github.com/aws/aws-lambda-go/lambda"
    "github.com/pkg/errors"
)

func handler(ctx context.Context, sqsEvent events.SQSEvent) error {
    err := processSQSEvent(ctx, &sqsEvent)
    if err != nil {
        return errors.Wrap(err, "error processing the SQS event")
    }
    return nil
}

func main() {
    lambda.Start(ddlambda.WrapHandler(handler, nil))
}

If processSQSEvent returns an error, you will see "error processing the SQS event" forwarded to Datadog without a log level field.

How can I resolve this?

Do I need to format the error as a JSON log when returning it?


arashout commented 3 years ago

Note that I originally posted here: https://github.com/DataDog/datadog-serverless-functions/issues/498#issuecomment-922183863 and was redirected to this repo.

DarcyRaynerDD commented 2 years ago

Hi @arashout. Sorry for the delayed response. We detect logs as errors by inspecting the contents of the log with a log pipeline (https://docs.datadoghq.com/logs/log_configuration/pipelines/?tab=source). By default, you get a generic pipeline for Lambda, but you can customize it and add your own parsing rules to extract errors. Alternatively, you can do what you suggested and swap to a JSON-formatted payload. The default Lambda pipeline will parse the JSON, and if it has an "error" field, it will mark the entire log as an error.

DarcyRaynerDD commented 2 years ago

I'm going to close this issue for now. But feel free to reopen if you run into any more issues.