pinojs / pino

🌲 super fast, all natural json logger
http://getpino.io
MIT License
14.21k stars · 875 forks

Transporter - Datadog #1001

Closed gkatsanos closed 1 year ago

gkatsanos commented 3 years ago

I added Pino and tried to see my logs in Datadog, unsuccessfully. Later I read the section regarding the limitations and design decisions of Pino with regard to in-process vs. out-of-process logging.

I was wondering how this works when the transport is configured at the CLI level (keys?). I'm afraid the impact of this design decision is that integrating the logger involves devops work instead of having everything at the code level.

Could you provide some insights? Do I have to use pino-multi-stream?

Thank you for your time :)

mcollina commented 3 years ago

What are you using to ship logs to DataDog?

gkatsanos commented 3 years ago

Hey Matteo, appreciate the quick feedback. Bear with me if I don't get the facts right, but we have a dockerized app, and my understanding is that stdout/stderr is auto-magically piped to DD (https://docs.datadoghq.com/agent/docker/log/?tab=containerinstallation). We don't do anything manually. BTW, the application is a Nuxt (Node SSR + client side) app which logs on both the client and the server side.

Locally I see the Pino logs just fine, but once deployed I don't get logs in DD (which is to be expected, as I did not add the pino-datadog library). I just need to understand what else is needed.

jsumners commented 3 years ago

DataDog natively supports Pino. Load dd-trace before you load pino and DD will instrument Pino. This will in turn ship your logs to DD via their sidecar.
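
A minimal sketch of that load order, assuming the standard `dd-trace` and `pino` entry points (this is setup code, so the exact options come from Datadog's docs, not this thread):

```js
// Require dd-trace FIRST so it can patch pino before any logger is
// created; logInjection adds dd.trace_id / dd.span_id to each log line.
const tracer = require('dd-trace')
tracer.init({ logInjection: true })

// Only now load pino: the returned logger is the instrumented one.
const logger = require('pino')()
logger.info('hello') // this line now carries Datadog trace correlation
```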

gkatsanos commented 3 years ago

@jsumners

```js
const tracer = require('dd-trace')

export default function Datadog() {
  tracer.init({
    hostname: process.env.DD_AGENT_HOST,
    service: 'xxx',
    env: process.env.STAGE,
    logInjection: true,
  })
}
```

Later:

```js
const logger = require('pino')()

$axios.onRequest((config) => {
  logger.info({
    // date: new Date().toISOString(),
    http: {
      params: config.params,
      baseURL: config.baseURL,
      referer: config.headers.common.referer,
      url: config.url,
      method: config.method,
    },
  }, `[${config.method}] ${config.url}`)
})
```

In the server logs / terminal: [screenshot of the instrumented Pino output]

(If DD natively supports Pino, why is pino-datadog necessary?) I thought Pino doesn't send logs anywhere unless you explicitly set it up to do so.

jsumners commented 3 years ago

> If DD natively supports Pino, why is pino-datadog necessary?

Such questions should be asked to the DD support team.

> I thought Pino doesn't send logs anywhere unless you explicitly set it up to do so.

As of v6, Pino never ships logs anywhere. That is completely the user's responsibility. What I said is that DD supports instrumenting the Pino logger. See https://docs.datadoghq.com/tracing/connect_logs_and_traces/nodejs/
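
To make that separation concrete, here is a sketch in plain Node (emulating Pino's default v6 output rather than using Pino itself) of what a log line is: one JSON object per line on stdout. Shipping those lines to Datadog happens outside the process, e.g. via the Docker log agent.

```js
// Emulate pino's default NDJSON output: one JSON object per line on
// stdout. Shipping these lines anywhere is an external concern.
function logLine(level, msg, extra = {}) {
  const line = JSON.stringify({
    level,            // pino uses numeric levels: 30 = info, 50 = error
    time: Date.now(), // epoch milliseconds, as in pino's default output
    msg,
    ...extra,
  })
  process.stdout.write(line + '\n')
  return line
}

logLine(30, 'request started', { url: '/api/users' })
```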

gkatsanos commented 3 years ago

@jsumners Can you elaborate on "instrumenting"?

jsumners commented 3 years ago

It's outlined in the link I provided in my last reply. Additionally https://opentracing.io/docs/best-practices/instrumenting-frameworks/

gkatsanos commented 3 years ago

@jsumners Ah yes, instrumenting is not what I'm asking about here. Instrumenting, as seen in my screenshot above, does happen, but I'm one step before that: I'm not sure how to pipe the logs to Datadog in the first place. I hope @mcollina will have some pointers on how to proceed with regard to pino-datadog and pino-multi-stream?

Fdawgs commented 1 year ago

@gkatsanos Did you manage to resolve this?

gkatsanos commented 1 year ago

We ended up using plain console.log and did our own formatting.
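
For anyone landing here, a hypothetical sketch of that "console.log + own formatting" approach (not the actual code from this thread): emit one JSON line per log, using field names (`status`, `message`, `timestamp`) that Datadog's log pipeline treats as reserved attributes.

```js
// Hypothetical: structured logging with plain console.log, formatted
// as JSON lines the Datadog agent can pick up from container stdout.
function log(status, message, attrs = {}) {
  const entry = {
    timestamp: new Date().toISOString(),
    status,   // e.g. 'info', 'warn', 'error'
    message,
    ...attrs, // any extra context, e.g. userId, url
  }
  console.log(JSON.stringify(entry)) // container runtime forwards stdout
  return entry
}

log('info', 'user signed in', { userId: 42 })
```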

github-actions[bot] commented 1 year ago

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.