pinojs / pino-syslog

🌲 A transport for reformatting pino logs into standard syslog format

pino v7 pipelining is not working #15

Closed · awwit closed this issue 3 years ago

awwit commented 3 years ago

Hi!

I set up the configuration exactly as described in your example:

import pino from 'pino'

const transport = pino.transport({
  pipeline: [
    {
      target: 'pino-syslog',
      level: 'info',
      options: {
        modern: false,
        appname: process.env.npm_package_name,
        enablePipelining: true,
      }
    },
    {
      target: 'pino-socket',
      options: {
        mode: 'tcp',
        address: '127.0.0.1',
        port: 8001
      }
    }
  ]
})

const logger = pino(transport)

But I don't get any logs.

I spent a lot of time debugging this issue. And here's what I found out:

Your plugin (pino-syslog) does not pass any data further down the pipeline. To make sure of this, I wrote my own transport and used it in place of pino-socket:

const tls = require('tls')
const { Writable } = require('stream')
const build = require('pino-abstract-transport')

module.exports = function createTransport(opts) {
  return build(
    function (source) {
      console.log('createTransport', opts) // Console-1

      const socket = tls.connect(opts, () => {})

      socket.on('data', function (data) {
        console.info('socket data', data)
      })

      socket.on('error', console.error)
      socket.on('close', console.info)

      // Writable that forwards every chunk to the TLS socket
      const stream = new Writable({
        autoDestroy: true,
        write(data, encoding, callback) {
          console.log('tls.connect write', data.toString()) // Console-3
          socket.write(data)
          callback()
        },
        final(callback) {
          socket.end()
          callback()
        },
      })

      source.on('error', console.error)
      source.on('close', () => console.info('source close'))

      source.on('data', function (data) {
        console.log('source data', data) // Console-2
        stream.write(JSON.stringify(data))
      })

      return stream
    },
    {
      enablePipelining: true,
    }
  )
}

After starting the logger, I only see the Console-1 message. When I try to send data to the logger, nothing happens: neither Console-2 nor Console-3 ever prints.

But if I comment out pino-syslog in the configuration, I receive all messages from the logger (without the syslog formatting, of course, since the plugin is disabled).

You need to fix the data transfer to the next stage of the pipeline.

Hope you can help me solve this problem!

mcollina commented 3 years ago

@Eomm could you take a look?

awwit commented 3 years ago

@mcollina @Eomm I looked in more detail at what the problem could be, and this is what I found out:

Transferring data from pino-syslog further down the pipeline is only possible if I change the code here: https://github.com/pinojs/pino-abstract-transport/blob/main/index.js#L11-L44

From:

const stream = split(function (line) { ... })

To:

const stream = new PassThrough()

But then there is no data transformation. It looks like the split stream receives no input (from pino-syslog to my transport). I still don't understand why this is happening.

awwit commented 3 years ago

@mcollina @Eomm hi guys!

So, I figured out what the matter was.

The problem is really in the pino-abstract-transport module. Internally, you use the split2 module, which only passes data on when it encounters a line break.

The pino-syslog module itself works fine and does everything right. It converts the logs to syslog format, and that data then reaches the split2 stream of the next stage, where it gets stuck because split2 keeps waiting for a line break.
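A tiny standalone demo of this split2 behaviour (not pino code, just to show what I mean; the message content is made up):

const split = require('split2')

const lines = split()
lines.on('data', (line) => console.log('emitted:', line))

// Nothing is emitted here: split2 buffers until it sees a newline.
lines.write('<134>Jan  1 00:00:00 host app[1]: hello')

// Only now does the buffered chunk come out as one complete line.
lines.write('\n')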

In general, using the split2 module inside pino-abstract-transport seems questionable to me. It means every transport has to emit newline-delimited data, otherwise the next stage will not work.

It would be better to move this functionality (splitting the data into lines) into a separate transport that is connected to the pipeline like any other stage.

Eomm commented 3 years ago

This explains why the tests were working:

    ...getConfigPath('5424', 'newline.json')

I just picked one configuration file at random.

So this is actually a workaround.
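If that fixture maps to pino-syslog's newline option (which is what the file name suggests), the programmatic equivalent of the workaround would be roughly this, restoring the newline framing that the next stage's split2 expects:

const transport = pino.transport({
  pipeline: [
    {
      target: 'pino-syslog',
      options: {
        modern: false,
        appname: 'my-app', // placeholder
        enablePipelining: true,
        newline: true // terminate each syslog message with '\n'
      }
    },
    {
      target: 'pino-socket',
      options: { mode: 'tcp', address: '127.0.0.1', port: 8001 }
    }
  ]
})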

jsumners commented 3 years ago

Unfortunately, the baseline syslog format (https://datatracker.ietf.org/doc/html/rfc3164#section-4.1.3) states:

The MSG part will fill the remainder of the syslog packet. This will usually contain some additional information of the process that generated the message, and then the text of the message. There is no ending delimiter to this part.

Modern syslog also does not specify an "ending delimiter" -- https://datatracker.ietf.org/doc/html/rfc5424#section-6

pino-syslog was only ever intended to receive NDJSON and format each individual line into a syslog-formatted line.

mcollina commented 3 years ago

Here is an issue for pino-socket: https://github.com/pinojs/pino-socket/issues/46.

awwit commented 3 years ago

@mcollina @Eomm I apologize; I made a mistake myself when creating this issue.

There are no problems in either of your modules (pino-syslog and pino-socket). The pino-socket module does not use pino-abstract-transport at all, so it has no issues with split2.

The problem only arose with my own transport, which I implemented via pino-abstract-transport. It simply means that two pipeline stages that are both built on pino-abstract-transport cannot be chained, because the second one expects newline-delimited input.
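For anyone who hits the same thing with their own stage, a rough, unverified sketch of the way around it: return a plain stream instead of going through pino-abstract-transport, so no split2 (and no newline framing) is involved. The module name createRawSink and the option names here are just placeholders:

'use strict'
const net = require('net')
const { Writable } = require('stream')

// Hypothetical final pipeline stage: forwards whatever it receives to a
// TCP socket as-is, with no line splitting at all.
module.exports = function createRawSink(opts = {}) {
  const socket = net.connect({ host: opts.address || '127.0.0.1', port: opts.port || 8001 })
  socket.on('error', console.error)

  return new Writable({
    autoDestroy: true,
    write(chunk, _encoding, callback) {
      socket.write(chunk, callback)
    },
    final(callback) {
      socket.end(callback)
    }
  })
}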

I apologize to you guys for misleading you.