pinojs / pino

🌲 super fast, all natural json logger
http://getpino.io

Dropping chars #1261

Closed ronag closed 2 years ago

ronag commented 2 years ago

e.g.

{"level":10,"time":1639142154887,"pid":1,"hostname":"51a2eee7f0d9","name":"search","id":16273,"type":"simple","query":{"type":"simple","roles":["MTuMLU_4ZwXPSN","administrator"],"query":"types:panel -tags.keyword:nxt-master","count":1000,"deleted":faen":null,"sortBy":"-created","refreshInterval":10000},"refreshInterval":10000,"msg":"Refreshing search"}

Notice the part:

"deleted":faen"

Should be:

"deleted":false,"hidden":null

Missing:

lse,"hidd

We have other examples where a few chars seem to be missing. For reference, our destination setup and the relevant versions:

pino.destination({ fd: process.stdout.fd, sync: false, minLength: 4096 })
sonic-boom@^2.2.0, sonic-boom@^2.2.1, sonic-boom@^2.3.1:
  version "2.4.1"
pino@^7.5.1:
  version "7.5.1"
thread-stream@^0.13.0:
  version "0.13.0"

Running in Docker with node:16.13.1.

The logs come directly from the console log, i.e. no file or anything involved.

ronag commented 2 years ago

@mcollina FYI

jsumners commented 2 years ago

Could this be due to the async buffer getting full and truncating lines? https://github.com/pinojs/thread-stream/pull/40#discussion_r738870485
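
For illustration only (hypothetical code, not thread-stream's actual implementation): a bounded buffer that silently drops whatever does not fit would cut records mid-line, which matches the symptom above.

    const LIMIT = 16 // stand-in for the real buffer capacity
    let buf = ''

    function write (data) {
      const room = Math.max(LIMIT - buf.length, 0)
      buf += data.slice(0, room) // everything past `room` is silently lost
    }

    write('"deleted":false,"hidden":null')
    console.log(buf) // '"deleted":false,' -- truncated mid-record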

ronag commented 2 years ago

Could be

ronag commented 2 years ago

Do we go through thread-stream even when writing to stdout? That seems unnecessary?
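
For context, the two write paths as they are typically wired up in pino v7 (assumed from the pino docs, not stated in this thread): thread-stream is only involved when pino.transport() is used; pino.destination() writes through sonic-boom in-process.

    const pino = require('pino')

    // In-process path: sonic-boom buffers and writes to the fd directly.
    const direct = pino(pino.destination({ sync: false, minLength: 4096 }))

    // Worker path: thread-stream ships log lines to a worker thread.
    const threaded = pino(pino.transport({ target: 'pino/file' }))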

ronag commented 2 years ago

If thread-stream is in play here, I suspect an issue with the overflow logic.

mcollina commented 2 years ago

I think something that would help is a script to replicate the problem. I've tested this as best I could, and there is a test that writes a lot of data, triggering the overflow multiple times.
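
A repro script never materialized in this thread, but its rough shape might be: push a large volume through an async destination (options taken from the config quoted at the top) and check that every emitted line still parses.

    // Sketch of a repro attempt; it was never confirmed to trigger the bug.
    const pino = require('pino')

    const logger = pino(
      pino.destination({ fd: process.stdout.fd, sync: false, minLength: 4096 })
    )

    for (let i = 0; i < 100000; i++) {
      logger.info({ i, pad: 'x'.repeat(512) }, 'stress')
    }
    // Note: with sync: false the tail may need logger.flush() before exit.

    // Validate afterwards, e.g.:
    //   node repro.js > out.log
    //   node -e "require('fs').readFileSync('out.log', 'utf8').trim().split('\n').forEach(l => JSON.parse(l))"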

ronag commented 2 years ago

I haven't managed to create a repro yet... However, the following code is suspicious:

https://github.com/pinojs/thread-stream/blob/main/index.js#L216-L226

    if (this[kImpl].flushing && this[kImpl].buf.length + data.length >= MAX_STRING) {
      try {
        writeSync(this)
        this[kImpl].flushing = true
      } catch (err) {
        destroy(this, err)
        return false
      }
    }

    this[kImpl].buf += data

There is a case here where, if flushing, we append to buf even if it will exceed MAX_STRING.

ronag commented 2 years ago

I'm also unsure how writeSync in e.g. stream.flushSync will behave when flushing === true.

mcollina commented 2 years ago

That code is essentially correct. In case the string exceeds MAX_STRING, it triggers a writeSync with the current content of the buffer. writeSync then empties the buffer. Finally, we append the new content to the buffer.
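
That flow in miniature (hypothetical stand-in code, not the real thread-stream internals): drain the buffer before an append that would cross the cap. Note this miniature has no flushing condition in its guard, which is exactly the gap ronag points at next.

    const MAX_STRING = 16 // stand-in for the real V8-derived limit
    let buf = ''

    function writeSyncStandIn () {
      // hand `buf` off to the destination, then empty it
      buf = ''
    }

    function write (data) {
      // drain first if appending would cross the cap...
      if (buf.length + data.length >= MAX_STRING) {
        writeSyncStandIn()
      }
      // ...so the append itself stays under MAX_STRING
      buf += data
    }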

ronag commented 2 years ago

I think you are missing the flushing check?

ronag commented 2 years ago

i.e.

stream.write('a'.repeat(MAX_STRING - 1))
assert(stream[kImpl].flushing)
stream.write('asd') // oops: buf will now exceed MAX_STRING

ronag commented 2 years ago

See test/string-limit-2.test.js in https://github.com/pinojs/thread-stream/pull/55/files

ronag commented 2 years ago

I'm not sure this is the problem I'm encountering in production but it's still something.

mcollina commented 2 years ago

Did this happen again? Were you able to reproduce it?

ronag commented 2 years ago

Happens all the time in production. Don’t have a runnable repro. We were forced to change our service to log less and then the problem disappeared.

mcollina commented 2 years ago

Very interesting. What transports are you using?

ronag commented 2 years ago

Just normal stdout. The default. pino.destination with async true.

ronag commented 2 years ago

My girlfriend will kill me if I take my computer out now so I will have to get back later with the exact configuration. 😅

I think it’s the 4096 buffering limit.

ronag commented 2 years ago

@mcollina

stream = pino.destination({ fd: process.stdout.fd, sync: false, minLength: 4096 })

mcollina commented 2 years ago

Is that being wrapped inside pino.transport()? Why are you not using pino.destination() directly?

ronag commented 2 years ago

  if (!extreme) {
    // sync: each log line is written straight to stdout
    stream = pino.destination({ fd: process.stdout.fd, sync: true })
  } else {
    // async: buffer up to minLength bytes, with a periodic flush as a backstop
    stream = pino.destination({ fd: process.stdout.fd, sync: false, minLength: 4096 })
    setInterval(() => {
      logger.flush()
    }, flushInterval).unref()
  }

  const logger = pino(
    {
      serializers,
      prettyPrint, // TODO (fix): This is deprecated...
      level,
      ...options,
    },
    stream
  )

ronag commented 2 years ago

Not explicitly using transport.

mcollina commented 2 years ago

This is definitely not about thread-stream but something in our old friend https://github.com/pinojs/sonic-boom.

mcollina commented 2 years ago

It's unclear which version of sonic-boom you are using. Could you check if you are running the latest?

ronag commented 2 years ago

sonic-boom@^2.2.0, sonic-boom@^2.2.1, sonic-boom@^2.3.1:
  version "2.4.1"
  resolved "https://registry.yarnpkg.com/sonic-boom/-/sonic-boom-2.4.1.tgz#6e1c3762c677425c6ffbd7bd106c4f8258b45b39"
  integrity sha512-WgtVLfGl347/zS1oTuLaOAvVD5zijgjphAJHgbbnBJGgexnr+C1ULSj0j7ftoGxpuxR4PaV635NkwFemG8m/5w==
  dependencies:
    atomic-sleep "^1.0.0"

mcollina commented 2 years ago

You might want to look into adding a test similar to https://github.com/pinojs/thread-stream/blob/main/test/string-limit.test.js to sonic-boom.
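
A sketch of what such a test could look like on the sonic-boom side (assumed shape, loosely modeled on thread-stream's string-limit test; the test name and assertions are hypothetical):

    const { test } = require('tap')
    const { join } = require('path')
    const { tmpdir } = require('os')
    const { readFileSync, unlinkSync } = require('fs')
    const SonicBoom = require('sonic-boom')

    test('high-volume async writes come out intact', (t) => {
      const file = join(tmpdir(), `sonic-boom-limit-${process.pid}.log`)
      const dest = new SonicBoom({ dest: file, sync: false, minLength: 4096 })
      const line = 'x'.repeat(1024)

      for (let i = 0; i < 10000; i++) {
        dest.write(line + '\n') // buffered; flushed once minLength is reached
      }

      dest.end()
      dest.on('finish', () => {
        const lines = readFileSync(file, 'utf8').split('\n').filter(Boolean)
        t.equal(lines.length, 10000, 'no line went missing')
        t.ok(lines.every((l) => l === line), 'no line was truncated or corrupted')
        unlinkSync(file)
        t.end()
      })
    })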

RichAyotte commented 2 years ago

Adding a "me too" to this issue. I'm seeing this happen all the time in production, especially with the larger payloads. I just had one cut off at 61440 characters if that's a number of any significance. I'll look for other instances to see if there's a pattern.

mcollina commented 2 years ago

Could you write a script to reproduce?

mcollina commented 2 years ago

Can you please try https://github.com/pinojs/sonic-boom/pull/128 and see if it fixes it?

klon commented 2 years ago

We noticed that since upgrading from 6.13.2 to 7.6.2 we started intermittently seeing bad JSON output from pino: characters lost mid-string. Once it started happening on an instance, we typically had to kill and restart the Node.js process to get it working again. Reverting to 6.13.2 resolves it.

mcollina commented 2 years ago

@klon as I said to @ronag, we need a way to reproduce the problem to be able to fix it.

mcollina commented 2 years ago

This was likely fixed by https://github.com/pinojs/sonic-boom/pull/137.

github-actions[bot] commented 2 years ago

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.