pinojs / pino-elasticsearch

🌲 load pino logs into Elasticsearch
MIT License

Index Api Feature? #73

Open britztopher opened 3 years ago

britztopher commented 3 years ago

I have a situation where the Bulk API isn't applicable to my function. I have a lambda whose execution time is around 1200ms, so the bulk buffer can't fill up enough to flush logs to Elasticsearch. I've tried almost every combination of flush-bytes and flush-interval to no avail. I forked the lib and now use the Index API, which works nicely for short-running applications. I'd like to know if this is something that would be beneficial to the library.
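For context, this is roughly the kind of setup being tuned (a sketch only: the index name is made up, and the `flushBytes`/`flushInterval` option names follow the pino-elasticsearch README, so verify them against the version you actually run):

```javascript
// Sketch: wiring pino to pino-elasticsearch with aggressive flush
// settings, the approach that did not work for a short-lived lambda.
const pino = require('pino')
const pinoElastic = require('pino-elasticsearch')

const streamToElastic = pinoElastic({
  index: 'lambda-logs',          // hypothetical index name
  node: 'http://localhost:9200',
  flushBytes: 10,                // tiny threshold to force early flushes
  flushInterval: 100             // flush every 100 ms even when not full
})

const logger = pino(streamToElastic)
logger.info('hello from a short-lived process')
```

Even with tiny thresholds, a bulk-oriented transport can still lose logs when the process exits before the next flush fires, which is the motivation for a per-document Index API mode.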

The way I see it, you could choose between the Bulk and Index APIs depending on the situation. It would still use the same splitter stream, but instead of batching the docs up for a bulk request it would act on each one individually via the stream's on('data') event.

Here's a snippet of the 'data' event handler:

  splitter.on('data', doc => {
    // index each log line individually instead of batching for the Bulk API
    client
      .index({
        index: getIndexName(doc.time || doc['@timestamp']),
        body: doc,
        type: type,
      })
      .then(
        stats => {
          splitter.emit('insert', stats);
        },
        err => {
          splitter.emit('error', err);
        },
      );
  });

Lmk

mcollina commented 3 years ago

Having an option that enables you to select between bulk and index would be nice.

I don't understand your snippet; it does not seem directly applicable to this lib.
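One hypothetical shape for such an option (the names `op` and `flushCount` are illustrative only, not the lib's real options): choose a per-document handler up front, so the rest of the pipeline stays identical.

```javascript
// Sketch of a bulk-vs-index switch. `client` is assumed to expose
// index() and bulk() like @elastic/elasticsearch's client.
function buildHandler (opts, client) {
  if (opts.op === 'index') {
    // Index API: one request per log line; suits short-lived processes.
    return doc => client.index({ index: opts.index, body: doc })
  }
  // Bulk API (default): buffer docs and flush once enough accumulate.
  const queue = []
  return doc => {
    queue.push(doc)
    if (queue.length >= opts.flushCount) {
      const batch = queue.splice(0)
      return client.bulk({
        // bulk body alternates action lines and documents
        body: batch.flatMap(d => [{ index: { _index: opts.index } }, d])
      })
    }
  }
}
```

With this shape, the splitter's 'data' listener just calls the handler, and the option decides whether docs are sent immediately or batched.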

britztopher commented 3 years ago

@mcollina it shows how I reuse the splitter as a stream but use the Index API instead, to demonstrate how I implemented it against this lib.

I'll see what I can do to open an MR.