pinojs / pino-elasticsearch

🌲 load pino logs into Elasticsearch
MIT License

Can't send nested objects #42

Open ly4she opened 4 years ago

ly4she commented 4 years ago

When I pass nested objects as payload to Elasticsearch, I get ResponseError: mapper_parsing_exception.

Code to reproduce:

const pino = require('pino');
const pinoElastic = require('pino-elasticsearch');

const streamToElastic = pinoElastic({
  index: 'pino',
  type: 'log',
  consistency: 'one',
  node: 'http://localhost:9200',
  'es-version': 6,
  'bulk-size': 200,
  ecs: true,
});

streamToElastic.on('insertError', err => {
  console.error('pino-elasticsearch: insertError', err);
});

const logger = pino({ level: 'info' }, streamToElastic);

logger.info({ a: { b: 11 } }, 'hello world');

Error I've got:

ResponseError: mapper_parsing_exception
    at IncomingMessage.<anonymous> (C:\mscs\node_modules\@elastic\elasticsearch\lib\Transport.js:294:25)
    at IncomingMessage.emit (events.js:215:7)
    at endReadableNT (_stream_readable.js:1184:12)
    at processTicksAndRejections (internal/process/task_queues.js:80:21) {
  name: 'ResponseError',
  meta: {
    body: { error: [Object], status: 400 },
    statusCode: 400,
    headers: {
      warning: '299 Elasticsearch-7.5.1-3ae9ac9a93c95bd0cdc054951cf95d88e1e18d96 "[types removal] Specifying types in document index requests is deprecated, use the typeless endpoints instead (/{index}/_doc/{id}, /{index}/_doc, or /{index}/_create/{id})."',
      'content-type': 'application/json; charset=UTF-8',
      'content-length': '638'
    },
    warnings: [
      '299 Elasticsearch-7.5.1-3ae9ac9a93c95bd0cdc054951cf95d88e1e18d96 "[types removal] Specifying types in document index requests is deprecated, use the typeless endpoints instead (/{index}/_doc/{id}, /{index}/_doc, or /{index}/_create/{id})."'
    ],
    meta: {
      context: null,
      request: [Object],
      name: 'elasticsearch-js',
      connection: [Object],
      attempts: 0,
      aborted: false
    }
  }
}

I think it is the transport's responsibility to serialize such objects, so I don't want to stringify them manually every time. It would also be great to have an option such as pretty: true for prettifying the output, maybe via JSON.stringify(payload, undefined, 2).
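Until the transport handles this, one possible workaround (a hedged sketch, not part of pino-elasticsearch; the helper name is made up) is to stringify nested fields yourself before logging, so each value reaches Elasticsearch as a plain string instead of an object that can clash with the index mapping:

```javascript
// Hypothetical workaround (not part of pino-elasticsearch): stringify any
// nested object in the payload so Elasticsearch indexes it as a string
// field rather than an object, avoiding mapping conflicts.
function stringifyNested(payload) {
  const out = {};
  for (const [key, value] of Object.entries(payload)) {
    out[key] = value !== null && typeof value === 'object'
      ? JSON.stringify(value)
      : value;
  }
  return out;
}

// Usage with the logger from the repro above:
// logger.info(stringifyNested({ a: { b: 11 } }), 'hello world');
// logs { a: '{"b":11}' } instead of { a: { b: 11 } }
```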

mcollina commented 4 years ago

cc @delvedor would you be able to help?

Eomm commented 4 years ago

@ly4she what version of elastic are you using?

With this config:

docker run -d -p 9200:9200 --rm --name elastic docker.elastic.co/elasticsearch/elasticsearch:6.2.3

You will get:

{
  "took": 58,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 1,
    "max_score": 1.0,
    "hits": [
      {
        "_index": "pinotest",
        "_type": "log",
        "_id": "nuvHrW8B46OCmJcu1eNa",
        "_score": 1.0,
        "_source": {
          "pino": {
            "a": {
              "b": 11
            }
          },
          "ecs": {
            "version": "1.0.0"
          },
          "@timestamp": "--cut--",
          "message": "hello world",
          "log": {
            "level": 30
          },
          "host": {
            "hostname": "EOMM"
          },
          "process": {
            "pid": 14960
          }
        }
      }
    ]
  }
}
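For reference, a response like the one above can be fetched with a match-all search against the index (index name assumed from this repro; adjust to yours):

```shell
# Hypothetical query: list all documents in the 'pinotest' index
# on the local Elasticsearch 6.x container started above.
curl 'http://localhost:9200/pinotest/_search?pretty'
```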
ly4she commented 4 years ago

@Eomm Elasticsearch-7.5.1

Eomm commented 4 years ago

You should check the mapping of the index in your Elasticsearch instance:

curl --location --request GET 'http://localhost:9200/pino/_mapping/log'

The problem here is that there is a mismatch between the mapping created the first time the index was written and the new log document.

I suggest deleting the index and retrying, for testing purposes of course.
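Concretely, the check-and-reset cycle might look like this (a sketch assuming the index name and host from the repro above; deleting an index destroys its data, so only do this in testing):

```shell
# 1. Inspect the current mapping of the 'pino' index. Note that ES 7.x
#    removed mapping types, so omit the '/log' type suffix here.
curl 'http://localhost:9200/pino/_mapping?pretty'

# 2. Delete the index so the next indexed log recreates it with a
#    fresh mapping derived from the new document shape.
curl -X DELETE 'http://localhost:9200/pino'

# 3. Re-run the logger; the first document now defines the mapping.
```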