elastic / kibana

Your window into the Elastic Stack
https://www.elastic.co/products/kibana

Kibana csv export timerange error #200713

Open vanitasKE opened 3 days ago

vanitasKE commented 3 days ago

Kibana version: 8.15.2

Elasticsearch version: 8.15.2
Server OS version: Debian 11
Browser version: Chrome 131.0.6778.70 (Official Build) (64-bit), up to date
Browser OS version: Chrome 131.0.6778.70 (Official Build) (64-bit)
Original install method (e.g. download page, yum, from source, etc.): apt-get

Describe the bug: When I want to export something to CSV in Kibana, I get the following error:

    document_parsing_exception
      Caused by: illegal_argument_exception: Expected text at 1:625 but found START_OBJECT
      Root causes:
        document_parsing_exception: [1:728] failed to parse field [payload.searchSource.filter.query.range.@timestamp] of type [date] in document with id '828b9af6-509d-4ed3-bd65-16fa4c028aa9'. Preview of field's value: '{format=strict_date_optional_time, gte=2024-11-19T11:57:36.322Z, lte=2024-11-19T12:12:36.322Z}'

Steps to reproduce: 1. 2. 3.

Expected behavior: CSV export downloadable

Screenshots (if relevant): Image

Errors in browser console (if relevant):

Provide logs and/or server output (if relevant):

    [2024-11-19T13:15:15.082+01:00][ERROR][plugins.reporting] ResponseError: document_parsing_exception
      Caused by: illegal_argument_exception: Expected text at 1:625 but found START_OBJECT
      Root causes:
        document_parsing_exception: [1:728] failed to parse field [payload.searchSource.filter.query.range.@timestamp] of type [date] in document with id '828b9af6-509d-4ed3-bd65-16fa4c028aa9'. Preview of field's value: '{format=strict_date_optional_time, gte=2024-11-19T11:57:36.322Z, lte=2024-11-19T12:12:36.322Z}'
        at KibanaTransport.request (/usr/share/kibana/node_modules/@elastic/transport/lib/Transport.js:507:27)
        at processTicksAndRejections (node:internal/process/task_queues:95:5)
        at KibanaTransport.request (/usr/share/kibana/node_modules/@kbn/core-elasticsearch-client-server-internal/src/create_transport.js:59:16)
        at ClientTraced.IndexApi [as index] (/usr/share/kibana/node_modules/@elastic/elasticsearch/lib/api/api/index.js:51:12)
        at ReportingStore.indexReport (/usr/share/kibana/node_modules/@kbn/reporting-plugin/server/lib/store/store.js:106:12)
        at ReportingStore.addReport (/usr/share/kibana/node_modules/@kbn/reporting-plugin/server/lib/store/store.js:129:30)
        at RequestHandler.enqueueJob (/usr/share/kibana/node_modules/@kbn/reporting-plugin/server/routes/common/generate/request_handler.js:107:20)
        at RequestHandler.handleGenerateRequest (/usr/share/kibana/node_modules/@kbn/reporting-plugin/server/routes/common/generate/request_handler.js:204:16)
        at /usr/share/kibana/node_modules/@kbn/reporting-plugin/server/routes/internal/generate/generate_from_jobparams.js:41:16
        at Router.handle (/usr/share/kibana/node_modules/@kbn/core-http-router-server-internal/src/router.js:194:30)
        at handler (/usr/share/kibana/node_modules/@kbn/core-http-router-server-internal/src/router.js:128:50)
        at exports.Manager.execute (/usr/share/kibana/node_modules/@hapi/hapi/lib/toolkit.js:60:28)
        at Object.internals.handler (/usr/share/kibana/node_modules/@hapi/hapi/lib/handler.js:46:20)
        at exports.execute (/usr/share/kibana/node_modules/@hapi/hapi/lib/handler.js:31:20)
        at Request._lifecycle (/usr/share/kibana/node_modules/@hapi/hapi/lib/request.js:371:32)
        at Request._execute (/usr/share/kibana/node_modules/@hapi/hapi/lib/request.js:281:9)

Any additional context:

elasticmachine commented 3 days ago

Pinging @elastic/appex-sharedux (Team:SharedUX)

tsullivan commented 3 days ago

Hi @vanitasKE it looks like we need steps to reproduce here.

  1. Can you give us a sample of documents we can ingest that will cause the error when we try to export?
  2. Are you using range field type? https://www.elastic.co/guide/en/elasticsearch/reference/8.15/range.html
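
For reference, a `range` field (question 2 above) stores an object with `gte`/`lte` bounds rather than a single value, which is the shape shown in the error's value preview. A minimal Dev Tools sketch of a `date_range` field, assuming a hypothetical index name `range-demo`:

    # Map a date_range field (index name is only an example)
    PUT /range-demo
    {
      "mappings": {
        "properties": {
          "window": { "type": "date_range" }
        }
      }
    }

    # A date_range value is an object with gte/lte bounds
    POST /range-demo/_doc
    {
      "window": {
        "gte": "2024-11-19T11:57:36.322Z",
        "lte": "2024-11-19T12:12:36.322Z"
      }
    }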
vanitasKE commented 2 days ago

Hi @tsullivan, this is the JSON document. It's happening in every index, with every event, for everything I want to export.

    {
      "_index": "linux-000269",
      "_id": "BK9ASJMBrSKHMrjK1aID",
      "_version": 1,
      "_score": 0,
      "_source": {
        "syslogrelay1-ip": "10.1.5.20",
        "facility": "authpriv",
        "type": "linux",
        "device-timestamp": "2024-11-20T06:27:08.264Z",
        "srv-ip": "10.10.10.10",
        "srv-name": "srv-name",
        "service": "sudo",
        "timestamp-difference": 0,
        "@timestamp": "2024-11-20T06:27:08.264Z",
        "event": {},
        "log": { "file": { "path": "/var/ls/remotelogs/linux/ba-locofwd.log" } },
        "msg": "robot_autorepl : PWD=/ ; USER=root ; COMMAND=/usr/sbin/blockdev --flushbufs /dev/loop110",
        "host": "parser03",
        "severity": "notice"
      },
      "fields": {
        "msg": [ "robot_autorepl : PWD=/ ; USER=root ; COMMAND=/usr/sbin/blockdev --flushbufs /dev/loop110" ],
        "severity": [ "notice" ],
        "srv-ip": [ "10.1.156.122" ],
        "type": [ "linux" ],
        "msg.keyword": [ "robot_autorepl : PWD=/ ; USER=root ; COMMAND=/usr/sbin/blockdev --flushbufs /dev/loop110" ],
        "severity.keyword": [ "notice" ],
        "facility.keyword": [ "authpriv" ],
        "@timestamp": [ "2024-11-20T06:27:08.264Z" ],
        "service": [ "sudo" ],
        "srv-name": [ "srv-name" ],
        "type.keyword": [ "linux" ],
        "log.file.path": [ "/var/ls/remotelogs/linux/ba-locofwd.log" ],
        "srv-name.keyword": [ "srv-name" ],
        "host": [ "parser03" ],
        "service.keyword": [ "sudo" ],
        "timestamp-difference": [ 0 ],
        "device-timestamp": [ "2024-11-20T06:27:08.264Z" ],
        "host.keyword": [ "parser03" ],
        "facility": [ "authpriv" ],
        "syslogrelay1-ip": [ "10.10.10.10" ],
        "log.file.path.keyword": [ "/var/ls/remotelogs/linux/ba-locofwd.log" ]
      }
    }

Also, when I click "See full error", I see this:

    Error: Internal Server Error
        at fetch_Fetch.fetchResponse (https://elk.com/5a522bfe14bc/bundles/core/core.entry.js:16:219907)
        at async https://elk.com/5a522bfe14bc/bundles/core/core.entry.js:16:217899
        at async https://elk.com/5a522bfe14bc/bundles/core/core.entry.js:16:217856

I use the date field type for @timestamp, and I am choosing the time range in the Kibana Discover tab, on the top right side. Image

tsullivan commented 2 days ago

@vanitasKE let me walk through the steps with the information I have, and please try to correct me where the steps need to change.

  1. Go to Dev Tools and add data to Elasticsearch in an index called my-linux

    # Create an index
    PUT /my-linux
    
    # Add a document
    POST /my-linux/_doc
    {
      "id": "BK9ASJMBrSKHMrjK1aID",
      "syslogrelay1-ip": "10.1.5.20",
      "facility": "authpriv",
      "type": "linux",
      "device-timestamp": "2024-11-20T06:27:08.264Z",
      "srv-ip": "10.10.10.10",
      "srv-name": "srv-name",
      "service": "sudo",
      "timestamp-difference": 0,
      "@timestamp": "2024-11-20T06:27:08.264Z",
      "event": {},
      "log": {
        "file": {
          "path": "/var/ls/remotelogs/linux/ba-locofwd.log"
        }
      },
      "msg": "robot_autorepl : PWD=/ ; USER=root ; COMMAND=/usr/sbin/blockdev --flushbufs /dev/loop110",
      "host": "parser03",
      "severity": "notice"
    }
  2. Create a data view in Kibana, using @timestamp as the timestamp field Image
  3. Use the data view to create a saved search in Discover. Choose a relative time range in the time picker (top right side) that matches the data. Image
  4. Click "Share" > "Export" > "Generate CSV". This creates a report
  5. At this point, my report is created successfully and I can download and view the CSV file in a spreadsheet application Image

There is something missing from the steps to reproduce this issue. If you try the above steps in your environment, does the export work on your side?

tsullivan commented 2 days ago

@vanitasKE one thing I note is that the JSON document you provided has a different ID than the one given in the error message.

Your latest message says:

This is the JSON document. ... "_id": "BK9ASJMBrSKHMrjK1aID",

But the error says:

in document with id '828b9af6-509d-4ed3-bd65-16fa4c028aa9'

I suspect the 828b9af6-509d-4ed3-bd65-16fa4c028aa9 document has an invalid value for the @timestamp field which needs to be addressed in your data. I think the field's value is {format=strict_date_optional_time, gte=2024-11-19T11:57:36.322Z, lte=2024-11-19T12:12:36.322Z}, which is not a valid date format. It looks to be a range field type format.
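
The parse failure described above can be reproduced in isolation: when a field is mapped as `date`, indexing an object with `gte`/`lte` bounds into it fails with the same exception. A minimal sketch, assuming a hypothetical index name `parse-demo`:

    # A field mapped as date (index name is only an example)
    PUT /parse-demo
    {
      "mappings": {
        "properties": {
          "@timestamp": { "type": "date" }
        }
      }
    }

    # Indexing a range-shaped object into it is rejected with a
    # document_parsing_exception ("Expected text ... but found START_OBJECT")
    POST /parse-demo/_doc
    {
      "@timestamp": {
        "format": "strict_date_optional_time",
        "gte": "2024-11-19T11:57:36.322Z",
        "lte": "2024-11-19T12:12:36.322Z"
      }
    }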

vanitasKE commented 1 day ago

Hello @tsullivan ,

  1. I created the index Image

  2. Uploaded data to it Image

  3. Created the data view Image

  4. Created and saved the search; the data is visible Image

  5. But when I want to create the report (click "Share" > "Export" > "Generate CSV"), I get the same error.

Image

vanitasKE commented 1 day ago

@vanitasKE one thing I note is that the JSON document you provided has a different ID than the one given in the error message.

Your latest message says:

This is the JSON document. ... "_id": "BK9ASJMBrSKHMrjK1aID",

But the error says:

in document with id '828b9af6-509d-4ed3-bd65-16fa4c028aa9'

I suspect the 828b9af6-509d-4ed3-bd65-16fa4c028aa9 document has an invalid value for the @timestamp field which needs to be addressed in your data. I think the field's value is {format=strict_date_optional_time, gte=2024-11-19T11:57:36.322Z, lte=2024-11-19T12:12:36.322Z}, which is not a valid date format. It looks to be a range field type format.

Should I change the @timestamp field type to range instead of date?

tsullivan commented 1 day ago

@vanitasKE, can you provide the mapping for your original index? Try opening the Console app and running:

GET /linux-000269/_mapping

Please include the response in this ticket. Thanks!

vanitasKE commented 8 hours ago

@vanitasKE, can you provide the mapping for your original index? Try opening the Console app and running:

GET /linux-000269/_mapping

Please include the response in this ticket. Thanks!

Hello, this is my mapping:

    {
      "linux-000269": {
        "mappings": {
          "properties": {
            "@timestamp": { "type": "date" },
            "cmd": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "cwd": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "device-timestamp": { "type": "date" },
            "event": { "type": "object" },
            "facility": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "filename": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "host": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "location": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "log": { "properties": { "file": { "properties": { "path": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } } } } } },
            "message": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "msg": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "pid": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "service": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "severity": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "sid": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "srv-ip": { "type": "ip" },
            "srv-name": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "syslogrelay1-ip": { "type": "ip" },
            "syslogrelay2-ip": { "type": "ip" },
            "syslogrelay2-timestamp": { "type": "date" },
            "tags": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "timestamp-difference": { "type": "long" },
            "tty": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "type": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "uid": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
            "username": { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } }
          }
        }
      }
    }