fluent / fluentd-docs

This repository is deprecated. Go to fluentd-docs-gitbook repository.

parser_json: keep_time_key parameter for v1.0 #507

Closed ecristea closed 6 years ago

ecristea commented 6 years ago

Hi,

Is it correct to assume that the keep_time_key parameter is still supported in parser_json v1.0? (The keep_time_key parameter is documented for v0.12 but not mentioned for v1.0.)

Based on the attached example, it seems that keep_time_key is respected when sending a single JSON event but not when sending the same event as an array. I attached two data samples with the following content:

{"foo":"bar", "time": 1523692120}
[{"foo":"bar", "time": 1523692120}]

When feeding the 1st event, fluentd keeps the time field in the record:

2018-04-14 07:48:40.000000000 +0000 app.log: {"foo":"bar","time":1523692120}

However, when using the 2nd event, fluentd removes the time field from the record:

2018-04-14 07:48:40.000000000 +0000 app.log: {"foo":"bar"}

Is this expected? What configuration should be used to keep the time field in the (fluentd) record?

If it helps (assuming docker or docker-compose is available), fluentd can be started with docker (./runWithDocker.sh) or docker-compose (docker-compose up), and events can be sent using ./sendEvent.sh.

fluent_parser_json_keep_time_key.tar.gz
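The behaviour being asked about can be modelled in a few lines. This is a conceptual sketch of what keep_time_key does, not fluentd's actual implementation; parse_event is a hypothetical helper name:

```python
import json

def parse_event(text, time_key="time", keep_time_key=False):
    """Conceptual model of parser_json: read the event time from
    `time_key` and, unless keep_time_key is set, drop it from the record."""
    record = json.loads(text)
    time = record[time_key] if keep_time_key else record.pop(time_key)
    return time, record

# keep_time_key=True: the time field stays in the record
t, rec = parse_event('{"foo":"bar","time":1523692120}', keep_time_key=True)
print(t, rec)

# keep_time_key=False (the default): the time field is removed
t, rec = parse_event('{"foo":"bar","time":1523692120}')
print(t, rec)
```

The issue reported here is that, in the array case, the record comes out as if keep_time_key were false regardless of the setting.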

fujimotos commented 6 years ago

Is it correct to assume that the keep_time_key parameter is still supported in parser_json v1.0? (The keep_time_key parameter is documented for v0.12 but not mentioned for v1.0.)

Yes, it is supported in v1.0 too. The parameter definitions have been moved to another part of the documentation; please see that article for details of the available parameters.
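For reference, in v1.0 the parser settings live in a nested &lt;parse&gt; directive inside the input plugin. A minimal sketch of using keep_time_key there (the in_http source, port, and endpoint are assumptions for illustration, not taken from the attached archive):

```
<source>
  @type http
  port 8888
  <parse>
    @type json
    # "time" is parsed as the event timestamp; keep it in the record too
    time_type unixtime
    keep_time_key true
  </parse>
</source>
```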

What configuration should be used to keep the time field in the (fluentd) record?

There is no parameter available to modify this behaviour for now.

If you want an option to configure it, please submit an issue to the main project fluent/fluentd.

ecristea commented 6 years ago

Thank you, this can be closed.

fujimotos commented 6 years ago

@ecristea FYI, I have submitted a patch to address the issue you reported.

Follow fluent/fluentd#2020 for the progress on this patch.