Open talvey opened 9 years ago
I can confirm; I just experienced the same problem.
My workaround is to convert the timestamp into a string with grok:
filter {
  grok {
    # copy @timestamp into a plain string field
    add_field => { "timestamp" => "%{@timestamp}" }
  }
}
and then
partition_key_format => "%{timestamp}"
Ugly, but seems to do the job.
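For what it's worth, the same copy can be done with the mutate filter instead of grok. This is only a sketch of the same idea, untested against this particular bug, using mutate's add_field:
filter {
  mutate {
    # copy @timestamp into a plain string field for later interpolation
    add_field => { "timestamp" => "%{@timestamp}" }
  }
}
partition_key_format => "%{timestamp}" then works as above.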
I solved it similarly (though perhaps even uglier) using the ruby filter, with something like the following:
ruby {
  code => "event['timestampkey'] = event['@timestamp'].to_s"
}
and then
partition_key_format => "%{timestampkey}"
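For context, here is how that workaround slots into a full pipeline with the 1.5-era kafka output. The broker address and topic name below are placeholders, so treat this only as a sketch:
filter {
  ruby {
    # store @timestamp as a plain string so interpolation yields a string
    code => "event['timestampkey'] = event['@timestamp'].to_s"
  }
}
output {
  kafka {
    broker_list => "localhost:9092"   # placeholder broker
    topic_id => "logs"                # placeholder topic
    partition_key_format => "%{timestampkey}"
  }
}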
In Logstash 1.5.0 I was using
partition_key_format => "%{@timestamp}"
to semi-randomly distribute messages across partitions. After upgrading to Logstash 1.5.2, the Kafka output plugin stopped producing messages. Oddly, no errors indicating a problem were evident (even in --debug mode). If the partition_key_format value is a string, the output plugin produces messages correctly, but if the value is a date (or, I suspect, a number) it does not. I suspect this has to do with the new string_interpolation functions.
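To make the reported behaviour concrete, the difference is roughly this (the host field is only an example of a plain string field):
# produces messages after 1.5.2: the interpolated value is already a string
partition_key_format => "%{host}"
# silently stops producing after 1.5.2: @timestamp interpolates to a date, not a string
partition_key_format => "%{@timestamp}"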