logstash-plugins / logstash-input-jdbc

Logstash Plugin for JDBC Inputs
Apache License 2.0

timestamp sql_last_value misses records inserted in the exact same second #313

Closed hmh-the-one closed 6 years ago

hmh-the-one commented 6 years ago

I have a table into which records are inserted frequently, more than 10 per second. My logstash configuration is:

schedule => " *"
statement => "... update_date > :sql_last_value"

Say 12 rows are inserted at 12:00:00, and logstash happens to run at that same moment. The actual statement executed looks like "... update_date > 11:59:00", so the time span covered is (11:59:00, 12:00:00]. The problem is that of the rows inserted at exactly 12:00:00, some are committed before logstash queries and some after. The rows committed after the query will never be synchronized to elasticsearch, because the next run covers the span (12:00:00, 12:01:00]. Any idea how to solve this problem?
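The race described above can be sketched in a few lines. This is a hypothetical simulation, not the plugin's actual code: `query` mimics a `WHERE update_date > :sql_last_value` filter over second-granularity timestamps, and all names are illustrative.

```python
# Simulate how a strictly-greater-than comparison on a second-granularity
# timestamp column can permanently skip rows committed late in that second.

def query(rows, sql_last_value):
    """Mimic: SELECT * FROM t WHERE update_date > :sql_last_value."""
    return [r for r in rows if r["update_date"] > sql_last_value]

rows = [
    {"id": 1, "update_date": "11:59:30"},
    {"id": 2, "update_date": "12:00:00"},  # committed before the 12:00:00 run
]

sql_last_value = "11:59:00"
seen = query(rows, sql_last_value)                    # picks up ids 1 and 2
sql_last_value = max(r["update_date"] for r in seen)  # advances to "12:00:00"

# A row committed at 12:00:00 but *after* the query already ran:
rows.append({"id": 3, "update_date": "12:00:00"})

# The next run uses "> 12:00:00", so id 3 is never returned again.
missed = query(rows, sql_last_value)
print([r["id"] for r in missed])  # → []
```

Because the tracking value equals the timestamp of the late-committed row, the strict `>` comparison excludes it on every subsequent run.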

hmh-the-one commented 6 years ago

Found a workaround: update_date > :sql_last_value AND update_date <= now() - 30s, so the end of the query window always lies in the past.
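In a full input block the workaround might look like the sketch below. This is an assumption, not a confirmed configuration from the thread: the table name `my_table` is hypothetical, and `NOW() - INTERVAL 30 SECOND` is MySQL syntax for the "- 30s" shorthand above (other databases spell the interval arithmetic differently).

```
input {
  jdbc {
    # connection and schedule settings omitted
    statement => "SELECT * FROM my_table
                  WHERE update_date > :sql_last_value
                    AND update_date <= NOW() - INTERVAL 30 SECOND
                  ORDER BY update_date ASC"
  }
}
```

The upper bound keeps the window's end 30 seconds in the past, so every row with a timestamp inside the window is already committed by the time the query runs, at the cost of up to 30 seconds of extra ingestion latency.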