Open am-beta opened 1 year ago
The offending section appears to be this "fallback" feature, which is enabled by default and can be disabled by setting `enable_fallback false` in your config.
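For reference, a minimal sketch of what disabling it might look like, assuming a typical `out_sql` match block (the tag, connection values, and table name here are placeholders, not from the issue):

```
<match my.logs>
  @type sql
  adapter postgresql
  host localhost
  database mydb
  enable_fallback false   # disable the one-by-one fallback described below
  <table>
    table logs
  </table>
</match>
```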
https://github.com/fluent/fluent-plugin-sql/blob/master/lib/fluent/plugin/out_sql.rb#L105-L136
What this feature does is, for certain types of database error, switch from batch processing to processing messages one by one; if further SQL errors occur, it simply drops the message.
This should probably be disabled by default, or changed so it doesn't just dump messages when it gets an unexpected response from Postgres, because this code path is triggered if you deliberately make your database read-only, or restart the database.
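A hedged sketch of the fallback behaviour described above — this is not the plugin's actual code, just an illustration of the batch-then-per-record-then-drop pattern:

```ruby
# Sketch: try to insert a whole batch; on failure, fall back to inserting
# records one by one, and drop any record that errors again.
# `insert` is a caller-supplied block standing in for the real DB write.
def insert_with_fallback(records, &insert)
  insert.call(records)
  records
rescue StandardError
  kept = []
  records.each do |record|
    begin
      insert.call([record])
      kept << record
    rescue StandardError
      # This is the "Got deterministic error again" case:
      # the record is logged and dropped, never retried.
      warn "dropped: #{record.inspect}"
    end
  end
  kept
end
```

The problem reported in this issue is the inner `rescue`: a transient outage (restart, read-only mode) looks the same as a genuinely bad record, so valid records are dropped instead of being retried.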
I have an input file containing about 400k lines, one JSON entry per line, read using the `tail` input, and I am using SQL output into Postgres. In normal conditions all entries are properly inserted. However, I need to make sure that a sudden stop of PostgreSQL will not cause Fluentd to lose records, and that these records will be retried. When I tried stopping PostgreSQL at a random moment during the insertion, about 9k entries were dumped because the plugin classified these errors as "deterministic errors", followed by 9k more `Got deterministic error again` messages, each with a dump of the record. After that, the Postgres connection error was detected and the plugin retried as it should until Postgres came back up, and the rest of the records were inserted — but the dumped ones are lost. The behaviour I expect is that no records should be dumped when Postgres is temporarily stopped.
This is my configuration:
Package versions: