Open smeana opened 2 years ago
Hello,
I am getting a message.max.bytes error when trying to read log files (plain text) of 300 MB. As far as I know, the connector moves the whole file to Kafka as a single record.
I haven't really tried it, but I'd look at using a converter for splitting the file.
Is there any option to split the file into mini batches and have an atomic transaction? In case the connector fails in the middle of the processing, can it reprocess only from where it left off?
Not yet. I am implementing that feature in Camel Core as part of CAMEL-15562. It's progressing, and should be in Core in a few versions if everything goes alright.
Regards
Thanks @orpiske. I was expecting the connector to do that, as files tend to be bigger than Kafka's message.max.bytes.