erdemcer / kafka-connect-oracle

Kafka Source Connector For Oracle
Apache License 2.0
349 stars · 167 forks

Fetch size parameter discussion #82

Open yymoxiaochi opened 4 years ago

yymoxiaochi commented 4 years ago

Hi Erdem, as far as I know, when fetchSize > 1 but only one redo SQL statement is available, LogMiner output will not be captured promptly because the DB fetch size has not been reached. But when fetchSize = 1 and the number of SQL statements is particularly high, the connector captures very slowly. Is there any way to solve this problem, similar to the Kafka producer's configuration "batch.size" and "linger.ms"? Thanks.

erdemcer commented 4 years ago

Hi, as you said, when fetch size > 1 the connector waits until a number of records equal to the fetch size arrives from the result set of the Oracle LogMiner view query. As I understand it, you would like the ability to capture data with a higher fetch size within some specific duration, even if the number of records required for a fetch has not been reached. Am I right? Thanks

yymoxiaochi commented 4 years ago

Yes, you are right. If there are only 10 records and the fetch size is 100, I want to be able to fetch the 10 records immediately, or have the connector return whatever data has been captured once a specified time elapses, even if the fetch size has not been reached.
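The behavior requested here mirrors the Kafka producer's `batch.size` / `linger.ms` pair: ship a batch when it fills, or when a linger timeout expires, whichever comes first. A minimal sketch of those semantics in plain Java (the names `drainBatch`, `maxBatch`, and `lingerMs` are illustrative only, not part of the connector):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class LingerBatcher {
    // Drain up to maxBatch records, but return early with whatever has
    // arrived once lingerMs elapses, analogous to batch.size / linger.ms.
    static List<String> drainBatch(BlockingQueue<String> queue,
                                   int maxBatch, long lingerMs)
            throws InterruptedException {
        List<String> batch = new ArrayList<>();
        long deadline = System.nanoTime() + TimeUnit.MILLISECONDS.toNanos(lingerMs);
        while (batch.size() < maxBatch) {
            long remaining = deadline - System.nanoTime();
            if (remaining <= 0) {
                break; // linger time expired: ship a partial batch
            }
            String record = queue.poll(remaining, TimeUnit.NANOSECONDS);
            if (record == null) {
                break; // timed out waiting: ship whatever we have
            }
            batch.add(record);
        }
        return batch;
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        queue.add("redo-1");
        queue.add("redo-2");
        // Only 2 records available with a batch size of 100: the call
        // returns the 2 records after ~50 ms instead of blocking until
        // 98 more records arrive.
        List<String> batch = drainBatch(queue, 100, 50);
        System.out.println(batch.size()); // prints 2
    }
}
```

With the scenario above (10 records, fetch size 100), such a loop would hand back the 10 records as soon as the linger timeout fires rather than waiting for the batch to fill.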

yymoxiaochi commented 4 years ago

Hi, is there any solution for this?

erdemcer commented 4 years ago

Hi, it should be possible, but of course it requires development. I am planning out the details to achieve this. Thanks.

yymoxiaochi commented 4 years ago

Is there any general direction? In my test, if DBMS_LOGMNR.START_LOGMNR is started with an ENDSCN, I can capture all the SQL in that SCN range.

erdemcer commented 4 years ago

What do you mean by general direction?

yymoxiaochi commented 4 years ago

"I am planning some details to achieve this." Could you tell me in what way you intend to solve this problem? By overriding some of the Oracle JDBC implementation methods? :)

erdemcer commented 4 years ago

Not actually. I am planning to add a timer which controls a specified duration and propagates data that has not yet been sent to the Kafka topic within the JDBC fetch process. Thanks.
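The timer idea described here could be sketched as a background task that periodically flushes whatever the fetch loop has buffered, so a lone record never waits for a full fetch batch. This is only an illustration of the design, assuming a hypothetical `TimedFlusher` helper and a generic sink callback standing in for the connector's Kafka producer:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

public class TimedFlusher {
    private final List<String> buffer = new ArrayList<>();
    private final Consumer<List<String>> sink; // e.g. hands records to the Kafka producer
    private final ScheduledExecutorService timer =
            Executors.newSingleThreadScheduledExecutor();

    TimedFlusher(Consumer<List<String>> sink, long flushIntervalMs) {
        this.sink = sink;
        // The timer fires every flushIntervalMs and pushes out whatever
        // the fetch loop has buffered, even if the batch is not full yet.
        timer.scheduleAtFixedRate(this::flush, flushIntervalMs,
                                  flushIntervalMs, TimeUnit.MILLISECONDS);
    }

    // Called by the JDBC fetch loop for each row read from the LogMiner view.
    synchronized void add(String record) {
        buffer.add(record);
    }

    synchronized void flush() {
        if (!buffer.isEmpty()) {
            sink.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }

    void close() {
        timer.shutdown();
        flush(); // final flush of any remaining buffered records
    }
}
```

With something like this, the main fetch loop can keep a large fetchSize for throughput while the timer guarantees an upper bound on how long any captured record waits before being propagated.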

yymoxiaochi commented 4 years ago

Got it, thanks. Looking forward to your next update~~~

taotaizhu-pw commented 2 years ago

> Got it, thanks. Looking forward to your next update~~~

Any update?

taotaizhu-pw commented 2 years ago

> Not actually. I am planning to add a timer which controls a specified duration and propagates data that has not yet been sent to the Kafka topic within the JDBC fetch process. Thanks.

Any update on this? I found the performance is poor when fetching the data one row at a time.