confluentinc / kafka-connect-jdbc

Kafka Connect connector for JDBC-compatible databases

Numeric Overflow when using best_fit matching with NUMERIC with precision: '0' and scale: '0' #1066

Closed (cstmgl closed this issue 3 years ago)

cstmgl commented 3 years ago

I think my problem with NUMERIC is similar to https://github.com/confluentinc/kafka-connect-jdbc/issues/480, except that I'm using Oracle.

I'm using version 5.5.2 and I get:

    java.sql.SQLException: Numeric Overflow
    io.confluent.connect.jdbc.dialect.GenericDatabaseDialect.lambda$columnConverterFor$18(GenericDatabaseDialect.java:1276)

I turned the log level up to DEBUG and I get:

"message":"NUMERIC with precision: '0' and scale: '0'","class":"io.confluent.connect.jdbc.dialect.GenericDatabaseDialect"

Is there any way around this?

My connector config looks something like this:

    mode: bulk
    query:  SELECT ORDER_TYPE_ID, COUNT (*) AS NUM_RECORDS FROM DOC GROUP BY ORDER_TYPE_ID
    schema.pattern: K
    numeric.mapping: best_fit

It's a rather simple connector, and I really just want a count of the records of a certain type, but I don't understand why this fails.
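
To see where the precision: '0' and scale: '0' in the debug message comes from, it helps to look at what the driver itself reports for the COUNT(*) column. A quick check with plain JDBC, something along these lines (connection details are placeholders), shows the metadata the connector is working from:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;
    import java.sql.Statement;

    public class CheckReportedPrecision {
      public static void main(String[] args) throws Exception {
        // Placeholder connection details; substitute your own.
        String url = "jdbc:oracle:thin:@//dbhost:1521/SERVICE";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT ORDER_TYPE_ID, COUNT(*) AS NUM_RECORDS FROM DOC GROUP BY ORDER_TYPE_ID")) {
          ResultSetMetaData md = rs.getMetaData();
          // Column 2 is NUM_RECORDS; an unsized Oracle NUMBER such as COUNT(*)
          // is reported without an explicit precision, which is what the
          // connector's debug message is echoing.
          System.out.println("precision=" + md.getPrecision(2) + " scale=" + md.getScale(2));
        }
      }
    }

With numeric.mapping: best_fit there is then no usable precision for the connector to fit the value into, which is presumably where the overflow comes from.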

atrbgithub commented 3 years ago

@cstmgl were you able to resolve this?

We've just upgraded from Confluent 5.5.0 (with ojdbc7) to Confluent 6.2.0 (with ojdbc8) and are now seeing this.

cstmgl commented 3 years ago

I think this is by design, so I had to write my own dialect to handle it.
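
Roughly, the idea is to subclass OracleDatabaseDialect and override the schema mapping and the column converter so that a NUMERIC column reported with precision 0 is treated as something wide enough to hold the value (INT64 in this sketch) instead of being narrowed by best_fit. This is only a sketch: the package and class names are made up, and the exact override signatures should be checked against the kafka-connect-jdbc version you build against.

    package com.example.connect.dialect;

    import io.confluent.connect.jdbc.dialect.DatabaseDialect;
    import io.confluent.connect.jdbc.dialect.DatabaseDialectProvider.SubprotocolBasedProvider;
    import io.confluent.connect.jdbc.dialect.OracleDatabaseDialect;
    import io.confluent.connect.jdbc.source.ColumnMapping;
    import io.confluent.connect.jdbc.util.ColumnDefinition;
    import org.apache.kafka.common.config.AbstractConfig;
    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.data.SchemaBuilder;

    import java.sql.Types;

    public class SafeNumericOracleDialect extends OracleDatabaseDialect {

      public SafeNumericOracleDialect(AbstractConfig config) {
        super(config);
      }

      // NUMBER columns with no declared precision (e.g. COUNT(*)) show up as
      // NUMERIC with precision 0; treat them as 64-bit integers instead of
      // letting best_fit narrow them down until the read overflows.
      private boolean isUnsizedNumeric(ColumnDefinition defn) {
        return defn.type() == Types.NUMERIC && defn.precision() == 0;
      }

      @Override
      public String addFieldToSchema(ColumnDefinition columnDefn, SchemaBuilder builder) {
        if (isUnsizedNumeric(columnDefn)) {
          String fieldName = columnDefn.id().aliasOrName();
          builder.field(
              fieldName,
              columnDefn.isOptional() ? Schema.OPTIONAL_INT64_SCHEMA : Schema.INT64_SCHEMA
          );
          return fieldName;
        }
        return super.addFieldToSchema(columnDefn, builder);
      }

      @Override
      protected ColumnConverter columnConverterFor(
          ColumnMapping mapping, ColumnDefinition defn, int col, boolean isJdbc4) {
        if (isUnsizedNumeric(defn)) {
          // Read as long to match the INT64 schema above.
          return rs -> rs.getLong(col);
        }
        return super.columnConverterFor(mapping, defn, col, isJdbc4);
      }

      // Lets the dialect be picked explicitly via dialect.name=SafeNumericOracleDialect.
      public static class Provider extends SubprotocolBasedProvider {
        public Provider() {
          super(SafeNumericOracleDialect.class.getSimpleName(), "oracle");
        }

        @Override
        public DatabaseDialect create(AbstractConfig config) {
          return new SafeNumericOracleDialect(config);
        }
      }
    }

The Provider also has to be registered for Java's ServiceLoader (a META-INF/services/io.confluent.connect.jdbc.dialect.DatabaseDialectProvider entry in the plugin jar, if I remember right), and the connector then selects it with dialect.name.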