Closed: Aload closed this issue 2 years ago.
When I use v0.3.2-patch7, a different set of errors resurfaces, e.g.:
```
Caused by: java.lang.RuntimeException: Writing records to JDBC failed.
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.checkFlushException(JdbcBatchingOutputFormat.java:153)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.flush(JdbcBatchingOutputFormat.java:179)
    at org.apache.flink.connector.jdbc.internal.GenericJdbcSinkFunction.snapshotState(GenericJdbcSinkFunction.java:62)
    at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.trySnapshotFunctionState(StreamingFunctionUtils.java:118)
    at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.snapshotFunctionState(StreamingFunctionUtils.java:99)
    at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.snapshotState(AbstractUdfStreamOperator.java:89)
    at org.apache.flink.streaming.api.operators.StreamOperatorStateHandler.snapshotState(StreamOperatorStateHandler.java:218)
    ... 23 more
Caused by: java.lang.NullPointerException
    at com.clickhouse.client.data.BinaryStreamUtils.writeString(BinaryStreamUtils.java:1661)
    at com.clickhouse.client.data.ClickHouseRowBinaryProcessor$MappedFunctions.lambda$buildMappingsForDataTypes$65(ClickHouseRowBinaryProcessor.java:338)
    at com.clickhouse.client.data.ClickHouseRowBinaryProcessor$MappedFunctions.serialize(ClickHouseRowBinaryProcessor.java:485)
    at com.clickhouse.jdbc.internal.InputBasedPreparedStatement.addBatch(InputBasedPreparedStatement.java:295)
    at org.apache.flink.connector.jdbc.internal.executor.SimpleBatchStatementExecutor.executeBatch(SimpleBatchStatementExecutor.java:71)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.attemptFlush(JdbcBatchingOutputFormat.java:213)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.flush(JdbcBatchingOutputFormat.java:183)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.lambda$open$0(JdbcBatchingOutputFormat.java:127)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    ... 1 more
```
Hi @Aload, the issue happens when you're trying to update a non-array column using an array, for example trying to update a `String` column using `new String[] {"E5", "E6"}`.

Would you mind sharing the table structure (especially the column you failed to update using an array)? It would also be very helpful if you could provide a code snippet for reproducing the issue.

Update: please refer to a test case like this if you need to deal with arrays, or maybe this one for complex data types (e.g. an array of maps).
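For reference, a minimal sketch of binding an `Array(String)` parameter through plain JDBC. The connection URL and the table `t` (`CREATE TABLE t (id UInt32, tags Array(String)) ENGINE = Memory`) are assumptions for illustration, not from this thread:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ArrayBindingSketch {
    public static void main(String[] args) throws Exception {
        // Assumed table: CREATE TABLE t (id UInt32, tags Array(String)) ENGINE = Memory
        try (Connection conn = DriverManager.getConnection("jdbc:clickhouse://localhost:8123/default");
             PreparedStatement ps = conn.prepareStatement("INSERT INTO t (id, tags) VALUES (?, ?)")) {
            ps.setInt(1, 1);
            // The array parameter targets an Array(String) column; binding an array
            // to a plain String column is what produces the error discussed here.
            ps.setArray(2, conn.createArrayOf("String", new String[] {"E5", "E6"}));
            ps.addBatch();
            ps.executeBatch();
        }
    }
}
```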
Got it. But when I coded it in Scala, I couldn't achieve the desired effect:

```scala
pst.setArray(3, pst.getConnection.createArrayOf("String", Array[String]("3", null, "1")))
```
The same problem still occurs when I change it to what you suggested:
```java
pst.setArray(20, pst.getConnection().createArrayOf("String", new String[]{"3", null, "1"}));
```
```
Caused by: java.lang.RuntimeException: Writing records to JDBC failed.
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.checkFlushException(JdbcBatchingOutputFormat.java:153)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.flush(JdbcBatchingOutputFormat.java:179)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.close(JdbcBatchingOutputFormat.java:229)
    ... 11 more
Caused by: java.lang.IllegalArgumentException: Only singleton array is allowed, but we got: [3, null, 1]
    at com.clickhouse.client.ClickHouseValue.update(ClickHouseValue.java:1099)
    at com.clickhouse.jdbc.ClickHouseConnection.createArrayOf(ClickHouseConnection.java:40)
    at com.clickhouse.jdbc.ClickHouseConnection.createArrayOf(ClickHouseConnection.java:23)
    at com.anso.process.function.JdbcCkStatementBuilders.accept(JdbcCkStatementBuilders.java:37)
    at com.anso.process.function.JdbcCkStatementBuilders.accept(JdbcCkStatementBuilders.java:14)
    at org.apache.flink.connector.jdbc.internal.executor.SimpleBatchStatementExecutor.executeBatch(SimpleBatchStatementExecutor.java:70)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.attemptFlush(JdbcBatchingOutputFormat.java:213)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.flush(JdbcBatchingOutputFormat.java:183)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.lambda$open$0(JdbcBatchingOutputFormat.java:127)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
```
The exception is unrelated to your initial question or to arrays. It looks like you passed `null` to a non-nullable `String` column.
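To make the distinction concrete (my illustration, not code from this thread): whether `null` is accepted depends on whether the column is declared `Nullable`. A minimal sketch with hypothetical tables `t_plain` and `t_nullable`:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class NullableColumnSketch {
    public static void main(String[] args) throws Exception {
        // Assumed tables:
        //   CREATE TABLE t_plain    (s String)           ENGINE = Memory  -- rejects null
        //   CREATE TABLE t_nullable (s Nullable(String)) ENGINE = Memory  -- accepts null
        try (Connection conn = DriverManager.getConnection("jdbc:clickhouse://localhost:8123/default")) {
            try (PreparedStatement ps = conn.prepareStatement("INSERT INTO t_nullable (s) VALUES (?)")) {
                ps.setString(1, null); // OK: the column is Nullable(String)
                ps.executeUpdate();
            }
            try (PreparedStatement ps = conn.prepareStatement("INSERT INTO t_plain (s) VALUES (?)")) {
                ps.setString(1, null); // fails: null into a non-nullable String column
                ps.executeUpdate();
            }
        }
    }
}
```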
I have run into the same error since version 0.3.2-patch6; up to version 0.3.2-patch5 everything works fine.
I poked around a bit and found out that the JDBC driver has problems with empty strings: it seems to handle empty strings like null values.
Before patch6 everything worked fine, and this behavior still exists in patch9.
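A minimal repro sketch of the behavior being reported; the table name and DDL are assumptions for illustration, and whether the insert fails should depend only on the driver patch level:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class EmptyStringRepro {
    public static void main(String[] args) throws Exception {
        // Assumed table: CREATE TABLE t_plain (s String) ENGINE = Memory
        try (Connection conn = DriverManager.getConnection("jdbc:clickhouse://localhost:8123/default");
             PreparedStatement ps = conn.prepareStatement("INSERT INTO t_plain (s) VALUES (?)")) {
            // Per the report: 0.3.2-patch6 through patch9 appear to treat this empty
            // string like null, which the non-nullable column rejects; on
            // 0.3.2-patch5 and earlier it inserts as expected.
            ps.setString(1, "");
            ps.executeUpdate();
        }
    }
}
```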
The `nullAsDefault` option was added in 0.3.2-patch10; see the comments here.
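For anyone landing here, a sketch of how the option might be enabled. I am assuming from the linked comments that it is a regular JDBC connection property and that `2` means "replace null with the column type's default value", so double-check against the driver docs:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class NullAsDefaultSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Assumption based on the linked comments: with nullAsDefault=2 the driver
        // replaces null with the default value of the column's type (e.g. '' for
        // String), so inserting null into a non-nullable column no longer fails.
        props.setProperty("nullAsDefault", "2");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:clickhouse://localhost:8123/default", props)) {
            // ... use the connection as usual
        }
    }
}
```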
OK, thanks. This error is fine now.
When I use v0.3.2:

```scala
val array = pst.getConnection.createArrayOf(
  ClickHouseDataType.String.name(),
  ClickHouseArrayValue.of(intoMsg.withArrayJsonNode.elements().asScala.toArray).asArray())
pst.setArray(20, array) // alarms
pst.setInt(21, intoMsg.get("isInterpolate").asInt) // interpolation flag (0 = normal || 1 = interpolated)
```
Must the array in this update have size = 0?
```
Caused by: java.lang.RuntimeException: Writing records to JDBC failed.
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.checkFlushException(JdbcBatchingOutputFormat.java:153)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.flush(JdbcBatchingOutputFormat.java:179)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.close(JdbcBatchingOutputFormat.java:229)
    ... 11 more
Caused by: java.lang.IllegalArgumentException: Only singleton array is allowed, but we got: ["E5", "E6"]
    at com.clickhouse.client.ClickHouseValue.update(ClickHouseValue.java:1148)
    at com.clickhouse.jdbc.ClickHouseConnection.createArrayOf(ClickHouseConnection.java:40)
    at com.clickhouse.jdbc.ClickHouseConnection.createArrayOf(ClickHouseConnection.java:23)
    at com.anso.process.function.JdbcCkStatementBuilder.accept(JdbcCkStatementBuilder.scala:55)
    at com.anso.process.function.JdbcCkStatementBuilder.accept(JdbcCkStatementBuilder.scala:21)
    at org.apache.flink.connector.jdbc.internal.executor.SimpleBatchStatementExecutor.executeBatch(SimpleBatchStatementExecutor.java:70)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.attemptFlush(JdbcBatchingOutputFormat.java:213)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.flush(JdbcBatchingOutputFormat.java:183)
    at org.apache.flink.connector.jdbc.internal.JdbcBatchingOutputFormat.lambda$open$0(JdbcBatchingOutputFormat.java:127)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
```