
Apache Iceberg
https://iceberg.apache.org/
Apache License 2.0

FlinkSQL upsert doesn't support a timestamp column as a primary key #7707

Open norrishuang opened 1 year ago

norrishuang commented 1 year ago

Apache Iceberg version

1.1.0

Query engine

Flink

Please describe the bug 🐞

I used Flink SQL to upsert records into an Iceberg table whose primary key includes a timestamp column, but the write fails with the exception below (a minimal sketch of the table setup follows the stack trace):

java.lang.ArrayIndexOutOfBoundsException: 6
    at org.apache.flink.table.data.binary.BinarySegmentUtils.getLongSlowly(BinarySegmentUtils.java:744)
    at org.apache.flink.table.data.binary.BinarySegmentUtils.getLongMultiSegments(BinarySegmentUtils.java:738)
    at org.apache.flink.table.data.binary.BinarySegmentUtils.getLong(BinarySegmentUtils.java:726)
    at org.apache.flink.table.data.binary.BinarySegmentUtils.readTimestampData(BinarySegmentUtils.java:1022)
    at org.apache.flink.table.data.binary.BinaryRowData.getTimestamp(BinaryRowData.java:356)
    at org.apache.flink.table.data.RowData.lambda$createFieldGetter$39385f9c$1(RowData.java:260)
    at org.apache.flink.table.data.RowData.lambda$createFieldGetter$25774257$1(RowData.java:296)
    at org.apache.iceberg.flink.data.RowDataProjection.getValue(RowDataProjection.java:159)
    at org.apache.iceberg.flink.data.RowDataProjection.isNullAt(RowDataProjection.java:179)
    at org.apache.iceberg.flink.RowDataWrapper.get(RowDataWrapper.java:67)
    at org.apache.iceberg.types.JavaHashes$StructLikeHash.hash(JavaHashes.java:92)
    at org.apache.iceberg.types.JavaHashes$StructLikeHash.hash(JavaHashes.java:71)
    at org.apache.iceberg.util.StructLikeWrapper.hashCode(StructLikeWrapper.java:96)
    at java.util.HashMap.hash(HashMap.java:340)
    at java.util.HashMap.remove(HashMap.java:800)
    at org.apache.iceberg.util.StructLikeMap.remove(StructLikeMap.java:93)
    at org.apache.iceberg.io.BaseTaskWriter$BaseEqualityDeltaWriter.internalPosDelete(BaseTaskWriter.java:155)
    at org.apache.iceberg.io.BaseTaskWriter$BaseEqualityDeltaWriter.deleteKey(BaseTaskWriter.java:185)
    at org.apache.iceberg.flink.sink.BaseDeltaTaskWriter.write(BaseDeltaTaskWriter.java:84)
    at org.apache.iceberg.flink.sink.BaseDeltaTaskWriter.write(BaseDeltaTaskWriter.java:40)
    at org.apache.iceberg.flink.sink.IcebergStreamWriter.processElement(IcebergStreamWriter.java:72)
    at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask$StreamTaskNetworkOutput.emitRecord(OneInputStreamTask.java:233)
    at org.apache.flink.streaming.runtime.io.AbstractStreamTaskNetworkInput.processElement(AbstractStreamTaskNetworkInput.java:134)
    at org.apache.flink.streaming.runtime.io.AbstractStreamTaskNetworkInput.emitNext(AbstractStreamTaskNetworkInput.java:105)
    at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:542)
    at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:231)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:831)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:780)
    at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:935)
    at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:914)
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:728)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:550)
    at java.lang.Thread.run(Thread.java:750)
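
For reference, a minimal Flink SQL sketch of the kind of setup that triggers this; the catalog, table, and column names here are hypothetical:

    -- Hypothetical names throughout. 'format-version'='2' and
    -- 'write.upsert.enabled'='true' are required for upsert writes.
    CREATE TABLE `iceberg_catalog`.`db`.`events` (
        `event_time` TIMESTAMP(6) NOT NULL,
        `data`       STRING,
        PRIMARY KEY (`event_time`) NOT ENFORCED
    ) WITH (
        'format-version' = '2',
        'write.upsert.enabled' = 'true'
    );

    -- Each upserted row triggers an equality delete keyed on event_time.
    INSERT INTO `iceberg_catalog`.`db`.`events`
    SELECT `event_time`, `data` FROM `source_stream`;

With this configuration, every upsert emits an equality delete keyed on the timestamp column, which matches the BaseEqualityDeltaWriter.deleteKey path in the trace above.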
ConeyLiu commented 1 year ago

I think this should be fixed by #7836

github-actions[bot] commented 2 weeks ago

This issue has been automatically marked as stale because it has been open for 180 days with no activity. It will be closed in the next 14 days if no further activity occurs. To permanently prevent this issue from being considered stale, add the label 'not-stale', but commenting on the issue is preferred when possible.