Closed 7 months ago
I opened a PR to add case-insensitive column lookups during record conversion, to handle the case where the column name casing differs between the Kafka record and the Iceberg schema.
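A minimal sketch of the kind of case-insensitive lookup described above (hypothetical class and method names, not the actual connector code): resolve a record field against the schema's column names exactly first, then fall back to a lowercase comparison.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical helper: maps record field names to schema column positions,
// falling back to a case-insensitive match when the exact name is absent.
public class ColumnResolver {
    private final Map<String, Integer> exact = new LinkedHashMap<>();
    private final Map<String, Integer> lower = new LinkedHashMap<>();

    public ColumnResolver(String... schemaColumns) {
        for (int i = 0; i < schemaColumns.length; i++) {
            exact.put(schemaColumns[i], i);
            // First case-insensitive match wins if names collide by case.
            lower.putIfAbsent(schemaColumns[i].toLowerCase(), i);
        }
    }

    // Exact match first; otherwise case-insensitive; null if no match.
    public Integer resolve(String recordField) {
        Integer idx = exact.get(recordField);
        return idx != null ? idx : lower.get(recordField.toLowerCase());
    }
}
```

With a Trino-created schema (`vbeln`, `posnr`), a record field `VBELN` would then still resolve instead of producing nulls.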
The fix has been merged, feel free to reopen this if your issue is not resolved in the latest release.
When I create a table using Spark SQL, the column names are all uppercase and the connector runs normally. However, when the table is created using Trino, the column names are all lowercase, the connector throws an exception, and all the data is null.
Table demo created using Trino.
Table ods_zpp004 created using Spark SQL; this one runs normally.
"iceberg.tables.default-id-columns": "VBELN"![image](https://github.com/tabular-io/iceberg-kafka-connect/assets/44965565/48ab8144-c656-4df8-a4bc-81c4d10c36d3)