Closed by JingsongLi 1 year ago
@JingsongLi Could you assign this issue to me?
It looks like the managed-table-related methods have been removed from LogStoreTableFactory in this commit. Do we have other plans?
I have a similar bug, issue 815: when there are multiple buckets, an exception is caused by the number of Kafka partitions being 1.
As zhuangchong said in this issue, do we need to manage tables in LogStoreTableFactory?
@calvinjiang Feel free to create a PR
Ok, I'll fix this issue.
Search before asking
Paimon version
0.4
Compute Engine
Flink
Minimal reproduce step
If I use Kafka as the log system and set auto.create.topics.enable to false in Kafka, INSERTs do not work.
Steps to reproduce:
1. Start a Kafka broker and set auto.create.topics.enable to false.
2. Issue the following statements (a sketch of such statements is given below).
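The exact statements from the report are not shown here; the block below is only a minimal sketch of the kind of setup involved, written against Flink's Table API. The catalog name, warehouse path, table name, and topic name are all hypothetical, and the table options assume the documented Paimon Kafka log system options (log.system, kafka.bootstrap.servers, kafka.topic).

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class KafkaLogRepro {
    public static void main(String[] args) throws Exception {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Hypothetical Paimon catalog; the warehouse path is just an example.
        tEnv.executeSql(
                "CREATE CATALOG paimon WITH ("
                        + " 'type' = 'paimon',"
                        + " 'warehouse' = 'file:///tmp/paimon' )");
        tEnv.executeSql("USE CATALOG paimon");

        // Table backed by the Kafka log system; the topic name is hypothetical.
        tEnv.executeSql(
                "CREATE TABLE t (id INT, name STRING) WITH ("
                        + " 'log.system' = 'kafka',"
                        + " 'kafka.bootstrap.servers' = 'localhost:9092',"
                        + " 'kafka.topic' = 't_log' )");

        // With auto.create.topics.enable = false on the broker, this INSERT
        // fails because the topic is never created explicitly beforehand.
        tEnv.executeSql("INSERT INTO t VALUES (1, 'a')").await();
    }
}
```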
What doesn't meet your expectations?
The task manager logs show:
The INSERT job on the task manager fails with:
Apparently, the Kafka topic is only created when the first record is written to it, although I found code that creates a Kafka topic explicitly on table creation: https://github.com/apache/flink-table-store/blob/f201b507fef88501c4beb4c62807bef818e31be5/flink-table-store-kafka/src/main/java/org/apache/flink/table/store/kafka/KafkaLogStoreFactory.java#L123
Topic creation should not rely on enabling auto topic creation in Kafka, because users might opt to disable auto topic creation to prevent unexpected costs when a fully-managed Kafka service is used. For example, see the Confluent Cloud documentation: https://docs.confluent.io/cloud/current/clusters/broker-config.html#enable-automatic-topic-creation
IMO, when a table is created, the corresponding Kafka topic should be explicitly created.
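As a minimal sketch of what explicit creation could look like, the Kafka AdminClient can create the topic eagerly; the helper name createTopicIfNotExists and its parameters below are hypothetical and not taken from KafkaLogStoreFactory. Presumably the partition count should match the table's bucket count, which is also what the earlier reference to issue 815 points at.

```java
import java.util.Collections;
import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.errors.TopicExistsException;

public class ExplicitTopicCreation {

    // Hypothetical helper: create the log topic eagerly at table-creation time
    // so that INSERTs do not depend on the broker's auto.create.topics.enable.
    static void createTopicIfNotExists(
            Properties kafkaProps, String topic, int partitions, short replicationFactor)
            throws ExecutionException, InterruptedException {
        try (AdminClient admin = AdminClient.create(kafkaProps)) {
            NewTopic newTopic = new NewTopic(topic, partitions, replicationFactor);
            try {
                // Block until the broker has actually created the topic.
                admin.createTopics(Collections.singleton(newTopic)).all().get();
            } catch (ExecutionException e) {
                // Another client may have created it concurrently; that is fine.
                if (!(e.getCause() instanceof TopicExistsException)) {
                    throw e;
                }
            }
        }
    }
}
```

Swallowing TopicExistsException keeps the call idempotent if several clients create the same table at roughly the same time.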
Anything else?
No response
Are you willing to submit a PR?