apache / kyuubi

Apache Kyuubi is a distributed and multi-tenant gateway to provide serverless SQL on data warehouses and lakehouses.
https://kyuubi.apache.org/
Apache License 2.0

[Umbrella] Flink Engine Improvement and Quality Assurance #2100

Open yaooqinn opened 2 years ago

yaooqinn commented 2 years ago

Describe the proposal

We introduced the Flink engine in https://github.com/apache/incubator-kyuubi/issues/1322.

In this ticket, we collect feedback, improvements, and bugfixes, aiming to make it production-ready.

Task list

Bugs

Improvement

Documentations

Brainstorming

Misc


SteNicholas commented 2 years ago

@yaooqinn, the module label should be flink, not hive.

yaooqinn commented 2 years ago

> @yaooqinn, the module label should be flink, not hive.

oops..

link3280 commented 2 years ago

@yaooqinn shall we make this a KPIP and let the corresponding issues follow the naming pattern like [SUBTASK][KPIP-X]?

yaooqinn commented 2 years ago

I am not sure we can propose a KPIP given the status of this ticket, which does not seem to meet the requirements for a KPIP.

In fact, we shall not create subtasks for KPIP-2 as it has been resolved. [SUBTASK][#2100] may be enough?

link3280 commented 2 years ago

@yaooqinn LGTM

pan3793 commented 1 year ago

Postponed to 1.8, because this feature is not under rapid development and is not expected to be completed in the short term.

waywtdcc commented 1 year ago

Can the JDBC interface support fetching results from asynchronous real-time (streaming) tasks? Can this be done? @pan3793

pan3793 commented 1 year ago

@waywtdcc technically, I don't think there is any blocker in the Kyuubi framework: the JDBC driver retrieves results from the Kyuubi Server in mini-batches, and we do a similar thing in Spark, called incremental collection.

So it should work if the Flink engine can return the streaming data as an Iterator.

cc the Flink experts @SteNicholas @link3280 @yanghua
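The mini-batch retrieval described above can be sketched roughly as follows. This is an illustrative outline, not Kyuubi code: `drainBatch` and the surrounding class are hypothetical names, and the real driver negotiates fetch size over Thrift rather than draining a local iterator.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Hypothetical sketch of mini-batch result retrieval: instead of collecting
// the whole (possibly unbounded) result set, the client drains at most
// `fetchSize` rows per round trip, so a streaming source can be consumed
// incrementally.
public class MiniBatchFetch {

    // Drain up to fetchSize rows from a (possibly unbounded) iterator.
    static <T> List<T> drainBatch(Iterator<T> rows, int fetchSize) {
        List<T> batch = new ArrayList<>(fetchSize);
        while (batch.size() < fetchSize && rows.hasNext()) {
            batch.add(rows.next());
        }
        return batch;
    }

    public static void main(String[] args) {
        Iterator<Integer> rows = java.util.stream.IntStream.range(0, 10).iterator();
        List<Integer> first = drainBatch(rows, 4);  // rows 0..3
        List<Integer> second = drainBatch(rows, 4); // rows 4..7
        System.out.println(first + " then " + second);
    }
}
```

As long as the engine exposes results through an Iterator, the same draining loop works whether the underlying query is batch or streaming; the batch size only bounds how much is materialized per fetch.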

pan3793 commented 1 year ago

@waywtdcc are you using Flink 1.14? Actually, the Kyuubi community is going to add support for Flink 1.17 and drop support for Flink 1.14, because of the lack of developer resources.

It would be great if you could share more about your use cases / challenges / expectations for the Kyuubi Flink engine :)

waywtdcc commented 1 year ago

> @waywtdcc are you using Flink 1.14? Actually, the Kyuubi community is going to add support for Flink 1.17 and drop support for Flink 1.14, because of the lack of developer resources.
>
> It would be great if you could share more about your use cases / challenges / expectations for the Kyuubi Flink engine :)

We use Flink 1.14 for data synchronization and real-time computing.

waywtdcc commented 1 year ago

> @waywtdcc technically, I don't think there is any blocker in the Kyuubi framework: the JDBC driver retrieves results from the Kyuubi Server in mini-batches, and we do a similar thing in Spark, called incremental collection.
>
> So it should work if the Flink engine can return the streaming data as an Iterator.
>
> cc the Flink experts @SteNicholas @link3280 @yanghua

OK, I see. But what if I need to get the list of historical checkpoints, or stop a job after taking a savepoint?

pan3793 commented 1 year ago

All you need to do is construct a proper FetchIterator on the Flink engine side.
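The idea can be sketched like this. Note this is an illustrative Java outline of the concept only: Kyuubi's actual FetchIterator lives in Scala on the engine side, and the class and method names below (`SimpleFetchIterator`, `fetchNext`) are assumptions for the sketch, not its real API.

```java
import java.util.Iterator;

// Illustrative sketch of a fetch iterator over a (possibly streaming) result:
// it wraps the engine's row iterator and tracks where the current fetch batch
// started, so the server can report positions and serve batches in order.
public class SimpleFetchIterator<T> implements Iterator<T> {
    private final Iterator<T> delegate; // e.g. a streaming result iterator
    private long fetchStart = 0;        // offset where the current batch began
    private long position = 0;          // offset of the next row to return

    public SimpleFetchIterator(Iterator<T> delegate) {
        this.delegate = delegate;
    }

    // Mark the current position as the start of the next fetch batch.
    public void fetchNext() {
        fetchStart = position;
    }

    public long getFetchStart() { return fetchStart; }

    public long getPosition() { return position; }

    @Override
    public boolean hasNext() {
        return delegate.hasNext();
    }

    @Override
    public T next() {
        position++;
        return delegate.next();
    }
}
```

Because the wrapper never materializes the whole result, an unbounded streaming query can be served batch by batch: each JDBC fetch call advances the position, and `hasNext()` simply blocks or returns according to the underlying stream.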

link3280 commented 1 year ago

> @waywtdcc technically, I don't think there is any blocker in Kyuubi framework, the JDBC driver retrieves result from Kyuubi Server in mini-batch, and we do similar thing in Spark which called incremental collection. So it could be true if the Flink engine can return the streaming data in an Iterator. cc the Flink experts @SteNicholas @link3280 @yanghua
>
> Ok, I see. So what if I need to get the historical checkpoint list and stop after executing the savepoint operation?

@waywtdcc There are ongoing efforts in Flink to improve savepoint management via SQL (see FLIP-222 for details). Kyuubi will support these statements once they are available.

waywtdcc commented 1 year ago

After adding a jar package, how can we execute a particular method from that jar?

waywtdcc commented 1 year ago

> All you need to do is construct a proper FetchIterator on the Flink engine side.

Yes, and we also need to get the result data in a streaming manner.