vesoft-inc / nebula-flink-connector

Flink Connector for Nebula Graph

add support to flink 1.14 #52

Closed: JimWen closed this 2 years ago

JimWen commented 2 years ago

This adds support for Flink 1.14 while staying compatible with Flink versions before 1.14. If you want to use Flink before 1.14, build with the Maven profile flink-before-1.14.
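For reference, the two builds might be invoked like this (a sketch assuming a standard Maven setup; the flink-before-1.14 profile name comes from this PR):

```shell
# Default build: targets Flink 1.14
mvn clean package -DskipTests

# Build for Flink versions before 1.14: activates the profile
# that pulls in the flink-table-planner-blink module
mvn clean package -DskipTests -Pflink-before-1.14
```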

Nicole00 commented 2 years ago

If we just change Flink's version, can the nebula-flink-connector snapshot still support Flink 1.11?

JimWen commented 2 years ago


Before Flink 1.14, if you want to run Table API & SQL programs locally within your IDE, you should build with the Maven profile flink-before-1.14, which includes the flink-table-planner-blink module.

Otherwise, just changing the Flink version is enough.

Nicole00 commented 2 years ago


The snapshot version still only supports Flink 1.11. This PR is very useful for users who want to compile and package the connector from source code. Or maybe we can add a workflow to compile two nebula-flink-connector artifacts with different Flink versions; refer to exchange, which is compatible with different Spark and Scala versions.
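Such a matrix build could be sketched like this (purely illustrative: the workflow file, job name, and version matrix are assumptions, not an existing workflow in this repo; the flink-before-1.14 profile is the one added by this PR):

```yaml
# .github/workflows/build.yml (hypothetical)
name: build
on: [push, pull_request]
jobs:
  package:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        include:
          - flink: "1.14"
            profile: ""                     # default build targets Flink 1.14
          - flink: "1.11"
            profile: "-Pflink-before-1.14"  # profile for pre-1.14 Flink
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-java@v2
        with:
          distribution: temurin
          java-version: 8
      - run: mvn clean package -DskipTests ${{ matrix.profile }}
```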

JimWen commented 2 years ago


Providing differently compiled artifacts is common practice in the Flink community. Different Flink versions may have interface and implementation changes, and compatibility is not guaranteed.

Nicole00 commented 2 years ago


Yeah, different Flink versions have incompatible interfaces; that's why we need to provide a separate directory to support other Flink versions at the code level. And for a Flink connector, if we support different Flink versions, maybe we should publish a usable Maven dependency for both Flink 1.14 and the other versions, not just a compiled artifact.
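If per-Flink-line artifacts were published, users could then declare the matching dependency directly; the artifactId suffix and version property below are purely illustrative (no such coordinates exist yet):

```xml
<!-- hypothetical coordinates: one artifact per supported Flink line -->
<dependency>
    <groupId>com.vesoft</groupId>
    <artifactId>nebula-flink-connector_1.14</artifactId>
    <!-- placeholder property; substitute the matching snapshot version -->
    <version>${nebula-flink.version}</version>
</dependency>
```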