getindata / flink-http-connector

HTTP Connector for Apache Flink. Provides sources and sinks for the DataStream, Table, and SQL APIs.
Apache License 2.0

Is there support for Flink 1.16? #37

Closed agostop closed 1 year ago

agostop commented 1 year ago

I am using Flink 1.16 with the example from the README, but I could not receive any response from the REST API.

The API server I used is the one from the unit test `src/test/java/com/getindata/connectors/http/app/HttpStubApp.java`.

```sql
CREATE TABLE Customers (
  id STRING,
  id2 STRING,
  msg STRING,
  uuid STRING,
  details ROW<
    isActive BOOLEAN,
    nestedDetails ROW<
      balance STRING
    >
  >
) WITH (
  'connector' = 'rest-lookup',
  'format' = 'json',
  'url' = 'http://192.168.0.100:8080/client',
  'asyncPolling' = 'true'
);

CREATE TABLE Orders (
  id STRING,
  id2 STRING,
  proc_time AS PROCTIME()
) WITH (
  'connector' = 'datagen',
  'rows-per-second' = '1',
  'fields.id.kind' = 'sequence',
  'fields.id.start' = '1',
  'fields.id.end' = '120',
  'fields.id2.kind' = 'sequence',
  'fields.id2.start' = '2',
  'fields.id2.end' = '120'
);

SELECT o.id, o.id2, c.msg, c.uuid, c.isActive, c.balance
FROM Orders AS o
JOIN Customers FOR SYSTEM_TIME AS OF o.proc_time AS c
  ON o.id = c.id AND o.id2 = c.id2;
```
[Screenshot attached: Screen Shot 2022-11-27 at 11 46 24]
kristoffSC commented 1 year ago

Hi @agostop Thanks for reporting this. Yeah, it seems there were some changes around table support in Flink 1.16 that are not backward compatible with 1.15 ;/

I just ran our tests after switching from 1.15 to 1.16, and some of them are failing ;/

I will have to take a closer look, and eventually we will need to decide whether to stick with 1.15 (we have one client that is using 1.15 with our connector), move to 1.16, or support multiple versions without code duplication...

For now, we support Flink 1.15 only.

gauravmiglanid11 commented 1 year ago

Hi Team, any update on Flink 1.16 support?

kristoffSC commented 1 year ago

Hi @gauravmiglanid11 currently there is no update for this issue. I'm swamped with other things, unfortunately.

But maybe you, or anyone from the community, would like to take a look at this issue. I'm talking about the problem with the failing tests when bumping Flink to version 1.16. It would be great just to pinpoint the root cause, without starting on multi-version support yet.

kristoffSC commented 1 year ago

Ok, I think I know what the problem could be here. I will work on it shortly.

I think it should be possible to use the connector with both Flink 1.15 and 1.16.

kristoffSC commented 1 year ago

@agostop @gauravmiglanid11 The PR is ready for review. After merging this one, I will release a new version.

kristoffSC commented 1 year ago

Hi @agostop @gauravmiglanid11

Connector version 0.9.0 has been released, and it has support for Flink 1.16.

However, it seems that due to some refactoring in the Flink SQL Client [1], the demo application is still not working. It seems that class loader management has changed, and we need to adjust our code to it.

This, however, seems to affect only the SQL Client; you should be able to use the Table/SQL API with the connector without issues.
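For anyone blocked by the SQL Client problem, here is a minimal sketch of the Table API route mentioned above. It assumes the `flink-table` dependencies and the connector jar are on the job classpath; the endpoint URL, the simplified `Customers` schema, and the class name are placeholders, not part of the connector's documented setup:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class RestLookupExample {
    public static void main(String[] args) {
        // Plain Table API environment in streaming mode; the SQL Client
        // (and its class-loader refactoring) is not involved at all.
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.inStreamingMode());

        // HTTP-backed lookup table (schema trimmed down for the sketch;
        // the URL is a placeholder for your own endpoint or the test stub).
        tEnv.executeSql(
            "CREATE TABLE Customers ("
            + "  id STRING, id2 STRING, msg STRING, uuid STRING"
            + ") WITH ("
            + "  'connector' = 'rest-lookup',"
            + "  'format' = 'json',"
            + "  'url' = 'http://localhost:8080/client',"
            + "  'asyncPolling' = 'true')");

        // Datagen-driven probe side, same options as in the reproduction.
        tEnv.executeSql(
            "CREATE TABLE Orders ("
            + "  id STRING, id2 STRING, proc_time AS PROCTIME()"
            + ") WITH ("
            + "  'connector' = 'datagen',"
            + "  'rows-per-second' = '1',"
            + "  'fields.id.kind' = 'sequence',"
            + "  'fields.id.start' = '1', 'fields.id.end' = '120',"
            + "  'fields.id2.kind' = 'sequence',"
            + "  'fields.id2.start' = '2', 'fields.id2.end' = '120')");

        // Lookup join against the HTTP-backed table; print() streams
        // the join results to stdout.
        tEnv.executeSql(
            "SELECT o.id, o.id2, c.msg, c.uuid "
            + "FROM Orders AS o "
            + "JOIN Customers FOR SYSTEM_TIME AS OF o.proc_time AS c "
            + "  ON o.id = c.id AND o.id2 = c.id2").print();
    }
}
```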

[1] https://issues.apache.org/jira/browse/FLINK-15635

agostop commented 1 year ago

Good news! Thanks @kristoffSC