Closed: Sugariscool closed this issue 1 year ago.
This is a bug in the Iceberg Flink integration; refer to https://issues.apache.org/jira/browse/FLINK-21247. The community's solution is to fix it in Flink, so it's recommended to bump Flink to 1.14.5 or 1.15.
I also want to know: how can I cast a `MAP<string,string>` to a `MAP<int,double>` in SQL, or using Java/Scala?
For this specific case, you can cast the data with a user-defined function. You can also send an email to the Flink user mailing list (user@flink.apache.org) for more help.
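A minimal sketch of the conversion such a user-defined function could perform, assuming every key parses as an int and every value as a double. The class and method names below are illustrative, not from Flink or Arctic; in a real job this logic would live in the `eval` method of a class extending Flink's `org.apache.flink.table.functions.ScalarFunction`, registered with `CREATE TEMPORARY FUNCTION`:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative conversion logic for a map-casting UDF.
// Assumes every key parses as an int and every value as a double;
// entries that fail to parse are skipped (adjust to your needs).
public class MapCastExample {

    public static Map<Integer, Double> castMap(Map<String, String> in) {
        Map<Integer, Double> out = new HashMap<>();
        if (in == null) {
            return out;
        }
        for (Map.Entry<String, String> e : in.entrySet()) {
            try {
                out.put(Integer.parseInt(e.getKey()),
                        Double.parseDouble(e.getValue()));
            } catch (NumberFormatException ignored) {
                // skip entries that are not numeric
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> in = new HashMap<>();
        in.put("1", "1.0");
        in.put("2", "-6");
        System.out.println(castMap(in)); // e.g. {1=1.0, 2=-6.0}
    }
}
```

Whether to skip, null out, or fail on unparseable entries depends on your data; a stricter variant could rethrow instead of ignoring the `NumberFormatException`.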
What happened?
I want to transfer Kafka's data to Arctic. The structure of the data is similar to `{"a":"123456","b":"123","t":"1667805600","d":{"d0":"1.0","d1":"-6","d2":"0",...,"dn":"197"}}`, which contains some strings and one map. I use Kafka as the source and want the data stored in Arctic. The sink table is similar to the source, as follows:
```sql
create table IF NOT EXISTS Arctic.db.test (
  p string,
  e string,
  t string,
  d MAP<string, string>,
  primary key (t) not enforced
) with (
  'key' = 'value'
);
```
and execute `INSERT INTO Arctic.db.test SELECT p, e, t, d FROM kafka_source;`.
When I select data from table test, e.g. `select * from Arctic.db.test;`, I get the following error:
`com.netease.arctic.flink.read.AdaptHiveFlinkParquetReaders$ReusableMapData cannot be cast to org.apache.flink.table.data.binary.BinaryMapData`
Affects Versions
arctic-0.3.2
What engines are you seeing the problem on?
No response
How to reproduce
data structure: `{"p":"123456","e":"123","t":"1667805600","d":{"d0":"1.0","d1":"-6","d2":"0",...,"dn":"197"}}`
```sql
-- create kafka source
create TABLE kafka_source (
  p string,
  e string,
  t string,
  d MAP<string, string>
) WITH (
  'connector' = 'kafka',
  'topic' = 'test',
  'properties.bootstrap.servers' = 'localhost:9092',
  'properties.group.id' = 'lakehouse-consumer-sql',
  'scan.startup.mode' = 'latest-offset',
  'format' = 'json',
  'json.fail-on-missing-field' = 'true',
  'json.ignore-parse-errors' = 'false'
);

-- open the table dynamic conf HINT
SET table.dynamic-table-options.enabled=true;

-- have a look at the source
select p, e, t, d from kafka_source;

-- create catalog
create CATALOG Arctic WITH (
  'type' = 'arctic',
  'metastore.url' = 'thrift://localhost:1260/arctic_catalog'
);

-- create sink
create DATABASE IF NOT EXISTS Arctic.db;
create table IF NOT EXISTS Arctic.db.test (
  p string,
  e string,
  t string,
  d MAP<string, string>,
  primary key (t) not enforced
) with (
  'key' = 'value'
);

-- transfer data from source to sink
INSERT INTO Arctic.db.test SELECT p, e, t, d FROM kafka_source;

-- have a look at the sink
select * from Arctic.db.test;
```
Relevant log output
Anything else
No response
Code of Conduct