apache / seatunnel

SeaTunnel is a next-generation super high-performance, distributed, massive data integration tool.
https://seatunnel.apache.org/
Apache License 2.0

[Bug] [Flink] [Streaming] void type not supported #4345

Open laglangyue opened 1 year ago

laglangyue commented 1 year ago

Search before asking

What happened

When I run the Flink example with the following config, it throws an exception:

    schema = {
      fields {
        c_map = "map<string, string>"
        c_array = "array<int>"
        c_string = string
        c_boolean = boolean
        c_tinyint = tinyint
        c_smallint = smallint
        c_int = int
        c_bigint = bigint
        c_float = float
        c_double = double
        c_decimal = "decimal(30, 8)"
        c_null = "null"
        c_bytes = bytes
        c_date = date
        c_timestamp = timestamp
      }
    }

SeaTunnel Version

dev

SeaTunnel Config

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
######
###### This config file is a demonstration of streaming processing in seatunnel config
######

env {
  # You can set flink configuration here
  execution.parallelism = 2
  job.mode = "STREAMING"
  #execution.checkpoint.interval = 10000
  #execution.checkpoint.data-uri = "hdfs://localhost:9000/checkpoint"
}

source {
  # This is a example source plugin **only for test and demonstrate the feature source plugin**
  FakeSource {
    parallelism = 2
    result_table_name = "fake"
    row.num = 16
    schema = {
      fields {
        c_map = "map<string, string>"
        c_array = "array<int>"
        c_string = string
        c_boolean = boolean
        c_tinyint = tinyint
        c_smallint = smallint
        c_int = int
        c_bigint = bigint
        c_float = float
        c_double = double
        c_decimal = "decimal(30, 8)"
        c_null = "null"
        c_bytes = bytes
        c_date = date
        c_timestamp = timestamp
      }
    }
  }

  # If you would like to get more information about how to configure seatunnel and see full list of source plugins,
  # please go to https://seatunnel.apache.org/docs/category/source-v2
}

transform {
  # split data by specific delimiter

  # you can also use other transform plugins, such as sql
  sql {
    source_table_name = "fake"
    query = "select c_map,c_array,c_string,c_boolean,c_tinyint,c_smallint,c_int,c_bigint,c_float,c_double,c_null,c_bytes,c_date,c_timestamp from fake"
    result_table_name = "sql"
  }

  # If you would like to get more information about how to configure seatunnel and see full list of transform plugins,
  # please go to https://seatunnel.apache.org/docs/category/transform
}

sink {
  Console {
    parallelism = 3
  }

  # If you would like to get more information about how to configure seatunnel and see full list of sink plugins,
  # please go to https://seatunnel.apache.org/docs/category/sink-v2
}

Running Command

nothing

Error Exception

Caused by: org.apache.flink.table.api.ValidationException: Column types of query result and sink for unregistered table do not match.
Cause: Incompatible types for sink column 'c_null' at position 11.

Query schema: [c_map: MAP<STRING, STRING>, c_array: ARRAY<INT>, c_string: STRING, c_boolean: BOOLEAN, c_tinyint: TINYINT, c_smallint: SMALLINT, c_int: INT, c_bigint: BIGINT, c_float: FLOAT, c_double: DOUBLE, c_decimal: DECIMAL(38, 18), c_null: RAW('java.lang.Void', '...'), c_bytes: BYTES, c_date: DATE, c_timestamp: TIMESTAMP(9)]
Sink schema:  [c_map: MAP<STRING, STRING>, c_array: ARRAY<INT>, c_string: STRING, c_boolean: BOOLEAN, c_tinyint: TINYINT, c_smallint: SMALLINT, c_int: INT, c_bigint: BIGINT, c_float: FLOAT, c_double: DOUBLE, c_decimal: DECIMAL(38, 18), c_null: RAW('java.lang.Void', ?), c_bytes: BYTES, c_date: DATE, c_timestamp: TIMESTAMP(3)]
    at org.apache.flink.table.planner.connectors.DynamicSinkUtils.createSchemaMismatchException(DynamicSinkUtils.java:432)
    at org.apache.flink.table.planner.connectors.DynamicSinkUtils.validateSchemaAndApplyImplicitCast(DynamicSinkUtils.java:255)
    at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:255)
    at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:162)
    at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:162)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
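
The two RAW entries for c_null above are both built over java.lang.Void, but they come from different conversion paths, and Flink's planner only accepts an implicit cast between RAW types that are exactly equal. The following minimal sketch of that validation rule uses plain flink-table-common APIs; the two RAW types it constructs are stand-ins for illustration, not the exact objects SeaTunnel builds internally.

    import org.apache.flink.api.common.ExecutionConfig;
    import org.apache.flink.api.common.typeutils.base.VoidSerializer;
    import org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer;
    import org.apache.flink.table.api.DataTypes;
    import org.apache.flink.table.types.DataType;
    import org.apache.flink.table.types.logical.utils.LogicalTypeCasts;

    public class NullColumnCastCheck {
        public static void main(String[] args) {
            // Two RAW types over java.lang.Void created with different serializers,
            // standing in for the query-side and sink-side types from the trace.
            DataType queryVoid = DataTypes.RAW(Void.class,
                    new KryoSerializer<>(Void.class, new ExecutionConfig()));
            DataType sinkVoid = DataTypes.RAW(Void.class, VoidSerializer.INSTANCE);

            // RAW types that are not exactly equal cannot be implicitly cast into
            // each other, which is why the planner rejects column 'c_null'.
            System.out.println(LogicalTypeCasts.supportsImplicitCast(
                    queryVoid.getLogicalType(), sinkVoid.getLogicalType())); // expected: false

            // A NULL-typed column, by contrast, is implicitly castable to any type,
            // so a schema that maps "null" to DataTypes.NULL() would pass this check.
            System.out.println(LogicalTypeCasts.supportsImplicitCast(
                    DataTypes.NULL().getLogicalType(), sinkVoid.getLogicalType())); // expected: true
        }
    }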

Flink or Spark Version

example

Java or Scala Version

No response

Screenshots

No response

Are you willing to submit PR?

Code of Conduct

laglangyue commented 1 year ago

I am not familiar with Flink. Can anyone fix it?

CodingGPT commented 1 year ago

I can try this

CodingGPT commented 1 year ago

Why use the void type? What type should the void type be transformed to?

CypressXiao commented 1 year ago

Did you fix it? I met the same problem and would like to ask for help.

laglangyue commented 1 year ago

Did you fix it? I met the same problem and would like to ask for help.

What happened? I am not good with Flink, so are you willing to do this?

CheneyYin commented 9 months ago

I also encountered this problem, and I don't know much about Flink either.
This looks like an internal defect in Flink: the conversions between Flink's TypeInformation and Flink's DataType cause it. Some of the related conversions do not implement complete type handling and consistency, and they are currently marked as deprecated.
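
Assuming the deprecated legacy converter in flink-table-common is the one involved, a quick probe with the Void TypeInformation illustrates the gap: there is no dedicated rule for Void, so the converter falls back to a RAW-style type rather than the NULL type, which would match the RAW('java.lang.Void', ...) entries in the stack trace. This is a sketch for illustration, not a confirmed root-cause analysis.

    import org.apache.flink.api.common.typeinfo.Types;
    import org.apache.flink.table.api.DataTypes;
    import org.apache.flink.table.types.utils.TypeConversions;

    public class VoidConversionProbe {
        public static void main(String[] args) {
            // Deprecated legacy conversion from TypeInformation to DataType; for
            // Types.VOID it likely prints a RAW-style type such as RAW('java.lang.Void', ?).
            System.out.println(TypeConversions.fromLegacyInfoToDataType(Types.VOID));

            // The natural counterpart of a "null" column in the new type system is
            // the NULL type, which Flink can implicitly cast to any sink type.
            System.out.println(DataTypes.NULL());
        }
    }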

CheneyYin commented 9 months ago

@Carl-Zhou-CN PTAL

yangsir666 commented 2 months ago

So how do we solve this??

Carl-Zhou-CN commented 2 months ago

@Carl-Zhou-CN PTAL

Sorry, I missed that. @CheneyYin

Carl-Zhou-CN commented 2 months ago

So how do we solve this??

Hi @yangsir666, what version of SeaTunnel are you using?

CheneyYin commented 2 months ago

@Carl-Zhou-CN PTAL

Sorry, I missed that. @CheneyYin

#6277 may be helpful.