StarRocks / starrocks

The world's fastest open query engine for sub-second analytics both on and off the data lakehouse. With the flexibility to support nearly any scenario, StarRocks provides best-in-class performance for multi-dimensional analytics, real-time analytics, and ad-hoc queries. A Linux Foundation project.
https://starrocks.io
Apache License 2.0

Spark Load fails when using a Hudi table as the source table #45317

Closed: blanklin030 closed this issue 1 day ago

blanklin030 commented 6 months ago

Steps to reproduce the behavior (Required)

LOAD LABEL db.linyu_20240424_hudi2sr (
DATA FROM TABLE hudi_table
INTO TABLE linyu001_hudi_2_sr_test
TEMPORARY PARTITION(temp__p20210102)
SET (
    `_hoodie_commit_time` = `_hoodie_commit_time`,
    `_hoodie_commit_seqno` = `_hoodie_commit_seqno`,
    `_hoodie_record_key` = `_hoodie_record_key`,
    `_hoodie_partition_path` = `_hoodie_partition_path`,
    `_hoodie_file_name` = `_hoodie_file_name`,
    `dt` = `dt`,
    `id` = `id`,
    `name` = `name`,
    `age` = `age`,
    `ts` = `ts`
)
WHERE (`dt` = '20210102')
) WITH RESOURCE 'xxxxx' (
  "spark.yarn.tags" = "h2s_foit_linyu20240422abc001",
  "spark.dynamicAllocation.enabled" = "true",
  "spark.executor.memory" = "3g",
  "spark.executor.memoryOverhead" = "2g",
  "spark.streaming.batchDuration" = "5",
  "spark.executor.cores" = "1",
  "spark.yarn.executor.memoryOverhead" = "4g",
  "spark.speculation" = "false",
  "spark.dynamicAllocation.minExecutors" = "10",
  "spark.dynamicAllocation.maxExecutors" = "100"
) PROPERTIES (
  "timeout" = "72000",
  "spark_load_submit_timeout" = "36000"
)
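
For context, the resource name after WITH RESOURCE is redacted as 'xxxxx'; Spark Load expects that name to refer to a Spark resource created beforehand with CREATE EXTERNAL RESOURCE. A minimal sketch of such a resource follows; the resource name, cluster addresses, working directory, and broker name are placeholders for illustration, not values taken from this report.

CREATE EXTERNAL RESOURCE "spark_xxxxx"
PROPERTIES (
    -- hypothetical values for illustration only
    "type" = "spark",
    "spark.master" = "yarn",
    "spark.submit.deployMode" = "cluster",
    "spark.hadoop.yarn.resourcemanager.address" = "rm-host:8032",
    "spark.hadoop.fs.defaultFS" = "hdfs://namenode-host:9000",
    "working_dir" = "hdfs://namenode-host:9000/tmp/starrocks",
    "broker" = "broker0"
);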

Expected behavior (Required)

The load job should succeed.

Real behavior (Required)

Unexpected exception: Source table hudi_table is not HiveTable
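
The message suggests that Spark Load's DATA FROM TABLE path only accepts a Hive external table as the source and rejects other external table types such as Hudi. For comparison, a source table that would pass this check is a Hive external table along the lines of the sketch below; the resource, database, table, and column definitions are assumptions for illustration, not objects from this report.

CREATE EXTERNAL TABLE hive_src_table (
    dt   VARCHAR(16),
    id   BIGINT,
    name VARCHAR(64),
    age  INT,
    ts   BIGINT
)
ENGINE = hive
PROPERTIES (
    -- hypothetical Hive resource and source table; the Hudi table used in
    -- this report cannot be declared this way, which is what the check rejects
    "resource" = "hive0",
    "database" = "hive_db",
    "table" = "hive_src"
);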

StarRocks version (Required)

2.5.12

github-actions[bot] commented 2 weeks ago

We have marked this issue as stale because it has been inactive for 6 months. If this issue is still relevant, removing the stale label or adding a comment will keep it active. Otherwise, we'll close it in 10 days to keep the issue queue tidy. Thank you for your contribution to StarRocks!