Open wxd5146 opened 9 months ago
Hello @wxd5146, Thanks for finding the time to report the issue! We really appreciate the community's efforts to improve Apache Kyuubi.
the failed task log:
It's a known issue in Spark; Kyuubi does not change anything here.
When I use the Spark Thrift Server, this issue does not occur.
I am using Spark 3.3.2 with Kyuubi 1.7.3 or 1.8.0.
Have you compared the driver logs for Kyuubi and STS? Both of them call spark.sql(xxx)
to run queries, with no additional actions.
Code of Conduct
Search before asking
Describe the bug
Pre-Description: 1. Run SQL on the Spark engine. 2. The Spark engine writes to and reads from Hive. 3. The Hive tables are created as ORC or Parquet.
When I execute SQL like "insert overwrite table tpch.ods_tpch_lineitem_d_1 select distinct * from tpch.ods_tpch_lineitem_d;" and check the HDFS path, I find that the Kyuubi Spark engine deletes the table's HDFS path first and only recreates it in the last DAG stage.
The problem: if the SQL fails and never reaches the last stage, the Hive table's HDFS path is lost. When I then close the Spark session, open a new session, and run "insert overwrite table tpch.ods_tpch_lineitem_d_1 select distinct * from tpch.ods_tpch_lineitem_d;" again, it cannot find the HDFS path and the task fails.
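The failure mode described above can be illustrated with a minimal local-filesystem sketch (this is not Spark's actual overwrite code path; the `overwrite_table` function and the paths are hypothetical, assuming the delete-then-recreate sequence reported here):

```python
import shutil
import tempfile
from pathlib import Path

def overwrite_table(path: Path, rows, fail_before_write: bool = False):
    """Simulate the problematic INSERT OVERWRITE flow: the output
    directory is deleted up front, then only recreated in the final
    stage. If the job fails in between, the directory is gone."""
    if path.exists():
        shutil.rmtree(path)          # step 1: old data and the directory itself are removed
    if fail_before_write:
        raise RuntimeError("query failed before the last stage")
    path.mkdir(parents=True)         # step 2: the final stage recreates the path
    (path / "part-00000").write_text("\n".join(rows))

base = Path(tempfile.mkdtemp())
table = base / "ods_tpch_lineitem_d_1"

overwrite_table(table, ["row1", "row2"])  # first run succeeds; directory exists
assert table.exists()

try:
    overwrite_table(table, ["row3"], fail_before_write=True)
except RuntimeError:
    pass
assert not table.exists()  # the table directory is lost after the failed overwrite
```

A safer pattern, which engines commonly use, is to write into a staging directory and swap it into place only after the job succeeds, so a mid-query failure leaves the original data untouched.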
Affects Version(s)
1.7.3/1.8.0
Kyuubi Server Log Output
No response
Kyuubi Engine Log Output
No response
Kyuubi Server Configurations
No response
Kyuubi Engine Configurations
No response
Additional context
No response
Are you willing to submit PR?