apache / linkis

Apache Linkis builds a computation middleware layer to facilitate connection, governance and orchestration between the upper applications and the underlying data engines.
https://linkis.apache.org/
Apache License 2.0

[Question] Spark task submitted to YARN stays in the Running state #4943

Closed lishengming6 closed 11 months ago

lishengming6 commented 11 months ago

Before asking

Your environment

Describe your questions

The view component executes SQL and the Spark engine submits the task to YARN, but the YARN application stays in the RUNNING state even though the result data is already returned to the front end. Question: is the Spark engine's YARN application resident? When I kill the engine, the YARN application shows as FINISHED. Is there any way to make the YARN application transition to FINISHED once the submitted Spark task has finished executing?


Eureka service list


Some logs info or attach file

linkis-xxx.log:


<!-- paste the log text here -->

log file:

linkis-xxx.log

github-actions[bot] commented 11 months ago

:blush: Welcome to the Apache Linkis community!!

We are glad that you are contributing by opening this issue.

Please make sure to include all the relevant context; we will respond shortly.

If you are interested in contributing to our website project, please let us know! You can check out our contributing guide on :point_right: How to Participate in Project Contribution.

Community

WeChat Assistant · WeChat Public Account (QR codes omitted)

Mailing Lists

| Name | Description | Subscribe | Unsubscribe | Archive |
| ---- | ----------- | --------- | ----------- | ------- |
| dev@linkis.apache.org | community activity information | subscribe | unsubscribe | archive |
casionone commented 11 months ago

Check whether the YARN service is OK. You can run the spark-sql command on the machine to see whether a task submitted to YARN completes normally.

peacewong commented 11 months ago

Hello, this is because the Spark engine is reused: the engine stays resident on YARN and Spark is only released once it has been idle. You can add the executeOnce label so that the Spark session exits after executing the task.
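For what it's worth, here is a minimal sketch of what such a submission could look like against the Linkis 1.x entrance REST API. The gateway address, engine version string, and userCreator value are placeholders, and the endpoint path and label keys are assumptions based on Linkis 1.x conventions, so check them against the docs for your deployment:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ExecuteOnceSubmit {
    public static void main(String[] args) throws Exception {
        // Hypothetical gateway address; replace with your own Linkis gateway.
        String url = "http://linkis-gateway:9001/api/rest_j/v1/entrance/submit";

        // The "executeOnce" label asks Linkis for a one-shot engine, so the
        // YARN application should go to FINISHED once the task completes
        // instead of staying resident for reuse.
        String body = "{"
                + "\"executionContent\": {\"code\": \"SELECT 1\", \"runType\": \"sql\"},"
                + "\"params\": {},"
                + "\"labels\": {"
                + "\"engineType\": \"spark-2.4.3\","  // assumed engine version
                + "\"userCreator\": \"hadoop-IDE\","  // assumed user-creator label
                + "\"executeOnce\": \"true\""
                + "}"
                + "}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("Content-Type", "application/json")
                // A real deployment also needs the Linkis login cookie or token header.
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```

The trade-off is that executeOnce gives up engine reuse: each task starts its own Spark application, which costs startup time but leaves one FINISHED application per task on YARN.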

lishengming6 commented 11 months ago

Is this supported in Linkis 1.3.1? And where can I add the label?