Closed lishengming6 closed 11 months ago
We are glad that you are contributing by opening this issue.
Please make sure to include all the relevant context. We will be here shortly.
If you are interested in contributing to our website project, please let us know! You can check out our contributing guide: How to Participate in Project Contribution.
| Name | Description | Subscribe | Unsubscribe | Archive |
|---|---|---|---|---|
| dev@linkis.apache.org | community activity information | subscribe | unsubscribe | archive |
Check whether the YARN service is OK. You can run the spark-sql command on the machine to see whether a task submitted to YARN runs normally.
Hello, this happens because the Spark engine is reused: the engine is only released once it goes idle. You can add the executeOnce label so that the Spark session exits after executing the task.
Is this supported in Linkis 1.3.1? And where can I add the label?
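For reference, a minimal sketch of what a one-shot submission body might look like when submitting through the Linkis REST entrance. The exact endpoint path, the `executeOnce` label key, and the engine/creator values below are assumptions drawn from this thread, not verified against Linkis 1.3.1 — check your version's documentation before use.

```python
import json

def build_submit_payload(sql, engine="spark-3.2.1", user_creator="hadoop-IDE"):
    """Build a hypothetical Linkis task-submission body that asks for a
    one-shot engine (released after this task finishes).

    All label keys/values here are assumptions based on the discussion
    above, not a confirmed Linkis 1.3.1 API.
    """
    return {
        "executionContent": {"code": sql, "runType": "sql"},
        "labels": {
            "engineType": engine,          # engine connector + version (assumed format)
            "userCreator": user_creator,   # submitting user and creator app (assumed format)
            "executeOnce": "true",         # ask Linkis to exit the engine after the task
        },
    }

# Print the JSON body that would be POSTed to the entrance endpoint.
print(json.dumps(build_submit_payload("SELECT 1"), indent=2))
```

With a label like this, the idea is that the Spark session is not kept around for reuse, so the corresponding YARN application should move to FINISHED once the task completes instead of staying in RUNNING.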
Before asking
Your environment
Describe your questions
The view component executes SQL; the Spark engine submits the task to YARN, and the YARN task stays in RUNNING even though the frontend has already returned the data. Question: is the Spark engine's YARN task long-lived (resident)? When I kill the engine, the YARN task becomes FINISHED. Is there a way to make the task Spark submits to YARN change to FINISHED once execution completes?
Eureka service list
Some log info or attached files
linkis-xxx.log: