Open maobaolong opened 3 months ago
Application information should be displayed in the Spark history server.
@jerqi Sorry that I did not explain the motivation in more detail.
If I have the sparkConf, I can get this information, for example:
usp.param=20240315170727772_20240726000000
spark.app.name=livy-session-23219-decb9ee5-a499-469a-b8ed-e23605e8efbf
spark.livy.supersql.session_id=decb9ee5-a499-469a-b8ed-e23605e8efbf
spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES=http://xxxx/xxx
Based on these, we can get more context and understand the job better; we could dig into the reason for a failure or for poor performance. Besides, we can get more useful information to help us troubleshoot issues:
- `spark.sql.warehouse.dir` to check the cluster performance.
- `java version` to check whether the Java version is expected.
- `spark.internal.preload.jars` to check whether the client jar is expected.
- `classpath` to check whether the client jar is correct.
- `user.name` to get the actual user name.
- `spark.rss.*` to check the effective config about Uniffle.
- `spark.app.attempt.id` to know the attempt time.
Describe the feature
We can let the client send the sparkConf to the coordinator to show more context information about the application.
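A minimal sketch of the client-side part of this idea: before registering with the coordinator, the client could filter the sparkConf down to the entries worth reporting. The class and method names below are illustrative only, not actual Uniffle APIs, and the key whitelist is an assumption based on the examples above.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ConfReportSketch {
    // Hypothetical helper: pick the sparkConf entries worth reporting to the
    // coordinator. The selection rules here are assumptions for illustration.
    static Map<String, String> selectReportableConf(Map<String, String> sparkConf) {
        Map<String, String> out = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : sparkConf.entrySet()) {
            String k = e.getKey();
            if (k.startsWith("spark.rss.")          // Uniffle-related settings
                    || k.equals("spark.app.name")
                    || k.equals("spark.app.attempt.id")
                    || k.equals("user.name")) {
                out.put(k, e.getValue());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new LinkedHashMap<>();
        conf.put("spark.rss.storage.type", "MEMORY_LOCALFILE");
        conf.put("spark.app.name", "demo");
        conf.put("spark.executor.memory", "4g"); // filtered out
        Map<String, String> report = selectReportableConf(conf);
        System.out.println(report.size()); // prints 2
    }
}
```

The filtered map could then be attached to the existing application-registration RPC, so the coordinator can render it on the application page without a new round trip.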
Motivation
Gain insight into which applications are running and being served by the RSS cluster.
Describe the solution
Show the conf within the application page.
Additional context
No response
Are you willing to submit PR?