ywilkof / spark-jobs-rest-client

Fluent client for interacting with Spark Standalone Mode's Rest API for submitting, killing and monitoring the state of jobs.
Apache License 2.0

How to set the resources of the Spark cluster when submitting a job? #27

Open guoyuhaoaaa opened 5 years ago

guoyuhaoaaa commented 5 years ago

When I submit a Spark job like this:

`spark-submit --class org.apache.spark.examples.SparkPi --driver-memory 1g --executor-memory 1g --executor-cores 1 --queue thequeue examples/target/scala-2.11/jars/spark-examples*.jar 10`

what is the spark-jobs-rest-client API for the `--executor-memory 1g --executor-cores 1` settings? Can you give me an example?

ywilkof commented 5 years ago

Hi, pass the following properties in the environment map that the client accepts at creation time: `spark.executor.memory` and `spark.executor.cores`.
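For anyone landing here later, a minimal sketch of the idea. The map keys are standard Spark configuration properties; the client calls shown in the comments assume an `environmentVariables(...)` builder method (as described above) and the fluent submit API from the project README, so verify the names against your version of the library:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: passing executor resource settings through the environment map
// that the client accepts at creation time.
public class SubmitWithResources {

    // Build the Spark properties equivalent to
    // "--executor-memory 1g --executor-cores 1".
    static Map<String, String> executorSettings() {
        Map<String, String> env = new HashMap<>();
        env.put("spark.executor.memory", "1g");
        env.put("spark.executor.cores", "1");
        return env;
    }

    public static void main(String[] args) {
        Map<String, String> env = executorSettings();
        System.out.println(env);

        // Hypothetical usage with the client (requires the library on the
        // classpath; method names are assumptions based on the README):
        //
        // SparkRestClient client = SparkRestClient.builder()
        //     .masterHost("localhost")
        //     .environmentVariables(env)   // assumed builder method
        //     .build();
        //
        // String submissionId = client.prepareJobSubmit()
        //     .appName("SparkPi")
        //     .appResource("file:/path/to/spark-examples.jar")
        //     .mainClass("org.apache.spark.examples.SparkPi")
        //     .submit();
    }
}
```

The same map can carry any other `spark.*` property you would normally pass with `--conf`.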

guoyuhaoaaa commented 5 years ago

Thanks for your response, I have solved that problem. Now I have a new question: how can I get the println output of the driver when running a Spark job? Is there something similar to `JobStatusResponse jobStatus = client.checkJobStatus().withSubmissionIdFullResponse(submissionId);`? Looking forward to your response.