Closed by tyyzqmf 7 months ago
Solved by adding spark.driver.bindAddress=127.0.0.1 and spark.driver.host=127.0.0.1
@tyyzqmf I see from the issue description that you already had this config set. Did you need to set it somewhere else?
I'm facing a similar issue: I have the conf set to bind the address to local, but I still get the error.
Thanks.
My mistake. I solved this problem by adding a new config:
spark.driver.host=127.0.0.1
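A minimal sketch of how these two settings could be passed on the command line. The jar name `app.jar` is a hypothetical placeholder, not from this thread; `spark.driver.bindAddress` and `spark.driver.host` are the real Spark properties discussed above.

```python
# Sketch: assembling a spark-submit invocation that pins the driver
# to the loopback address, as suggested in this thread.
confs = {
    "spark.driver.bindAddress": "127.0.0.1",  # address the driver process binds to
    "spark.driver.host": "127.0.0.1",         # address executors use to reach the driver
}

cmd = ["spark-submit"]
for key, value in confs.items():
    cmd += ["--conf", f"{key}={value}"]
cmd.append("app.jar")  # hypothetical application jar

print(" ".join(cmd))
```

The same properties can also be set in `spark-defaults.conf` or via `SparkConf` in application code; the command-line form is shown only because the thread concerns `spark-submit`.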
Thank you. Setting this Spark environment variable also works for me: SPARK_LOCAL_IP="127.0.0.1"
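A small sketch of the environment-variable approach from the comment above: `SPARK_LOCAL_IP` is a real Spark environment variable that controls which IP Spark binds to, and setting it in the launching process (before Spark starts) is one way to apply it.

```python
import os

# Export SPARK_LOCAL_IP before launching Spark so the driver binds
# to the loopback interface instead of a possibly unreachable host IP.
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"

print(os.environ["SPARK_LOCAL_IP"])
```

Equivalently, `export SPARK_LOCAL_IP="127.0.0.1"` in the shell (or in `spark-env.sh`) before running `spark-submit` has the same effect.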
We are trying to submit the jar on Lambda and modified the spark_submit function. We then get these error logs: