Closed sragrawal1 closed 5 years ago
Hey! I have no idea where you found this :) But if you found it on Docker Hub, then I'd want you to follow the link below: https://github.com/gettyimages/docker-spark
That's the master branch of the Spark Standalone Cluster. Everything works out of the box.
And the answer to your question: they are linked via the `links` key in the docker-compose file. It can also be done through `depends_on` in the compose file. All links between nodes have to be declared inside the docker-compose file.
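For illustration, here is a minimal sketch of both options side by side (the image tag and service names are just placeholders, not taken from your file):

```yaml
version: '3'
services:
  master:
    image: gettyimages/spark
  worker:
    image: gettyimages/spark
    # Option 1: legacy links — adds a network alias so "master" resolves
    # from inside the worker container.
    links:
      - master
    # Option 2: depends_on — controls startup order; on the default compose
    # network, service names resolve via DNS regardless.
    depends_on:
      - master
```

In practice `depends_on` is the more modern choice; `links` is a legacy feature, and plain service-name DNS already handles the resolution on a shared compose network.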
So now we have links between our master and worker nodes :) Glad to help!
Hope I've answered your question; if not, just say so.
Have a nice day
On Wed, Nov 8, 2017 at 6:08 PM, sragrawal1 notifications@github.com wrote:
I am running a standalone Spark cluster in containers using gettyimages/docker-spark. I have a compose file with one service for the master (1 replica) and one service for the worker (3 replicas):
```yaml
version: '3'
services:
  testShell:
    image: gettyimages/spark
    command: echo "Hello World"
  master:
    image: gettyimages/spark
    command: bin/spark-class org.apache.spark.deploy.master.Master -h master
    hostname: master
    environment:
      MASTER: spark://master:7077
      SPARK_CONF_DIR: /conf
      SPARK_PUBLIC_DNS: localhost
    expose:
      - 7001
      - 7002
      - 7003
      - 7004
      - 7005
      - 7006
      - 7077
      - 6066
    ports:
      - 4040:4040
      - 6066:6066
      - 7077:7077
      - 8080:8080
    deploy:
      placement:
        constraints:
          - node.role == manager
    volumes:
      - ./conf/master:/conf
  worker:
    image: gettyimages/spark:2.1.0-hadoop-2.7
    command: bin/spark-class org.apache.spark.deploy.worker.Worker spark://master:7077
    hostname: worker
    environment:
      SPARK_CONF_DIR: /conf
      SPARK_WORKER_CORES: 2
      SPARK_WORKER_MEMORY: 2g
      SPARK_WORKER_PORT: 8881
      SPARK_WORKER_WEBUI_PORT: 8081
      SPARK_PUBLIC_DNS: localhost
    links:
      - master
    expose:
      - 7012
      - 7013
      - 7014
      - 7015
      - 7016
      - 8881
    ports:
      - 8081:8081
    deploy:
      replicas: 3
    depends_on:
      - master
```
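For what it's worth, the attachment happens on the worker's command line, not in conf/slaves: each worker replica is started with the master URL as an argument. A sketch, assuming the service names in the compose file above (the exact log wording may vary by Spark version):

```shell
# Main process of each worker replica; the hostname "master" resolves through
# Docker's network (links / service-name DNS) to the master container.
bin/spark-class org.apache.spark.deploy.worker.Worker spark://master:7077

# If registration succeeds, the worker logs should contain a line like
# "Successfully registered with master spark://master:7077", and the
# master web UI exposed on port 8080 lists all three workers.
docker-compose logs worker
```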
My question is: how are the Spark worker containers attached to the Spark master? I don't see any entries in the containers' {SparkFolder}/conf/slaves or spark-env.sh.
Could you please help me find out?
— View it on GitHub: https://github.com/gettyimages/docker-spark/issues/33
Actually it's not an issue, but I need some info regarding its behaviour!
Here is my scenario:
I am running a standalone Spark cluster in containers using gettyimages/docker-spark. I have a compose file with one service for the master (1 replica) and one service for the worker (3 replicas), as quoted in full above.
My question is: how are the Spark worker containers attached to the Spark master? I don't see any entries in the containers' {SparkFolder}/conf/slaves or spark-env.sh.
Could you please help me find out?