randerzander / docker-hdp

Dockerized HDP Cluster

Docker Swarm support #9

Open sarath-mec opened 7 years ago

sarath-mec commented 7 years ago

Hi,

I was able to bring this up on a two-node cluster using Swarm 1.13. Just wanted to share how it was done, for everyone's benefit.

First, create a swarm on the manager host:

docker swarm init --advertise-addr 10.1.0.4

Then use the join command it prints to join the other hosts.
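For reference, docker swarm init prints a join command of roughly this shape (the token below is a placeholder; the manager address comes from the init above):

# run on each additional host; token and port are whatever swarm init printed
docker swarm join --token <worker-join-token> 10.1.0.4:2377

# then, back on the manager, confirm both nodes joined
docker node ls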

git clone https://github.com/sarath-mec/smks-docker-hdp.git

We used a combination of: a) an attachable Docker overlay network for the swarm, b) docker stack deploy for psql and ambari-server, and c) docker-compose for running the individual nodes.

We used this approach because privileged mode for the node containers is not supported by Docker Swarm services. However, we can create an attachable overlay network and run plain containers inside the swarm network.

With this approach, all containers are discoverable by hostname.
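As a rough sketch of the idea (not the exact options the repo's swarm-dn*-container.yml files use), each node is essentially a privileged container attached to that external overlay network with a fixed hostname:

# hand-rolled equivalent of what the per-node compose files do; the real
# compose files may also mount volumes, expose ports, etc.
docker run -d --privileged \
  --name dn0 --hostname dn0.dev \
  --network hdfs \
  sarath4mec/node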


HDP on Swarm using stack deploy and docker-compose with an external overlay network:

Pull the images on both the manager and worker hosts:

docker pull sarath4mec/node
docker pull sarath4mec/ambari-server
docker pull sarath4mec/postgres
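Optionally, confirm the images landed on each host before continuing:

# list the pulled images on the current host
docker images | grep sarath4mec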

Create the volume directories on both nodes:

sudo su
cd /home/sarath_mec
rm -rd hdfs
rm -rd postgresql
mkdir hdfs
mkdir postgresql
mkdir hdfs/dn0
mkdir hdfs/dn1
mkdir postgresql/data
chmod 777 hdfs
chmod 777 hdfs/dn0
chmod 777 hdfs/dn1
chmod 777 postgresql/data
cd smks-docker-hdp

Create the overlay network:

docker network create -d overlay --subnet 10.0.9.0/24 --attachable hdfs

Stack deploy postgres and ambari-server so they run across the manager and worker hosts:

docker stack deploy --compose-file examples/compose/swarm-psql-ambari-container.yml hdfs
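To check that Swarm actually scheduled the services, the standard stack commands can be run from the manager:

# list the services in the hdfs stack and where their tasks ended up
docker stack services hdfs
docker stack ps hdfs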

Inspect the Docker network from both nodes:

docker network inspect hdfs
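If the full inspect output is too verbose, Go-template format filters help, e.g. to confirm the network is attachable and to see which containers have joined it on this host:

# should print true, since the network was created with --attachable
docker network inspect hdfs --format '{{.Attachable}}'

# containers attached to the network on this host
docker network inspect hdfs --format '{{json .Containers}}'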

View the virtual IPs assigned to each service from the manager node:

docker service inspect --format='{{json .Endpoint.VirtualIPs}}' hdfs_postgres
docker service inspect --format='{{json .Endpoint.VirtualIPs}}' hdfs_ambari-server

Alternatively, stack deploy psql and ambari-server individually:

docker stack deploy --compose-file examples/compose/swarm-psql-container.yml hdfs
docker stack deploy --compose-file examples/compose/swarm-ambari-container.yml hdfs

Use docker-compose to bring up dn0.dev on the swarm manager host:

docker-compose -f examples/compose/swarm-dn0-container.yml up -d

Use docker-compose to bring up dn1.dev on the swarm worker host:

docker-compose -f examples/compose/swarm-dn1-container.yml up -d
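With both node containers up, hostname discovery over the overlay can be spot-checked from either side, assuming ping is available inside the node image and the worker node container uses the hostname dn1.dev:

# from the manager host: dn0 should resolve and reach dn1 across the overlay
docker exec -it compose_dn0_1 ping -c 1 dn1.dev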

Use docker-compose to resume dn0.dev on the swarm manager host:

docker-compose -f examples/compose/swarm-dn0-container-resume.yml up -d

Use docker-compose to resume dn1.dev on the swarm worker host:

docker-compose -f examples/compose/swarm-dn1-container-resume.yml up -d

Clear everything:

docker stack rm hdfs

On the manager host:

docker stop compose_dn0_1
docker rm compose_dn0_1

On the worker host:

docker stop compose_dn1_1
docker rm compose_dn1_1

docker network rm hdfs
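A quick sanity check that nothing was left behind on either host:

# no hdfs stack, no leftover node containers, no hdfs network
docker stack ls
docker ps -a --filter name=compose_
docker network ls --filter name=hdfs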

Thanks, Sarath

devopsacid commented 7 years ago

Hi Sarath, does your installation work now? I have a problem with blueprint deployment; the components are stuck in the PENDING HOST ASSIGNMENT state.