kiwenlau / hadoop-cluster-docker

Run Hadoop Cluster within Docker Containers
Apache License 2.0

Multi-host cluster, Apache Spark Cluster and Production Ready #29

Open OElesin opened 8 years ago

OElesin commented 8 years ago

I tried this out on my local machine and it was fantastic: a 5-node cluster. However, I also want to set up a Spark cluster on the Docker images, as well as a multi-host cluster for high availability.

E.g., have a 5-node cluster per physical host across 3 physical hosts, and have them communicate with each other.

I also want to know whether this image is production ready.

Thank you

kiwenlau commented 8 years ago
  1. Currently, this project does not support multiple physical hosts, and I'm not sure how to implement it. The problem is that Hadoop uses SSH to communicate, which causes problems when the containers run on different physical hosts.
  2. Currently, this project runs HDFS inside the containers, which means the data is deleted when a container is deleted. So, to use it in production, you need to put the HDFS data on the host node using a Docker volume (see the sketch below). In addition, you need to run multiple masters to ensure availability. If you solve these problems, I think it is OK to run Hadoop in containers for production. Of course, you should run more tests before using it.
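A minimal sketch of the volume approach (untested; the host path /data/hdfs is arbitrary, and the in-container path assumes dfs.datanode.data.dir points under /root/hdfs in this image's hdfs-site.xml, so verify it there first):

```bash
# Mount a host directory over the DataNode storage directory so that
# HDFS blocks survive container removal. The path inside the container
# is an assumption; check dfs.datanode.data.dir in hdfs-site.xml.
docker run -itd \
    --net=hadoop \
    -v /data/hdfs:/root/hdfs \
    --name hadoop-slave1 \
    --hostname hadoop-slave1 \
    kiwenlau/hadoop:1.0
```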
OElesin commented 8 years ago

Thanks @kiwenlau. For your first point, I think setting up Docker Swarm will work; however, my team and I have yet to test this. I will update you when this is done.
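Something like this is what we have in mind (untested sketch; the overlay network name hadoop matches your start-container.sh, swarm mode needs a recent Docker release, and the placeholders must come from the init output):

```bash
# On the first physical host: turn on swarm mode.
docker swarm init --advertise-addr <host1-ip>

# On each remaining physical host: join the swarm with the token
# printed by the init command above.
docker swarm join --token <worker-token> <host1-ip>:2377

# On a manager: create an attachable overlay network so containers
# started with plain `docker run` on different hosts can resolve each
# other by name, which should let the master SSH into the slaves.
docker network create --driver overlay --attachable hadoop

# Then start the master/slave containers on whichever host, attached
# to the overlay network instead of a local bridge.
docker run -itd --net=hadoop --name hadoop-master --hostname hadoop-master kiwenlau/hadoop:1.0
```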

We will also test Docker volumes and provide our feedback soon.

Thanks again

saikishor commented 6 years ago

Any updates, guys? I would like to use HDFS on a multi-host cluster, i.e., Docker running on each host machine and the containers sharing data among themselves...

saikishor commented 6 years ago

@kiwenlau I don't know why the NodeManager runs on the slaves only with your Docker code. If I use parts of your code in my own Dockerfile, it doesn't work (the NodeManager does not run on the slaves)...
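For reference, this is how I'm checking on a slave (container name from your start-container.sh; I'm assuming HADOOP_HOME is set in the image, and the log file name is a guess based on the standard Hadoop layout):

```bash
# See whether a NodeManager JVM is running inside the slave container.
docker exec hadoop-slave1 jps

# Tail the NodeManager log for the failure reason; the exact file name
# varies with the user and hostname.
docker exec hadoop-slave1 bash -c 'tail -n 50 $HADOOP_HOME/logs/yarn-*-nodemanager-*.log'
```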