luigi-asprino opened this issue 4 years ago
Is the port open? If you have Portainer installed, you can use it to view all the containers running on the host and check the ports.
Yes, it is. I opened the following TCP ports: 8020, 8042, 8088, 8188, 9000, 9864-9867, 9879, 9870, 50070.
This is a screenshot from Portainer; ports 9000 and 9870 are open.
From the screenshot, only 9000 and 9870 are open. For your docker-compose you need something like:
ports:
- 16000:16000
- 16010:16010
- 16020:16020
for each port you need to reach from outside.
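Applied to the datanode, that mapping would look something like the sketch below. The service name `datanode` is an assumption about the compose file; the port numbers are the Hadoop 3 defaults (9864 HTTP UI, 9866 data transfer, 9867 IPC):

```yaml
services:
  datanode:
    ports:
      - "9864:9864"   # datanode web UI
      - "9866:9866"   # data transfer (what fs -cat needs)
      - "9867:9867"   # datanode IPC
```

Note that `EXPOSE` lines in a Dockerfile only document ports; they are not published to the host unless listed under `ports:` like this (or run with `-p`).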
Hi @Data-drone, I think this helps, thanks! I've mapped all the ports declared in the Dockerfile (maybe something that should be done in the bde repo as well?), but I still get the same error when I try to read a file with
./hadoop fs -cat hdfs://<remote_ip>:9000/a/a.txt
I'm not an expert, but it seems to me that the address the namenode hands out for the datanode (i.e. the container hostname d4d282b96803) is not reachable from outside the network created by Docker. Does anyone know how to make the datanodes advertise a given remote IP?
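One common workaround for exactly this symptom (a sketch, not confirmed to be what the bde images intend): tell the HDFS client to use datanode hostnames instead of their Docker-internal IPs, via the real setting `dfs.client.use.datanode.hostname` in a client-side hdfs-site.xml:

```xml
<!-- client-side hdfs-site.xml: ask the namenode to return datanode
     hostnames instead of their (Docker-internal) IP addresses -->
<configuration>
  <property>
    <name>dfs.client.use.datanode.hostname</name>
    <value>true</value>
  </property>
</configuration>
```

The client then still has to resolve the container hostname, e.g. by adding a line like `<remote_ip> d4d282b96803` to the client machine's /etc/hosts (the hostname changes whenever the container is recreated, unless a fixed `hostname:` is set in docker-compose).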
Probably check this: https://community.cloudera.com/t5/Support-Questions/HDFS-port-8020-not-accessible-from-outside/td-p/201252 Also try adding an extra test node to your docker-compose and see whether that node can access it.
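A minimal sketch of such a test node, assuming the bde2020 base image and the stack's existing hadoop.env (both assumptions; adjust to the actual compose file). If this container can read the file but an external client cannot, the problem is the host-to-container networking, not HDFS itself:

```yaml
# hypothetical one-shot service on the same compose network,
# used only to verify HDFS is reachable from inside Docker
test-client:
  image: bde2020/hadoop-base:2.0.0-hadoop3.2.1-java8
  env_file: ./hadoop.env
  command: ["hdfs", "dfs", "-cat", "hdfs://namenode:9000/a/a.txt"]
```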
Do you know how to solve it, please? I have the same problem.
Hi,
I have deployed the docker containers on a remote server. I'm able to list files stored on HDFS from a client application (e.g. ./hadoop fs -ls hdfs://<remote_ip>:9000/), but when I try to read a file (e.g. ./hadoop fs -cat hdfs://<remote_ip>:9000/a.txt) I get a ConnectionRefused exception.
In the hadoop.env file I've only set
I'm a newbie Hadoop user, but it seems to me that the datanode IP is not visible outside the network created by Docker, and I don't know how to fix it.
Can anyone help me?
Thanks
Luigi