geotrellis / geotrellis-chatta-demo

Demo of GeoTrellis - weighted overlay and zonal summary for University of Tennessee at Chattanooga.

Failed to connect to spark-master:7077 in spark-worker #54

Open vutrungduc7593 opened 5 years ago

vutrungduc7593 commented 5 years ago

Hello. I found this error in the spark-worker container logs. Can you give me some advice on how to fix it? Thank you.

Caused by: java.io.IOException: Failed to connect to spark-master/172.22.0.8:7077
        at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:228)
        at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:179)
        at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:197)
        at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:191)
        at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:187)
        ... 4 more
Caused by: java.net.ConnectException: Connection refused: spark-master/172.22.0.8:7077
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
        at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:224)
        at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:289)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:528)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        ... 1 more
pomadchin commented 5 years ago

How can I reproduce this behaviour, and what exactly did you run?

vutrungduc7593 commented 5 years ago

I solved it by changing the spark-master command from

spark-master:
  command:
    - ./sbt
    - runMain geotrellis.chatta.Main

to

spark-master:
  command: master

Then I bound the ./data directory to the spark-worker's volumes:

spark-worker:
  volumes:
    - "./data:/data"
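Putting the two changes together, the relevant compose fragment might look roughly like this. This is only a sketch: the worker's command and the image settings are assumptions, since they are not shown in this thread.

```yaml
# Sketch only -- the worker's command line and everything not pasted above
# (images, ports, environment) are assumed, not taken from the demo's config.
spark-master:
  command: master            # start an actual Spark master instead of running sbt

spark-worker:
  command: worker spark://spark-master:7077   # assumed: point the worker at the master by service name
  volumes:
    - "./data:/data"         # make the ingest data visible inside the worker
```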

Then I submitted to the Spark cluster with the --master URL, like:

ingest-geodocker: server/${JAR}
    docker-compose -f docker-compose.yml exec spark-master \
        spark-submit \
        --master spark://a8dfe940b6a5:7077 \
        --class chatta.ChattaIngest \
        ${JAR} \
        --input "file:///server/conf/input.json" \
        --output "file:///server/conf/output.json" \
        --backend-profiles "file:///server/conf/backend-profiles.json"
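One caveat with the target above: `a8dfe940b6a5` is a container ID, which changes every time the container is recreated. A variant that avoids hard-coding it would address the master by its compose service name (assuming the service is called spark-master and the master advertises itself under that hostname, e.g. via SPARK_MASTER_HOST):

```shell
# Same submit command, but using the compose service name -- a stable DNS
# alias on the compose network -- instead of a per-run container ID.
# Assumes the master binds/advertises the hostname spark-master.
docker-compose -f docker-compose.yml exec spark-master \
    spark-submit \
    --master spark://spark-master:7077 \
    --class chatta.ChattaIngest \
    ${JAR} \
    --input "file:///server/conf/input.json" \
    --output "file:///server/conf/output.json" \
    --backend-profiles "file:///server/conf/backend-profiles.json"
```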