Open vutrungduc7593 opened 5 years ago
How to reproduce this behaviour, and what did you do?
I solved it by changing the command from

```yaml
spark-master:
  command:
    - ./sbt
    - runMain geotrellis.chatta.Main
```

to
```yaml
spark-master:
  command: master
```
Then I bound the `./data` directory to the spark-worker's volumes:

```yaml
spark-worker:
  volumes:
    - "./data:/data"
```
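Putting the two changes together, a minimal `docker-compose.yml` might look like the sketch below. The image name, the worker `command`, and the exposed port are assumptions based on the snippets above, not the exact file from this repo:

```yaml
# Hypothetical minimal sketch; image names and the worker command are assumptions.
version: "3"
services:
  spark-master:
    image: spark            # assumed image name
    command: master
    ports:
      - "7077:7077"         # Spark master RPC port
  spark-worker:
    image: spark            # assumed image name
    command: worker         # assumed worker entrypoint argument
    volumes:
      - "./data:/data"      # bind the host ./data directory into the worker
```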
Then I submitted to the Spark cluster with the `--master` URL, like:

```
ingest-geodocker: server/${JAR}
docker-compose -f docker-compose.yml exec spark-master \
  spark-submit \
    --master spark://a8dfe940b6a5:7077 \
    --class chatta.ChattaIngest \
    ${JAR} \
    --input "file:///server/conf/input.json" \
    --output "file:///server/conf/output.json" \
    --backend-profiles "file:///server/conf/backend-profiles.json"
```
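One fragile point in the command above is the hard-coded container ID (`a8dfe940b6a5`) in the master URL: it changes every time the container is recreated. On a Compose network the service name resolves via DNS, so the URL can be built from the service name instead. A small sketch (the service name `spark-master` is an assumption carried over from the snippets above):

```shell
# Build the master URL from the compose service name instead of a
# container ID, so it survives container recreation.
MASTER_HOST=spark-master                  # compose service name, DNS-resolvable on the network
MASTER_URL="spark://${MASTER_HOST}:7077"
echo "${MASTER_URL}"                      # prints spark://spark-master:7077
```

The resulting value would then be passed as `--master "${MASTER_URL}"` in the `spark-submit` invocation.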
Hello. I found this error in the spark-worker container logs. Can you give me some advice on how to fix it? Thank you.