big-data-europe / docker-spark

Apache Spark docker image

Worker does not start #135

Open 0neday opened 3 years ago

0neday commented 3 years ago

The worker fails to start; console output below:

```bash
bash-5.0# ./start-worker.sh spark://spark:7077
rsync from spark://spark:7077
/spark/sbin/spark-daemon.sh: line 177: rsync: command not found
starting org.apache.spark.deploy.worker.Worker, logging to /spark/logs/spark--org.apache.spark.deploy.worker.Worker-1-6f7782b9b0d5.out
ps: unrecognized option: p
BusyBox v1.30.1 (2020-05-30 09:44:53 UTC) multi-call binary.

Usage: ps [-o COL1,COL2=HEADER]

Show list of processes

        -o COL1,COL2=HEADER     Select columns for display
ps: unrecognized option: p
BusyBox v1.30.1 (2020-05-30 09:44:53 UTC) multi-call binary.

Usage: ps [-o COL1,COL2=HEADER]

Show list of processes
```
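Both errors point at the Alpine base image: rsync is not installed, and BusyBox's ps does not support the `-p` flag that /spark/sbin/spark-daemon.sh relies on. A minimal workaround sketch, assuming an Alpine-based container with root access (package names are the stock Alpine ones, not anything documented by this repo):

```bash
# Workaround sketch: spark-daemon.sh calls rsync and `ps -p`,
# neither of which BusyBox provides on Alpine.
apk add --no-cache rsync procps   # procps supplies a ps that accepts -p

# Retry starting the worker against the master URL from the report.
./start-worker.sh spark://spark:7077
```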
GezimSejdiu commented 3 years ago

Hi @0neday ,

Could you tell me a bit more about how you are trying to start the worker, and from which image? If you set it up via our example docker-compose file, the worker is started automatically; the same goes for a normal docker run using our worker image.
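For reference, a minimal sketch of the docker run route; the image tags, network name, and container names here are assumptions rather than the exact values from the repository's compose file, so adjust them to the release you use:

```bash
# Put master and worker on a shared network so the worker can
# resolve the master by container name.
docker network create spark-net          # hypothetical network name

docker run -d --name spark-master --network spark-net \
  -p 8080:8080 -p 7077:7077 \
  bde2020/spark-master:3.1.1-hadoop3.2

# SPARK_MASTER tells the worker image which master to register with.
docker run -d --name spark-worker --network spark-net \
  -e SPARK_MASTER=spark://spark-master:7077 \
  bde2020/spark-worker:3.1.1-hadoop3.2
```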

Do let me know a bit more so that we can resolve this.

Best regards,