rapidsai / spark-examples

[ARCHIVED] Moved to github.com/NVIDIA/spark-xgboost-examples
https://github.com/NVIDIA/spark-xgboost-examples
Apache License 2.0

Docker build failed #36

Closed xiaonans closed 5 years ago

xiaonans commented 5 years ago

Hi guys,

An error occurs when I run docker build -t rapidspark:v0.1 . under the spark-examples directory:

Step 9/23 : COPY jars /opt/spark/jars
COPY failed: stat /var/lib/docker/tmp/docker-builder621440454/jars: no such file or directory

Where should I find the jars dir?

jlowe commented 5 years ago

The jars directory comes from the Spark distribution. There's a comment in the Dockerfile about it:

# Before building the docker image, first build and make a Spark distribution following
# the instructions in http://spark.apache.org/docs/latest/building-spark.html.
# If this docker file is being used in the context of building your images from a Spark
# distribution, the docker build command should be invoked from the top level directory
# of the Spark distribution. E.g.:
# docker build -t spark:latest -f kubernetes/dockerfiles/spark/Dockerfile .

So instead of building it underneath the spark-examples directory, I believe the docker build needs to be invoked wherever your Spark distribution has been installed. That's where the build will be able to find the directories like jars, bin, sbin, etc. that are part of a Spark distribution layout.
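For example, a minimal sketch of the intended invocation (paths are illustrative; adjust to wherever your Spark distribution and this repo actually live):

cd ~/spark-2.4.3-bin-hadoop2.7                             # distribution top level: contains jars/, bin/, sbin/
ls jars                                                    # sanity check: COPY sources must exist in the build context
docker build -t rapidspark:v0.1 -f ~/spark-examples/Dockerfile .

The trailing . makes the Spark distribution the build context, which is what lets Docker resolve COPY jars /opt/spark/jars.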

cc: @rongou in case I missed any details.

xiaonans commented 5 years ago

I tried building the image from the directory where my Spark is installed, using docker build -t rapidspark:v0.1 -f ~/projects/rapids/spark-examples/Dockerfile . as suggested, but I got another error:

Step 14/23 : COPY kubernetes/tests /opt/spark/tests
COPY failed: stat /var/lib/docker/tmp/docker-builder481279651/kubernetes/tests: no such file or directory

My Spark version is 2.3.4, and I was able to build Spark's own image with the command docker build -t spark:latest -f kubernetes/dockerfiles/spark/Dockerfile . from the same directory.
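A quick way to see the mismatch (illustrative commands, run from the Spark distribution top level) is to compare what the Dockerfile expects against what the distribution actually ships:

grep '^COPY' ~/projects/rapids/spark-examples/Dockerfile   # every source path the build needs
ls jars bin sbin kubernetes/tests                          # check that each of those paths exists

Any path missing from the distribution (here, kubernetes/tests in 2.3.4) will fail its COPY step with exactly this stat error.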

xiaonans commented 5 years ago

Issue solved after switching to Spark 2.4.3.