JahstreetOrg / spark-on-kubernetes-helm

Spark on Kubernetes infrastructure Helm charts repo
Apache License 2.0

Error when running with a modified docker image #44

Closed: MBtech closed this issue 4 years ago

MBtech commented 4 years ago

This might be a naive question, but I modified the Livy Docker image to create a new one using the following Dockerfile:

FROM sasnouskikh/livy:0.8.0-incubating-spark_3.0.1_2.12-hadoop_3.2.0_cloud
RUN python3 -m pip install avro

I then replaced the image repository in the values.yaml of the Livy chart, as well as the value of LIVY_SPARK_KUBERNETES_CONTAINER_IMAGE.
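
The change described above amounts to something like the following in the Livy chart's values.yaml. This is an illustrative sketch only: the key paths mirror common chart layout but may not match this chart exactly, and myrepo/livy-avro is a hypothetical image name.

# Sketch, not the chart's exact structure: adjust key paths to the real values.yaml.
image:
  repository: myrepo/livy-avro          # hypothetical rebuilt Livy image
  tag: 0.8.0-incubating-spark_3.0.1_2.12-hadoop_3.2.0_cloud

env:
  # Image Livy passes to the Spark driver/executor pods it launches
  LIVY_SPARK_KUBERNETES_CONTAINER_IMAGE:
    value: "myrepo/livy-avro:0.8.0-incubating-spark_3.0.1_2.12-hadoop_3.2.0_cloud"

As the rest of the thread explains, pointing LIVY_SPARK_KUBERNETES_CONTAINER_IMAGE at the rebuilt Livy image is what triggers the error below.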

But now, if I run the Spark Pi example, I get the following error in the job logs:

/opt/entrypoint.sh: line 45: /opt/spark/conf/spark-defaults.conf: Read-only file system

Any idea why? Everything ran just fine with the original image.

jahstreet commented 4 years ago

Hi, this looks similar to something I've seen before. I'll try it out on my end and let you know about the progress.

MBtech commented 4 years ago

Thanks. I got around the problem by simply providing the Python dependencies through the pyFiles argument of the Livy batch create request, but it would be useful to know why it happens in this particular case, just for the sake of knowledge.
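
For context, the workaround corresponds to a Livy batch-create request along these lines; the Livy host, application file, and dependency archive are placeholders:

# Hypothetical values throughout: host, file, and pyFiles paths are placeholders.
curl -s -X POST http://livy.example.com:8998/batches \
  -H 'Content-Type: application/json' \
  -d '{
        "file": "local:///opt/app/pi.py",
        "pyFiles": ["s3a://my-bucket/deps/avro_deps.zip"],
        "name": "pi-with-avro-deps"
      }'

This ships the Python dependencies with the batch itself instead of baking them into the container image.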

maciekdude commented 4 years ago

Yo, I was just struggling with this issue. Basically, you modified the wrong image (the Livy one), whose entrypoint.sh just runs the Livy server. You need to modify the image that will be used by the driver/executors. It breaks at line 45 of /opt/entrypoint.sh, because the script tries to write to /opt/spark/conf/spark-defaults.conf, while /opt/spark/conf is mounted by Kubernetes and is in essence a read-only directory containing spark.properties.

BR, M

jahstreet commented 4 years ago

Indeed, thx @maciekdude. @MBtech, please use:
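
A minimal sketch of the suggested fix: extend the Spark container image used for the driver/executor pods rather than the Livy server image. The base image name below is an assumption; use whatever image LIVY_SPARK_KUBERNETES_CONTAINER_IMAGE points to by default in the chart.

# Assumption: this is the Spark (driver/executor) image referenced by
# LIVY_SPARK_KUBERNETES_CONTAINER_IMAGE; substitute the chart's actual default.
FROM sasnouskikh/livy-spark:0.8.0-incubating-spark_3.0.1_2.12-hadoop_3.2.0_cloud
RUN python3 -m pip install avro

Then point LIVY_SPARK_KUBERNETES_CONTAINER_IMAGE at the rebuilt image and leave the Livy server image as-is.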