Closed: MBtech closed this issue 4 years ago.
Hi, this looks similar to something I've seen before. I'll try it out on my end and let you know about the progress.
Thanks. I did get around the problem by simply providing the Python dependencies via the pyFiles argument in the Livy batch create request, but it would be useful to know why it happens in this particular case, just for the sake of knowledge.
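For context, the workaround above maps to Livy's REST API, where a batch is created with POST /batches and extra Python dependencies go in the pyFiles field of the request body. Here is a minimal sketch; the Livy host, file paths, and job name are placeholders, not values from this thread:

```python
import json

# Hypothetical Livy batch payload -- paths and host are placeholders.
LIVY_URL = "http://livy:8998"

payload = {
    "file": "local:///opt/app/pi.py",          # main PySpark script
    "pyFiles": ["local:///opt/app/deps.zip"],  # extra Python dependencies
    "name": "pi-with-deps",
}

body = json.dumps(payload)
print(body)

# To actually submit (requires a running Livy server):
#   import requests
#   requests.post(f"{LIVY_URL}/batches", data=body,
#                 headers={"Content-Type": "application/json"})
```

Shipping dependencies this way sidesteps the image problem entirely, since the files are distributed to the executors at submit time instead of being baked into the container image.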
Yo, I was just struggling with this issue.

Basically, you modified the wrong image (it's livy) with the wrong entrypoint.sh, which just runs the Livy server. You need to modify the image that will be used by the driver/executors.

It breaks here because the bash script wants to read from /opt/spark/conf, which is mounted by k8s and is in essence a dir with spark.properties.

BR, M
Indeed, thx @maciekdude. @MBtech, please use:

sasnouskikh/livy:<version> - to run Livy server containers
sasnouskikh/livy-spark:<version> - to run Spark driver and executor containers with Livy support
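Putting the two images together, the chart configuration could look roughly like this. This is a sketch only: the exact key layout of the chart's values.yaml is an assumption, and <version> is a placeholder; the env variable name LIVY_SPARK_KUBERNETES_CONTAINER_IMAGE comes from the discussion above.

```yaml
# Hypothetical values.yaml fragment -- key structure is an assumption,
# only the image names and the env variable come from this thread.
image:
  repository: sasnouskikh/livy        # image for the Livy server pod
  tag: <version>

env:
  # image used by the Spark driver and executor pods that Livy launches
  LIVY_SPARK_KUBERNETES_CONTAINER_IMAGE: sasnouskikh/livy-spark:<version>
```

The key point is that the two settings point at different images: the server image only runs the Livy daemon, while the driver/executor image carries the Spark entrypoint that reads the k8s-mounted /opt/spark/conf.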
It might be a naive question, but I modified the livy Docker image to create a new one using the following, and I replaced the image repository in the values.yaml of the livy chart as well as the value for LIVY_SPARK_KUBERNETES_CONTAINER_IMAGE. But now, if I run the example Spark pi job, I get the following error in the job logs:

Any idea why? Everything ran just fine with the original image.