rakeshramakrishnan closed this issue 2 years ago
Right. Thank you for catching that. In the meantime you can patch it locally and create a fix PR. Otherwise I can get to it in the coming days/weeks and push the fix myself.
Added a PR here
@jahstreet: Is there a reason to limit the Kubernetes version to 1.18.x? My Docker Desktop runs Kubernetes 1.19 and I'm wondering what the reason behind this limit is. Furthermore, we plan to upgrade our cluster to 1.19, so it will become an issue there as well. I haven't found such a limitation for Spark itself yet.
Thanks for your help!
Hi @andreas-eberle , indeed there is a reason. Please check this comment for the details. To play with the charts locally I would suggest installing Minikube, which lets you control the API version of the local Kubernetes cluster. Please refer to the guide for additional details. Does that option solve your issue?
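For reference, a minimal sketch of pinning the cluster API version with Minikube (the exact version string here is just an example, pick one that matches the chart's `kubeVersion` constraint):

```shell
# Start a local cluster pinned to a specific Kubernetes release,
# so the cluster API version matches what the charts support.
minikube start --kubernetes-version=v1.18.20

# Confirm the server version the cluster is actually running.
kubectl version --short
```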
I think this is fixed in https://github.com/JahstreetOrg/spark-on-kubernetes-helm/pull/52 (2.0.1 chart version).
@jahstreet The latest changes in your repo for the customized Livy, along with the Helm charts, were working for me with k8s 1.21 on EKS. But now we upgraded to 1.22, which broke the setup since the Ingress API version has changed. Can you please provide guidance here on the patch I could apply? In parallel I am also exploring the required changes.
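For context, in Kubernetes 1.22 the `networking.k8s.io/v1beta1` Ingress API was removed, so the chart templates need to emit the `networking.k8s.io/v1` shape instead. A minimal sketch of the new form (the resource name, host, and service name below are hypothetical, not taken from the charts; 8998 is Livy's default port):

```yaml
# v1 Ingress: pathType is now required, and the flat
# serviceName/servicePort backend fields are replaced by a
# nested service.name/service.port block.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: livy-ingress          # example name
spec:
  rules:
    - host: livy.example.com  # example host
      http:
        paths:
          - path: /
            pathType: Prefix  # required in v1
            backend:
              service:
                name: livy    # example service name
                port:
                  number: 8998
```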
Closing this issue in favour of https://github.com/JahstreetOrg/spark-on-kubernetes-helm/issues/82, within which it will be resolved; let's continue there.
And thanks for the patience 🙏 .
Tried installing the latest Helm charts (commit hash)
Got the following error:
I can see that the kubeVersion requirements have been upgraded for spark-cluster. Can we upgrade the same for Livy too?
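If it helps, the change being asked for is just widening the `kubeVersion` constraint in the Livy chart's `Chart.yaml`. A hedged sketch (the name, version, and exact range below are illustrative, not the chart's real values; the `-0` suffix is the usual Helm trick so pre-release server versions reported by managed clusters like EKS/GKE still match):

```yaml
# Hypothetical Chart.yaml fragment: widen the supported Kubernetes
# range so Helm does not refuse to install on newer clusters.
apiVersion: v2
name: livy                   # example chart name
version: 3.0.0               # example chart version
kubeVersion: ">=1.19.0-0"    # previously pinned to an older range
```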