Closed: KamalGalrani closed this issue 4 years ago
I believe this relates to a K8s API version incompatibility with the K8s client. Which K8s API version do you use with Minikube?
singularity@Kamal-Omen:~$ minikube version
minikube version: v1.13.0
commit: 0c5e9de4ca6f9c55147ae7f90af97eff5befef5f-dirty
singularity@Kamal-Omen:~$ kubectl version
Client Version: version.Info{Major:"1", Minor:"17+", GitVersion:"v1.17.7-eks-bffbac", GitCommit:"bffbacfd13a805a12d10ccc0ca26205ae1ca76e9", GitTreeState:"clean", BuildDate:"2020-07-08T18:30:00Z", GoVersion:"go1.13.9", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.0", GitCommit:"e19964183377d0ec2052d1f1fa930c4d7575bd50", GitTreeState:"clean", BuildDate:"2020-08-26T14:23:04Z", GoVersion:"go1.15", Compiler:"gc", Platform:"linux/amd64"}
Does this give the K8s API version?
Also, in the other thread you mentioned driver logs and a networking issue. Could you please help me extract the driver log if the above doesn't solve the issue? And can you elaborate on what you mean by a networking issue?
First, about this one: the current Helm chart runs Spark 2.4.5 and Livy with the fabric8 Java K8s client 4.6.1, which supports the K8s API only up to 1.15.3. Your K8s API version on Minikube is Major:"1", Minor:"19", i.e. 1.19. To run a compatible K8s API version with Minikube you can execute: minikube start --kubernetes-version=1.15.0 ...
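A minimal sketch of the downgrade, assuming minikube and kubectl are on your PATH (the existing 1.19 profile usually has to be deleted before the API server version can change):

```shell
# Drop the existing cluster that runs K8s 1.19
minikube delete

# Recreate it pinned to a K8s version within the fabric8 4.6.1 limit
minikube start --kubernetes-version=v1.15.0

# Verify that the server now reports the pinned version
kubectl version
```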
I'm currently working on Spark 3 support which is going to ease this limitation.
Second: to extract the Spark driver logs you can execute kubectl logs <spark-driver-pod-name>. By a networking issue I mean that the network can be unstable and occasionally fail requests within it. The network is backed by hardware and software that can misbehave, and depending on your environment there may be more or fewer such issues. I cannot say for sure that this is the cause; it's just an assumption to check with your network engineers, and to trace the requests if you have APM metrics and request tracing set up in the cluster.
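The log-extraction step above can be sketched as follows; the pod name and namespace are placeholders to be filled in from your own cluster:

```shell
# List pods to find the actual Spark driver pod name
kubectl get pods --all-namespaces | grep -i driver

# Dump the driver logs (substitute the real pod name and namespace)
kubectl logs <spark-driver-pod-name> -n <namespace>

# Or stream them live while the session starts up
kubectl logs -f <spark-driver-pod-name> -n <namespace>
```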
Ahh, I see, then let's wait for the logs... Hope you get it solved!
Thanks! Downgrading K8s worked.
I was able to set up Livy using the Helm chart, but when I create a session it fails. I am using the default configuration with Minikube.
Create session payload