I am following the guide here:
https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html
on GCE Kubernetes (GKE), with kubectl reporting master version 1.9.2-gke.1. I have the resource staging server (RSS) up and running, and the SparkPi application works. However, when I try to use the resource staging server with local dependencies, spark-submit fails with:
Exception in thread "main" java.net.SocketTimeoutException: connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at okhttp3.internal.platform.Platform.connectSocket(Platform.java:124)
at okhttp3.internal.connection.RealConnection.connectSocket(RealConnection.java:223)
at okhttp3.internal.connection.RealConnection.connect(RealConnection.java:149)
at okhttp3.internal.connection.StreamAllocation.findConnection(StreamAllocation.java:195)
at okhttp3.internal.connection.StreamAllocation.findHealthyConnection(StreamAllocation.java:121)
at okhttp3.internal.connection.StreamAllocation.newStream(StreamAllocation.java:100)
at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.java:42)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:92)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:67)
at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.java:93)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:92)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:67)
at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.java:93)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:92)
at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.java:120)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:92)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:67)
at okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:185)
at okhttp3.RealCall.execute(RealCall.java:69)
at retrofit2.OkHttpCall.execute(OkHttpCall.java:174)
at org.apache.spark.deploy.k8s.submit.SubmittedDependencyUploaderImpl.getTypedResponseResult(SubmittedDependencyUploaderImpl.scala:101)
at org.apache.spark.deploy.k8s.submit.SubmittedDependencyUploaderImpl.doUpload(SubmittedDependencyUploaderImpl.scala:97)
at org.apache.spark.deploy.k8s.submit.SubmittedDependencyUploaderImpl.uploadJars(SubmittedDependencyUploaderImpl.scala:70)
at org.apache.spark.deploy.k8s.submit.submitsteps.initcontainer.SubmittedResourcesInitContainerConfigurationStep.configureInitContainer(SubmittedResourcesInitContainerConfigurationStep.scala:48)
at org.apache.spark.deploy.k8s.submit.submitsteps.InitContainerBootstrapStep$$anonfun$configureDriver$1.apply(InitContainerBootstrapStep.scala:43)
at org.apache.spark.deploy.k8s.submit.submitsteps.InitContainerBootstrapStep$$anonfun$configureDriver$1.apply(InitContainerBootstrapStep.scala:42)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.apache.spark.deploy.k8s.submit.submitsteps.InitContainerBootstrapStep.configureDriver(InitContainerBootstrapStep.scala:42)
at org.apache.spark.deploy.k8s.submit.Client$$anonfun$run$1.apply(Client.scala:95)
at org.apache.spark.deploy.k8s.submit.Client$$anonfun$run$1.apply(Client.scala:94)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.apache.spark.deploy.k8s.submit.Client.run(Client.scala:94)
at org.apache.spark.deploy.k8s.submit.Client$$anonfun$run$5.apply(Client.scala:191)
at org.apache.spark.deploy.k8s.submit.Client$$anonfun$run$5.apply(Client.scala:184)
at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2551)
at org.apache.spark.deploy.k8s.submit.Client$.run(Client.scala:184)
at org.apache.spark.deploy.k8s.submit.Client$.main(Client.scala:204)
at org.apache.spark.deploy.k8s.submit.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:786)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
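For reference, my spark-submit follows the resource staging server example in the guide; sanitized, it looks roughly like the sketch below (the image names/tags and the jar path are placeholders rather than my exact values, and 31000 is the NodePort from the guide's RSS YAML):

bin/spark-submit \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --master k8s://https://<ip-of-my-cluster>:443 \
  --conf spark.executor.instances=2 \
  --conf spark.app.name=spark-pi \
  --conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:<tag> \
  --conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:<tag> \
  --conf spark.kubernetes.initcontainer.docker.image=kubespark/spark-init:<tag> \
  --conf spark.kubernetes.resourceStagingServer.uri=http://<address-of-any-cluster-node>:31000 \
  /path/to/local/spark-examples.jar   # a client-local jar, so it should be uploaded via the RSS

The only thing I change between attempts is the address I substitute for <address-of-any-cluster-node> in spark.kubernetes.resourceStagingServer.uri.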
With <address-of-any-cluster-node> I tried:
- <ip-of-my-cluster>, which is the Kubernetes API endpoint
- the NodePort IP, 10.3.250.165
- the IP of the pod, 10.0.0.X
- the private IP (Primary internal IP) and the External IP of the node VM on GCE

All of these fail.

The documentation only states <address-of-any-cluster-node>. What should this be? Or is it some other issue with discovery/setup/Kubernetes version?
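In case it helps, this is how I am looking up the candidate addresses and the NodePort (standard kubectl; the service name below is the one created by the RSS YAML I deployed from the guide, so yours may differ):

kubectl get nodes -o wide                        # INTERNAL-IP / EXTERNAL-IP of each node VM
kubectl get svc spark-resource-staging-service   # cluster IP and the NodePort (31000 in the guide's YAML)
kubectl get pods -o wide                         # pod IP of the resource staging server pod

As a basic reachability check, I can also hit http://<candidate-address>:31000/ with curl from the submitting machine before going through spark-submit.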