nghiadinhhieu opened 7 years ago
The error message is `Could not parse Master URL: ''`. Are you passing a valid value for the `master` config setting? Can you share that value?
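For reference, valid `master` values look like `local`, `local[*]`, `spark://host:port`, `yarn`, or `mesos://host:port`; an empty string reproduces the `Could not parse Master URL: ''` error. A small sketch (a hypothetical validator, not Spark's actual parser) of the common forms:

```python
import re

def looks_like_valid_master(master: str) -> bool:
    """Rough, illustrative check of common Spark master URL forms."""
    if not master:
        return False  # empty value -> "Could not parse Master URL: ''"
    patterns = [
        r"local(\[(\*|\d+)\])?",   # local, local[*], local[4]
        r"spark://[^\s:]+:\d+",    # standalone cluster, e.g. spark://192.168.10.24:7077
        r"yarn",
        r"mesos://[^\s:]+:\d+",
    ]
    return any(re.fullmatch(p, master) for p in patterns)
```

This is only a sanity-check sketch; Spark itself decides what it accepts when `SparkContext` is constructed.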
Thanks for replying.
I run the Kafka example in local mode. When I debug it, it hits an error in `public static StreamingContext GetOrCreate(string checkpointPath, Func<StreamingContext> creatingFunc)`.
Here is the call stack from the debugger:

```
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(bool isStatic, object classNameOrJvmObjectReference, string methodName, object[] parameters) Line 135 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(string className, object[] parameters) Line 46 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy.StreamingContextIpcProxy(string checkpointPath) Line 64 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateStreamingContext(string checkpointPath) Line 91 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(string checkpointPath, System.Func<Microsoft.Spark.CSharp.Streaming.StreamingContext> creatingFunc) Line 84 C#
```

Exception: `{"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )"}`
I had already created the checkpoint path in HDFS: `const string checkpointPath = "hdfs://192.168.10.32:9000/checkpoint";`
Here is my app.config:
```xml
<?xml version="1.0" encoding="utf-8"?>
```
I also ran this example in standalone cluster mode with `sparkclr-submit.cmd --master spark://192.168.10.24:7077 --conf spark.local.dir=c:\temp --exe SparkClrKafka.exe C:\Mobius-master\examples\Streaming\Kafka\bin\Debug`, and it shows these errors:
```
[ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
[2016-11-30 10:13:08,284] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - java.util.NoSuchElementException: None.get
        at scala.None$.get(Option.scala:313)
        at scala.None$.get(Option.scala:311)
        at org.apache.spark.streaming.StreamingContext.<init>
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.api.csharp.CSharpBackendHandler.handleMethodCall(CSharpBackendHandler.scala:167)
        at org.apache.spark.api.csharp.CSharpBackendHandler.handleBackendRequest(CSharpBackendHandler.scala:103)
        at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:30)
        at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:27)
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
[2016-11-30 10:13:08,284] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
[2016-11-30 10:13:08,284] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] -
   at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 90

Unhandled Exception: System.Exception: JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
   at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 135
   at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(String className, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 46
   at Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy..ctor(String checkpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\StreamingContextIpcProxy.cs:line 64
   at Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateStreamingContext(String checkpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\SparkCLRIpcProxy.cs:line 91
   at Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(String checkpointPath, Func`1 creatingFunc) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Streaming\StreamingContext.cs:line 84
   at Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(String[] args) in C:\Mobius-master\examples\Streaming\Kafka\Program.cs:line 39
[2016-11-30 10:13:12,468] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check begin : weakReferences.Count = 24, checkCount: 10
[2016-11-30 10:13:12,468] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - Stop releasing as exceeded allowed checkCount: 10
[2016-11-30 10:13:12,468] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check end : released 0 garbage, remain 24 alive, used 0 ms : release garbage used 0 ms, store alive used 0 ms
Exception caught: An existing connection was forcibly closed by the remote host
java.io.IOException: An existing connection was forcibly closed by the remote host
        at sun.nio.ch.SocketDispatcher.read0(Native Method)
[CSharpRunner.main] closing CSharpBackend
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
Requesting to close all call back sockets.
        at io.netty.buffer.UnpooledUnsafeDirectByteBuf.setBytes(UnpooledUnsafeDirectByteBuf.java:447)
[CSharpRunner.main] Return CSharpBackend code -532462766
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
```
Which version of Mobius are you using? Do not manually create the checkpoint directory. If the directory already exists, Spark will try to load a checkpoint from it; I suspect that is what is failing for you.
If not creating the checkpoint directory does not help, avoid using checkpoints until you figure out the root cause.
I am using the latest Mobius version from https://github.com/Microsoft/Mobius. All unit tests pass, but at runtime I hit errors in the streaming context.
I tried not creating the checkpoint directory manually and got a new error: `{"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper when called with no parameters"}`
Here is the call stack:

```
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(bool isStatic, object classNameOrJvmObjectReference, string methodName, object[] parameters) Line 135 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(string className, object[] parameters) Line 46 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy.DirectKafkaStream(System.Collections.Generic.List<string> topics, System.Collections.Generic.Dictionary<string, string> kafkaParams, System.Collections.Generic.Dictionary<string, long> fromOffsets) Line 226 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Streaming.KafkaUtils.CreateDirectStream(Microsoft.Spark.CSharp.Streaming.StreamingContext ssc, System.Collections.Generic.List<string> topics, System.Collections.Generic.Dictionary<string, string> kafkaParams, System.Collections.Generic.Dictionary<string, long> fromOffsets) Line 95 C#
SparkClrKafka.exe!Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main.AnonymousMethod__0() Line 45 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(string checkpointPath, System.Func<Microsoft.Spark.CSharp.Streaming.StreamingContext> creatingFunc) Line 79 C#
SparkClrKafka.exe!Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(string[] args) Line 39 C#
```
Can you share Spark's JVM logs? There is a failure in the JVM that triggers the Mobius failure. A web search will turn up how to set log4j settings for Spark.
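For reference, Spark's log level can usually be raised by editing `conf/log4j.properties` under the Spark installation. A sketch, assuming the stock log4j 1.x console-appender template that ships with Spark 1.x/2.x:

```properties
# Raise everything to DEBUG to see why the JVM-side constructor fails
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

DEBUG output is very verbose; switching back to `INFO` or `WARN` after the investigation is advisable.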
This is my command line: `sparkclr-submit.cmd --master local[*] --conf spark.local.dir=c:\temp --exe SparkClrKafka.exe C:\Mobius-master\examples\Streaming\Kafka\bin\Debug`. And here are my Spark JVM logs:
Can you confirm if you updated code to set topic name and broker list? https://github.com/Microsoft/Mobius/blob/master/examples/Streaming/Kafka/Program.cs#L24 https://github.com/Microsoft/Mobius/blob/master/examples/Streaming/Kafka/Program.cs#L28
Yes, I've already set those parameters; here is my code:

```csharp
var sparkContext = new SparkContext(new SparkConf().SetAppName("SparkCLRKafka Example"));
const string topicName = "test";
var topicList = new List<string> { topicName };
```
If you have set valid Kafka parameters, I cannot think of a reason for the failure. There are no exceptions in the Spark logs you shared, so it is not clear what is happening. Set the log level to DEBUG and see if you find anything useful.
Yes, I cannot see any errors in the Spark log, but when I debug the project I get errors at these two lines of code:

1. `SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.api.java.JavaStreamingContext", new object[] { checkpointPath });` Even when I let Spark create the checkpoint path itself, it fails once the path exists. The exception is: `{"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )"}`
2. `JvmObjectReference jhelper = SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper", new object[] { });` The exception is: `{"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper when called with no parameters"}`

I really don't know why; maybe one of your configs could pinpoint the problem.
I found this in the Spark DEBUG logs:

```
[CSharpBackend] DEBUG io.netty.util.internal.PlatformDependent - Javassist: unavailable
[CSharpBackend] DEBUG io.netty.util.internal.PlatformDependent - You don't have Javassist in your class path or you don't have enough permission to load dynamically generated classes. Please check the configuration for better performance.
```

Could this cause the errors? Thanks a lot.
Since you are able to run Pi or WordCount, I doubt that the failure has anything to do with CSharpBackend or the DEBUG messages copied above.
To keep the investigation simple, try to run your driver program using non-debug Mobius setup & Spark local mode (instead of using a remote Spark cluster) and share the entire output you see in the console.
Here is my console log in local mode:
```
...
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.NoSuchElementException: None.get
        at scala.None$.get(Option.scala:313)
        at scala.None$.get(Option.scala:311)
        at org.apache.spark.streaming.StreamingContext.<init>
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.api.csharp.CSharpBackendHandler.handleMethodCall(CSharpBackendHandler.scala:167)
        at org.apache.spark.api.csharp.CSharpBackendHandler.handleBackendRequest(CSharpBackendHandler.scala:103)
        at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:30)
        at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:27)
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
[2016-12-03 12:13:24,936] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
[2016-12-03 12:13:24,951] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] -
   at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 93

Unhandled Exception: System.Exception: JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
   at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 135
   at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(String className, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 46
   at Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy..ctor(String checkpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\StreamingContextIpcProxy.cs:line 64
   at Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateStreamingContext(String checkpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\SparkCLRIpcProxy.cs:line 91
   at Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(String checkpointPath, Func`1 creatingFunc) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Streaming\StreamingContext.cs:line 84
   at Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(String[] args) in C:\Mobius-master\examples\Streaming\Kafka\Program.cs:line 39
Exception caught: An existing connection was forcibly closed by the remote host
java.io.IOException: An existing connection was forcibly closed by the remote host
        at sun.nio.ch.SocketDispatcher.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
[CSharpRunner.main] closing CSharpBackend
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
        at io.netty.buffer.UnpooledUnsafeDirectByteBuf.setBytes(UnpooledUnsafeDirectByteBuf.java:447)
Requesting to close all call back sockets.
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
[CSharpRunner.main] Return CSharpBackend code -532462766
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
```
Looks like something went wrong when reading the checkpoint path. Have you ruled out any issue with HADOOP_HOME or winutils.exe by using HDFS paths in the word count or [HdfsWordCount streaming](https://github.com/Microsoft/Mobius/blob/master/notes/running-mobius-app.md#hdfswordcount-example-streaming) examples?
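A quick way to rule out the environment variables mentioned above is to check them directly. This is a hypothetical helper (not part of Mobius or Spark), assuming the standard `HADOOP_HOME\bin\winutils.exe` layout used on Windows:

```python
import os

def check_hadoop_env(environ=os.environ):
    """Return a list of problems with the Hadoop-on-Windows environment.

    Illustrative only: verifies HADOOP_HOME is set and that winutils.exe
    exists where Spark expects it (%HADOOP_HOME%\\bin\\winutils.exe).
    """
    problems = []
    hadoop_home = environ.get("HADOOP_HOME")
    if not hadoop_home:
        problems.append("HADOOP_HOME is not set")
    else:
        winutils = os.path.join(hadoop_home, "bin", "winutils.exe")
        if not os.path.isfile(winutils):
            problems.append("winutils.exe not found at " + winutils)
    return problems
```

An empty result means the basic layout looks right; it does not prove the HDFS cluster itself is reachable.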
I tested that my Hadoop configs work well. I think my issue relates to the JVM StreamingContext constructor calls, i.e. `SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.api.java.JavaStreamingContext", new object[] { checkpointPath });` and `SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper", new object[] { });`. I cannot figure out the problem here; maybe it is in my environment setup. Thanks for helping me.
Unfortunately I am not able to reproduce this error. I will try out a few things during the holidays to see if I can get the same error for further investigation.
@nghiadinhhieu - are you still having this issue?
Yes, I upgraded to the version 2 release, but I still hit the issue above. Thanks.
Hi @nghiadinhhieu have you had any luck in solving this issue?
Hello all, I ran the Pi and WordCount samples and they completed correctly, but when I run the Kafka example it shows these errors:
```
[ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.SparkContext when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=3], )
[2016-11-29 15:23:53,370] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - org.apache.spark.SparkException: Could not parse Master URL: ''
        at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2735)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:522)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.api.csharp.CSharpBackendHandler.handleMethodCall(CSharpBackendHandler.scala:167)
        at org.apache.spark.api.csharp.CSharpBackendHandler.handleBackendRequest(CSharpBackendHandler.scala:103)
        at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:30)
        at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:27)
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
[2016-11-29 15:23:53,370] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.SparkContext when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=3], )
[2016-11-29 15:23:53,370] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] -
   at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 90

Unhandled Exception: System.Exception: JVM method execution failed: Constructor failed for class org.apache.spark.SparkContext when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=3], )
   at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 135
   at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(String className, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 46
   at Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateSparkContext(ISparkConfProxy conf) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\SparkCLRIpcProxy.cs:line 64
   at Microsoft.Spark.CSharp.Core.SparkContext..ctor(String master, String appName, String sparkHome, SparkConf conf) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Core\SparkContext.cs:line 144
   at Microsoft.Spark.CSharp.Core.SparkContext..ctor(String master, String appName) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Core\SparkContext.cs:line 112
   at Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(String[] args) in C:\Mobius-master\examples\Streaming\Kafka\Program.cs:line 23
[2016-11-29 15:23:53,493] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check begin : weakReferences.Count = 3, checkCount: 10
[2016-11-29 15:23:53,493] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check end : released 0 garbage, remain 3 alive, used 0 ms : release garbage used 0 ms, store alive used 0 ms
Exception caught: An existing connection was forcibly closed by the remote host
java.io.IOException: An existing connection was forcibly closed by the remote host
[CSharpRunner.main] closing CSharpBackend
Requesting to close all call back sockets.
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
[CSharpRunner.main] Return CSharpBackend code -532462766
        at io.netty.buffer.UnpooledUnsafeDirectByteBuf.setBytes(UnpooledUnsafeDirectByteBuf.java:447)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
```
Please help me.