microsoft / Mobius

C# and F# language binding and extensions to Apache Spark
MIT License

Streaming Error #599

Open nghiadinhhieu opened 7 years ago

nghiadinhhieu commented 7 years ago

Hello all, I ran the Pi and WordCount samples and they complete correctly, but when I run the Kafka example it shows these errors:

    [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.SparkContext when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=3], )
    [2016-11-29 15:23:53,370] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - org.apache.spark.SparkException: Could not parse Master URL: ''
        at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2735)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:522)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.api.csharp.CSharpBackendHandler.handleMethodCall(CSharpBackendHandler.scala:167)
        at org.apache.spark.api.csharp.CSharpBackendHandler.handleBackendRequest(CSharpBackendHandler.scala:103)
        at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:30)
        at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:27)
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
    [2016-11-29 15:23:53,370] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.SparkContext when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=3], )
    [2016-11-29 15:23:53,370] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] -
        at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 90

    Unhandled Exception: System.Exception: JVM method execution failed: Constructor failed for class org.apache.spark.SparkContext when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=3], )
        at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 135
        at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(String className, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 46
        at Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateSparkContext(ISparkConfProxy conf) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\SparkCLRIpcProxy.cs:line 64
        at Microsoft.Spark.CSharp.Core.SparkContext..ctor(String master, String appName, String sparkHome, SparkConf conf) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Core\SparkContext.cs:line 144
        at Microsoft.Spark.CSharp.Core.SparkContext..ctor(String master, String appName) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Core\SparkContext.cs:line 112
        at Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(String[] args) in C:\Mobius-master\examples\Streaming\Kafka\Program.cs:line 23
    [2016-11-29 15:23:53,493] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check begin : weakReferences.Count = 3, checkCount: 10
    [2016-11-29 15:23:53,493] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check end : released 0 garbage, remain 3 alive, used 0 ms : release garbage used 0 ms, store alive used 0 ms
    Exception caught: An existing connection was forcibly closed by the remote host
    java.io.IOException: An existing connection was forcibly closed by the remote host
    [CSharpRunner.main] closing CSharpBackend
        at sun.nio.ch.SocketDispatcher.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    Requesting to close all call back sockets.
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    [CSharpRunner.main] Return CSharpBackend code -532462766
        at io.netty.buffer.UnpooledUnsafeDirectByteBuf.setBytes(UnpooledUnsafeDirectByteBuf.java:447)
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)

Please help me.

skaarthik commented 7 years ago

The error message is "Could not parse Master URL: ''". Are you passing a valid value for the master config setting? Can you share that value?
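For reference, the master URL is usually supplied on the submit command line rather than in code. A minimal local-mode invocation might look like the following sketch (the exe name and path are taken from the original report and may differ on your machine):

```shell
# Hypothetical local-mode submit: --master local[*] supplies the master URL,
# so the SparkContext constructor does not receive an empty string.
sparkclr-submit.cmd --master local[*] --exe SparkClrKafka.exe C:\Mobius-master\examples\Streaming\Kafka\bin\Debug
```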

nghiadinhhieu commented 7 years ago

Thanks for replying. I run the Kafka example in local mode. When I debug, it hits an error in the function public static StreamingContext GetOrCreate(string checkpointPath, Func<StreamingContext> creatingFunc), at this line:

    return new StreamingContext(SparkCLREnvironment.SparkCLRProxy.CreateStreamingContext(checkpointPath));

Here is the Call Stack for debug:

    Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(bool isStatic, object classNameOrJvmObjectReference, string methodName, object[] parameters) Line 135 C#
    Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(string className, object[] parameters) Line 46 C#
    Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy.StreamingContextIpcProxy(string checkpointPath) Line 64 C#
    Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateStreamingContext(string checkpointPath) Line 91 C#
    Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(string checkpointPath, System.Func<StreamingContext> creatingFunc) Line 84 C#

Exception: {"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )"}

The checkpointPath, i've already created in HDFS: const string checkpointPath = "hdfs://192.168.10.32:9000/checkpoint"

Here is my app.config:

<?xml version="1.0" encoding="utf-8"?>

I also ran this example in standalone cluster mode:

    sparkclr-submit.cmd --master spark://192.168.10.24:7077 --conf spark.local.dir=c:\temp --exe SparkClrKafka.exe C:\Mobius-master\examples\Streaming\Kafka\bin\Debug

and it shows these errors:

    [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
    [2016-11-30 10:13:08,284] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - java.util.NoSuchElementException: None.get
        at scala.None$.get(Option.scala:313)
        at scala.None$.get(Option.scala:311)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:108)
        at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:146)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.api.csharp.CSharpBackendHandler.handleMethodCall(CSharpBackendHandler.scala:167)
        at org.apache.spark.api.csharp.CSharpBackendHandler.handleBackendRequest(CSharpBackendHandler.scala:103)
        at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:30)
        at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:27)
        ... (netty channel pipeline and event loop frames, identical to the first trace)
        at java.lang.Thread.run(Thread.java:745)
    [2016-11-30 10:13:08,284] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
    [2016-11-30 10:13:08,284] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] -
        at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 90

    Unhandled Exception: System.Exception: JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
        at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 135
        at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(String className, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 46
        at Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy..ctor(String checkpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\StreamingContextIpcProxy.cs:line 64
        at Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateStreamingContext(String checkpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\SparkCLRIpcProxy.cs:line 91
        at Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(String checkpointPath, Func`1 creatingFunc) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Streaming\StreamingContext.cs:line 84
        at Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(String[] args) in C:\Mobius-master\examples\Streaming\Kafka\Program.cs:line 39
    [2016-11-30 10:13:12,468] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check begin : weakReferences.Count = 24, checkCount: 10
    [2016-11-30 10:13:12,468] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - Stop releasing as exceeded allowed checkCount: 10
    [2016-11-30 10:13:12,468] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check end : released 0 garbage, remain 24 alive, used 0 ms : release garbage used 0 ms, store alive used 0 ms
    Exception caught: An existing connection was forcibly closed by the remote host
    java.io.IOException: An existing connection was forcibly closed by the remote host
    [CSharpRunner.main] closing CSharpBackend
        at sun.nio.ch.SocketDispatcher.read0(Native Method)
        at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
        at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
        at sun.nio.ch.IOUtil.read(IOUtil.java:192)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    Requesting to close all call back sockets.
        at io.netty.buffer.UnpooledUnsafeDirectByteBuf.setBytes(UnpooledUnsafeDirectByteBuf.java:447)
    [CSharpRunner.main] Return CSharpBackend code -532462766
        at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
        at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
        ... (netty event loop frames, identical to the first trace)
        at java.lang.Thread.run(Thread.java:745)

skaarthik commented 7 years ago

Which version of Mobius are you using? Do not manually create the checkpoint directory. If the directory already exists, Spark will try to load a checkpoint from it; I guess that is what is failing for you.
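To get a clean start, any directory left over from earlier runs can be deleted before resubmitting; a sketch using the standard HDFS CLI (the path is the one from the report):

```shell
# Remove the stale checkpoint directory so Spark creates it fresh
# instead of trying to restore streaming state from it.
hdfs dfs -rm -r hdfs://192.168.10.32:9000/checkpoint
```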

If not creating the checkpoint directory does not help, avoid using checkpoints altogether until you figure out the root cause.
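One way to take checkpointing out of the picture while debugging is to construct the StreamingContext directly instead of going through GetOrCreate. This is only a sketch, assuming Mobius's StreamingContext(SparkContext, long durationMs) constructor; the app name and batch interval are illustrative:

```csharp
// Sketch: build the streaming context without any checkpoint path,
// so no recovery from HDFS is attempted.
var sparkContext = new SparkContext(new SparkConf().SetAppName("SparkCLRKafka Example"));
var ssc = new StreamingContext(sparkContext, 2000L); // 2-second batches, no checkpoint

// ... create the Kafka DStream against ssc here, then:
ssc.Start();
ssc.AwaitTermination();
```

If this runs, the problem is isolated to checkpoint recovery rather than the Kafka stream itself.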

nghiadinhhieu commented 7 years ago

I am using the latest Mobius version from https://github.com/Microsoft/Mobius. All unit tests pass, but at runtime I encounter errors in the streaming context.

I tried not creating the checkpoint directory manually, and I got a new error: {"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper when called with no parameters"}

Here is the Call Stack:

    Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(bool isStatic, object classNameOrJvmObjectReference, string methodName, object[] parameters) Line 135 C#
    Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(string className, object[] parameters) Line 46 C#
    Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy.DirectKafkaStream(System.Collections.Generic.List<string> topics, System.Collections.Generic.Dictionary<string, string> kafkaParams, System.Collections.Generic.Dictionary<string, long> fromOffsets) Line 226 C#
    Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Streaming.KafkaUtils.CreateDirectStream(Microsoft.Spark.CSharp.Streaming.StreamingContext ssc, System.Collections.Generic.List<string> topics, System.Collections.Generic.Dictionary<string, string> kafkaParams, System.Collections.Generic.Dictionary<string, long> fromOffsets) Line 95 C#
    SparkClrKafka.exe!Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main.AnonymousMethod__0() Line 45 C#
    Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(string checkpointPath, System.Func<StreamingContext> creatingFunc) Line 79 C#
    SparkClrKafka.exe!Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(string[] args) Line 39 C#

skaarthik commented 7 years ago

Can you share Spark's JVM logs? There is a failure in the JVM that triggers the Mobius failure. A web search will turn up how to set log4j settings for Spark.
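For reference, Spark's driver log level is typically controlled by conf/log4j.properties inside the Spark installation (copied from log4j.properties.template if it does not exist yet). A minimal fragment that raises verbosity might look like this; the pattern layout is the one from Spark's shipped template:

```properties
# Raise the root logger to DEBUG to capture JVM-side failure details.
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```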

nghiadinhhieu commented 7 years ago

This is my command line:

    sparkclr-submit.cmd --master local[*] --conf spark.local.dir=c:\temp --exe SparkClrKafka.exe C:\Mobius-master\examples\Streaming\Kafka\bin\Debug

And here are my Spark JVM logs:

spark.txt

skaarthik commented 7 years ago

Can you confirm whether you updated the code to set the topic name and broker list? https://github.com/Microsoft/Mobius/blob/master/examples/Streaming/Kafka/Program.cs#L24 https://github.com/Microsoft/Mobius/blob/master/examples/Streaming/Kafka/Program.cs#L28

nghiadinhhieu commented 7 years ago

Yes, I've already set these parameters, here is my code:

    var sparkContext = new SparkContext(new SparkConf().SetAppName("SparkCLRKafka Example"));
    const string topicName = "test";
    var topicList = new List<string> { topicName };
    var kafkaParams = new Dictionary<string, string> // refer to http://kafka.apache.org/documentation.html#configuration
    {
        { "metadata.broker.list", "192.168.10.135:9092" },
        { "auto.offset.reset", "smallest" }
    };

skaarthik commented 7 years ago

If you have set valid Kafka parameters, I cannot think of a reason for the failure. There are no exceptions in the Spark logs you shared, so it is not clear what is happening. Try setting the log level to DEBUG and see if you find anything useful.

nghiadinhhieu commented 7 years ago

Yes, in the Spark log I cannot see any errors, but when I debug the project I get two errors at two code lines:

  1. SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.api.java.JavaStreamingContext", new object[] { checkpointPath }); ==> even when I let Spark create the checkpointPath itself, it encounters errors once the path exists. The exception is: {"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )"}

  2. JvmObjectReference jhelper = SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper", new object[] { }); Exception is: {"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper when called with no parameters"}

I really don't know why; maybe some configuration suggestion from you could help figure out the problem.

I found spark logs DEBUG:

    [CSharpBackend] DEBUG io.netty.util.internal.PlatformDependent - Javassist: unavailable
    [CSharpBackend] DEBUG io.netty.util.internal.PlatformDependent - You don't have Javassist in your class path or you don't have enough permission to load dynamically generated classes. Please check the configuration for better performance.

Could this cause the errors?

Thanks a lot.

skaarthik commented 7 years ago

Since you are able to run Pi or WordCount, I doubt that the failure has anything to do with CSharpBackend or the DEBUG messages copied above.

To keep the investigation simple, try to run your driver program using non-debug Mobius setup & Spark local mode (instead of using a remote Spark cluster) and share the entire output you see in the console.

nghiadinhhieu commented 7 years ago

Here is my console log at local mode:

    ...
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        ... (netty channel pipeline and event loop frames, identical to the earlier traces)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: java.util.NoSuchElementException: None.get
        at scala.None$.get(Option.scala:313)
        at scala.None$.get(Option.scala:311)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:108)
        at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:146)
        ... 26 more
    methods:
        public void org.apache.spark.streaming.api.java.JavaStreamingContext.start()
        public void org.apache.spark.streaming.api.java.JavaStreamingContext.stop()
        public void org.apache.spark.streaming.api.java.JavaStreamingContext.stop(boolean)
        public void org.apache.spark.streaming.api.java.JavaStreamingContext.stop(boolean,boolean)
        ... (the backend goes on to dump every public method of JavaStreamingContext and java.lang.Object; elided here)
    args:
        argType: java.lang.String, argValue: hdfs://192.168.10.32:9000/checkpoint
    [2016-12-03 12:13:24,936] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
    [2016-12-03 12:13:24,936] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - java.util.NoSuchElementException: None.get
        at scala.None$.get(Option.scala:313)
        at scala.None$.get(Option.scala:311)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:108)
        at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:146)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        ... (remaining frames identical to the earlier traces; the pasted log is truncated here)
io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThread EventExecutor.java:111) at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorato r.run(DefaultThreadFactory.java:137) at java.lang.Thread.run(Thread.java:745)

[2016-12-03 12:13:24,936] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBri dge] - JVM method execution failed: Constructor failed for class org.apache.spar k.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index= 1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], ) [2016-12-03 12:13:24,951] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBri dge] -



at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStat ic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters ) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridg e.cs:line 93



Unhandled Exception: System.Exception: JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when c alled with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/ checkpoint], ) at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStat ic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters ) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridg e.cs:line 135 at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(String classN ame, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSh arp\Interop\Ipc\JvmBridge.cs:line 46 at Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy..ctor(String che ckpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc \StreamingContextIpcProxy.cs:line 64 at Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateStreamingContext(S tring checkpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\ Proxy\Ipc\SparkCLRIpcProxy.cs:line 91 at Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(String check pointPath, Func`1 creatingFunc) in C:\Mobius-master\csharp\Adapter\Microsoft.Spa rk.CSharp\Streaming\StreamingContext.cs:line 84 at Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(String[] args) i n C:\Mobius-master\examples\Streaming\Kafka\Program.cs:line 39 .Exception caught: An existing connection was forcibly closed by the remote host

java.io.IOException: An existing connection was forcibly closed by the remote ho st at sun.nio.ch.SocketDispatcher.read0(Native Method) at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43) [CSharpRunner.main] closing CSharpBackend at sun.nio.ch.IOUtil.readIntoNat iveBuffer(IOUtil.java:223)

    at sun.nio.ch.IOUtil.read(IOUtil.java:192)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    at io.netty.buffer.UnpooledUnsafeDirectByteBuf.setBytes(UnpooledUnsafeDi

rectByteBuf.java:447)Requesting to close all call back sockets.

    at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)

[CSharpRunner.main] Return CSharpBackend code -532462766 at io.netty.chan nel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)

    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(Abstra

ctNioByteChannel.java:119) at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.jav a:511) at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEve ntLoop.java:468) at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.ja va:382) at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThread EventExecutor.java:111) at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorato r.run(DefaultThreadFactory.java:137) at java.lang.Thread.run(Thread.java:745)
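For reference, the call path this stack trace points at (`SparkClrKafkaExample.Main` calling `StreamingContext.GetOrCreate`) looks roughly like the sketch below. This is a hedged reconstruction, not the actual example code: the master URL, app name, and 2000 ms batch interval are illustrative assumptions. It also suggests the two things the log complains about — an empty `spark.master` (`Could not parse Master URL: ''`) and a checkpoint path the JVM cannot restore from (`None.get` in the `JavaStreamingContext` constructor).

```csharp
// Hedged sketch of the failing call path, assuming Mobius' public API.
// "local[*]", the app name, and the batch interval are illustrative only.
var sparkConf = new SparkConf()
    .SetMaster("local[*]")   // an unset/empty master yields "Could not parse Master URL: ''"
    .SetAppName("SparkClrKafkaExample");
var sparkContext = new SparkContext(sparkConf);

// GetOrCreate first tries to restore a context from the checkpoint path
// (that is the JavaStreamingContext(path) constructor in the trace above);
// only when no checkpoint exists is the factory delegate used instead.
var ssc = StreamingContext.GetOrCreate(
    "hdfs://192.168.10.32:9000/checkpoint",
    () => new StreamingContext(sparkContext, 2000));
```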

skaarthik commented 7 years ago

Looks like something went wrong when reading the checkpoint path. Have you ruled out any issue with HADOOP_HOME or winutils.exe by using HDFS paths in the [word count](https://github.com/Microsoft/Mobius/blob/master/notes/running-mobius-app.md) or [word count streaming](https://github.com/Microsoft/Mobius/blob/master/notes/running-mobius-app.md#hdfswordcount-example-streaming) examples?
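One way to rule out the HADOOP_HOME/winutils.exe angle mechanically is a small standalone check like the hypothetical helper below (not part of Mobius; the path layout is the conventional `%HADOOP_HOME%\bin\winutils.exe` on Windows):

```csharp
// Hypothetical sanity-check helper: confirm HADOOP_HOME is set and
// winutils.exe is present before suspecting Spark or Mobius itself.
using System;
using System.IO;

class HadoopEnvCheck
{
    static void Main()
    {
        var hadoopHome = Environment.GetEnvironmentVariable("HADOOP_HOME");
        if (string.IsNullOrEmpty(hadoopHome))
        {
            Console.WriteLine("HADOOP_HOME is not set");
            return;
        }
        var winutils = Path.Combine(hadoopHome, "bin", "winutils.exe");
        Console.WriteLine(File.Exists(winutils)
            ? "winutils.exe found at " + winutils
            : "winutils.exe missing at " + winutils);
    }
}
```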

nghiadinhhieu commented 7 years ago

I tested my Hadoop configs and they work well. I think my issue relates to the JVM `StreamingContext` constructor calls, i.e. `SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.api.java.JavaStreamingContext", ...)` and `SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper", new object[] { });`. I cannot figure out the problem here; maybe it is in my environment setup. Thanks for helping me.

skaarthik commented 7 years ago

Unfortunately, I am not able to reproduce this error. I will try out a few things during the holidays to see if I can get the same error for further investigation.

skaarthik commented 7 years ago

@nghiadinhhieu - are you still having this issue?

nghiadinhhieu commented 7 years ago

Yeah, I upgraded to the version 2 release, but the issue above persists. Thanks.

Mrsevic commented 7 years ago

Hi @nghiadinhhieu, have you had any luck solving this issue?