Closed sorabh89 closed 9 years ago
Sorry for replying late. Do you still have this issue? Have you specified the Kafka-related properties correctly for the Receivers to consume from the Kafka cluster? Can you share the properties you specified for the same in your driver code...
Thanks for your reply,
I have resolved this issue by setting the master to local[*]. But I am facing one more issue, related to 'akka.version'. I know that this issue is caused by the 'reference.conf' file, but I am not able to find a solution for it.
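For reference, setting the master in the driver looks like this (a minimal sketch against the Spark 1.x Java API; the app name is a placeholder):

```java
import org.apache.spark.SparkConf;

public class DriverConfig {
    public static void main(String[] args) {
        // local[*] runs Spark in-process with one worker thread per logical core,
        // so no standalone cluster is required while testing locally.
        SparkConf conf = new SparkConf()
                .setAppName("KafkaConsumerTest") // placeholder name
                .setMaster("local[*]");
        System.out.println(conf.get("spark.master"));
    }
}
```

Note that a master hard-coded in the driver overrides whatever is passed on the spark-submit command line, so this is best kept to local testing.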
And one more issue: this one is actually not related to this particular project but to Spark SQL.
I am getting the following error:
Exception in thread "main" scala.MatchError: class com.test.dto.ReceivedEvent$Payload (of class java.lang.Class)
    at org.apache.spark.sql.api.java.JavaSQLContext$$anonfun$getSchema$1.apply(JavaSQLContext.scala:189)
    at org.apache.spark.sql.api.java.JavaSQLContext$$anonfun$getSchema$1.apply(JavaSQLContext.scala:188)
due to the following line of code:
JavaSchemaRDD schemaRDD = sqlContext.applySchema(p, ReceivedEvent.class);
Following is an overview of the class:
public class ReceivedEvent implements Serializable {
    String event;
    Long timestamp;
    String eventType;
    Payload payload; // nested custom type
}
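The MatchError names ReceivedEvent$Payload, which suggests that getSchema in the Spark 1.x JavaSQLContext can only map JavaBean properties of primitive/String-like types and does not match a nested custom class. One possible workaround, assuming Payload only carries simple fields, is to flatten it into the top-level bean and expose public getters (the payloadBody field name here is purely illustrative):

```java
import java.io.Serializable;

// Hypothetical flattened version of ReceivedEvent: the nested Payload
// fields are inlined as simple types that schema inference can map directly.
public class ReceivedEventFlat implements Serializable {
    private String event;
    private Long timestamp;
    private String eventType;
    private String payloadBody; // assumption: Payload held a simple String body

    // Schema inference inspects public JavaBean accessors, so getters/setters
    // are required rather than bare package-private fields.
    public String getEvent() { return event; }
    public void setEvent(String event) { this.event = event; }
    public Long getTimestamp() { return timestamp; }
    public void setTimestamp(Long timestamp) { this.timestamp = timestamp; }
    public String getEventType() { return eventType; }
    public void setEventType(String eventType) { this.eventType = eventType; }
    public String getPayloadBody() { return payloadBody; }
    public void setPayloadBody(String payloadBody) { this.payloadBody = payloadBody; }
}
```

With a flattened bean, applySchema(rdd, ReceivedEventFlat.class) should be able to resolve each getter to a column instead of hitting the MatchError on the nested type.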
I would appreciate your assistance.
Thanks,
Hi dibbhatt,
following is the stack trace for the error:
Exception in thread "main" java.lang.ExceptionInInitializerError
Caused by: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:124)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:145)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:151)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:159)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:164)
at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:206)
at akka.actor.ActorSystem$Settings.
If you find a solution for this, it will be of great help.
Thanks,
Not sure if this is an issue with the Spark Kafka Consumer, so closing the issue. Just FYI, not sure if this will help you: http://apache-spark-user-list.1001560.n3.nabble.com/Packaging-a-spark-job-using-maven-td5615.html
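The linked thread is about jar packaging, which matches the symptom: a commonly cited cause of "No configuration setting found for key 'akka.version'" (not verified against this project) is that a fat jar overwrites Akka's reference.conf instead of merging it. With the maven-shade-plugin, an AppendingTransformer along these lines is the usual remedy:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <transformers>
      <!-- Append every reference.conf on the classpath into a single file,
           so Akka's akka.version entry survives the shading step. -->
      <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
        <resource>reference.conf</resource>
      </transformer>
    </transformers>
  </configuration>
</plugin>
```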
Hi dibbhatt,
I'm not able to figure out why I am not getting the content in my batch. I have added an SOP (System.out.println) message in the foreachRDD().call() method, but it seems this method is not getting executed at all. Following is my piece of code:
props.put("consumer.forcefromstart", "false");
props.put("consumer.fetchsizebytes", "524288");
props.put("consumer.fillfreqms", "2000");
I am new to Spark so not very comfortable with it; please let me know when this call() method gets executed, and whether I need a Spark cluster to run this.
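On when call() runs: foreachRDD registers an output operation, and its call() only fires once per batch interval after the streaming context has been started. With master local[*] no cluster is needed. Also note that consumer.forcefromstart=false means only messages produced after startup are fetched, so an idle topic yields empty batches. A hedged sketch of the overall shape against the Spark 1.x API (a socket stream stands in here for the Kafka receiver, and names are placeholders):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class ForeachRddExample {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf()
                .setAppName("ForeachRddExample")
                .setMaster("local[*]");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(2000));

        // Placeholder source: substitute the Kafka receiver stream here.
        JavaReceiverInputDStream<String> messages = jssc.socketTextStream("localhost", 9999);

        // foreachRDD registers an output operation; call() runs on the driver
        // once per batch interval, but only after jssc.start() is invoked.
        messages.foreachRDD(new Function<JavaRDD<String>, Void>() {
            @Override
            public Void call(JavaRDD<String> rdd) {
                System.out.println("batch size = " + rdd.count());
                return null;
            }
        });

        jssc.start();
        jssc.awaitTermination(); // without this the driver exits before any batch fires
    }
}
```

If the SOP never prints, the usual suspects are a missing jssc.start()/awaitTermination() pair, or local[1] leaving no thread free for processing because the receiver occupies the only core, which is another reason local[*] matters here.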
Great Thanks,