sasha-polev / aerospark

Aerospike Spark Connector
Apache License 2.0

NPE: I'm not sure if it's related to an empty table or not. #12

Closed · sasi21033 closed this issue 8 years ago

sasi21033 commented 8 years ago

java.lang.NullPointerException
    at com.osscube.spark.aerospike.rdd.AeroRelation.schema(AeroRelation.scala:78)
    at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:31)
    at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:97)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
    at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:69)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:140)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:138)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:138)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:927)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:927)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:129)
    at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:719)

sasha-polev commented 8 years ago

Hi Sasi

We have taken the approach of automatically discovering the data types of the bins from one row of data. In Aerospike, bins do not have a declared type, so there is no other way to query this metadata. Defining a DataFrame on an empty set therefore can never work (it is impossible in principle, unless someone suggests another workable solution). We can replace the error with something more meaningful, but we cannot make it work.
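
To make the limitation concrete, here is a rough sketch of what single-row type discovery looks like. This is illustrative only, not the connector's actual code: the `inferBinType` and `inferSchema` helpers and the `Map[String, Any]` shape of a sample record are assumptions for the example.

```scala
import org.apache.spark.sql.types._

// Illustrative only: map one sample bin value to a Spark SQL type.
// Real bin values would come from the Aerospike client; `Any` stands in here.
def inferBinType(sample: Any): DataType = sample match {
  case _: java.lang.Long   => LongType
  case _: java.lang.Double => DoubleType
  case _: String           => StringType
  case _                   => BinaryType // fall back for blobs/unknown values
}

// Build a schema from one sample record's bins (bin name -> value).
// With an empty set there is no sample record at all, so no schema can be
// derived -- which is why the DataFrame cannot be defined on an empty set.
def inferSchema(sampleBins: Map[String, Any]): StructType =
  StructType(sampleBins.map { case (name, value) =>
    StructField(name, inferBinType(value), nullable = true)
  }.toSeq)
```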

The only thing you can do with an empty set is to define an aeroRDD without a schema and then apply the schema manually, as sketched below.
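
For reference, a minimal sketch of that manual-schema workaround in a Spark 1.x shell (where `sc` and `sqlContext` already exist). The column names, the types, and the `rawRdd` placeholder are assumptions for illustration; substitute the RDD you actually obtain from the connector and adapt the row mapping to its element type.

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.Row
import org.apache.spark.sql.types._

// Declare the schema you expect the set to have once data arrives.
val schema = StructType(Seq(
  StructField("key",    StringType, nullable = true),
  StructField("amount", LongType,   nullable = true)
))

// `rawRdd` stands in for whatever RDD the connector returns for the set;
// here it is an empty placeholder just to show the mechanics.
val rawRdd: RDD[(String, Long)] = sc.parallelize(Seq.empty[(String, Long)])

// Convert to Rows and apply the schema explicitly. This works even when the
// RDD is empty, because no type inference from the data is required.
val rowRdd: RDD[Row] = rawRdd.map { case (k, v) => Row(k, v) }
val df = sqlContext.createDataFrame(rowRdd, schema)
df.registerTempTable("my_aero_set")
```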

Thanks Sasha