TIBCOSoftware / snappydata

Project SnappyData - memory optimized analytics database, based on Apache Spark™ and Apache Geode™. Stream, Transact, Analyze, Predict in one cluster
http://www.snappydata.io

Getting Exception java.lang.ClassNotFoundException: org.apache.spark.sql.execution.datasources.CaseInsensitiveMap #512

Open · rahulbsw opened this issue 7 years ago

rahulbsw commented 7 years ago

Hi, I am trying the article below on the Databricks Community portal:

https://dzone.com/articles/joining-a-billion-rows-20x-faster-than-apache-spar?utm_campaign=Feed:%20dzone%2Fperformance&utm_medium=feed&utm_source=feedpress.me

I have attached the jar from spark-packages (snappydata-0.7-s_2.11), and I am trying this on a Spark 2.1 / Scala 2.11 cluster:

    // turn off compression for this test
    //val snappy = new SnappySession(spark.sparkContext)
    //val snappy = new org.apache.spark.sql.SnappySession(spark.sparkContext)
    val snappy = new org.apache.spark.sql.SnappySession(spark.sparkContext)
    snappy.sql("set spark.sql.inMemoryColumnarStorage.compressed=false")
    snappy.sql("drop table if exists rangeTest")
    snappy.sql("create table rangeTest (id bigint not null) using column")
    snappy.range(1000L * 1000 * 1000).write.insertInto("rangeTest")
    snappy.table("rangeTest").selectExpr("sum(id)").show()
    //benchmark("SnappyData (sum of a billion)") {
    snappy.table("rangeTest").selectExpr("sum(id)").show()
    //}

This fails with the exception below:

    java.lang.NoClassDefFoundError: org/apache/spark/sql/execution/datasources/CaseInsensitiveMap

Full stack trace:

java.lang.NoClassDefFoundError: org/apache/spark/sql/execution/datasources/CaseInsensitiveMap
    at org.apache.spark.sql.SnappyContext$.<init>(SnappyContext.scala:795)
    at org.apache.spark.sql.SnappyContext$.<clinit>(SnappyContext.scala)
    at org.apache.spark.sql.internal.SnappyConf.<init>(SnappySessionState.scala:271)
    at org.apache.spark.sql.internal.SnappySessionState.conf$lzycompute(SnappySessionState.scala:152)
    at org.apache.spark.sql.internal.SnappySessionState.conf(SnappySessionState.scala:152)
    at org.apache.spark.sql.internal.SnappySessionState.conf(SnappySessionState.scala:53)
    at org.apache.spark.sql.internal.SessionState$$anonfun$1.apply(SessionState.scala:171)
    at org.apache.spark.sql.internal.SessionState$$anonfun$1.apply(SessionState.scala:170)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:170)
    at org.apache.spark.sql.internal.SnappySessionState.<init>(SnappySessionState.scala:54)
    at org.apache.spark.sql.SnappySession.liftedTree1$1(SnappySession.scala:113)
    at org.apache.spark.sql.SnappySession.sessionState$lzycompute(SnappySession.scala:106)
    at org.apache.spark.sql.SnappySession.sessionState(SnappySession.scala:105)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:121)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:73)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:37)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:60)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw$$iw$$iw$$iw.<init>(<console>:62)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw$$iw$$iw.<init>(<console>:64)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw$$iw.<init>(<console>:66)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw.<init>(<console>:68)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$eval$.$print$lzycompute(<console>:7)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$eval$.$print(<console>:6)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.execution.datasources.CaseInsensitiveMap
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader.loadClass(ClassLoaders.scala:150)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.SnappyContext$.<init>(SnappyContext.scala:795)
    at org.apache.spark.sql.SnappyContext$.<clinit>(SnappyContext.scala)
    at org.apache.spark.sql.internal.SnappyConf.<init>(SnappySessionState.scala:271)
    at org.apache.spark.sql.internal.SnappySessionState.conf$lzycompute(SnappySessionState.scala:152)
    at org.apache.spark.sql.internal.SnappySessionState.conf(SnappySessionState.scala:152)
    at org.apache.spark.sql.internal.SnappySessionState.conf(SnappySessionState.scala:53)
    at org.apache.spark.sql.internal.SessionState$$anonfun$1.apply(SessionState.scala:171)
    at org.apache.spark.sql.internal.SessionState$$anonfun$1.apply(SessionState.scala:170)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:170)
    at org.apache.spark.sql.internal.SnappySessionState.<init>(SnappySessionState.scala:54)
    at org.apache.spark.sql.SnappySession.liftedTree1$1(SnappySession.scala:113)
    at org.apache.spark.sql.SnappySession.sessionState$lzycompute(SnappySession.scala:106)
    at org.apache.spark.sql.SnappySession.sessionState(SnappySession.scala:105)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:121)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:73)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:37)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:60)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw$$iw$$iw$$iw.<init>(<console>:62)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw$$iw$$iw.<init>(<console>:64)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw$$iw.<init>(<console>:66)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$read$$iw.<init>(<console>:68)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$eval$.$print$lzycompute(<console>:7)
    at linef7b637e9c12944ad9f60a68b19e83b8225.$eval$.$print(<console>:6)
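
The missing class suggests a binary mismatch rather than a packaging problem: snappydata-0.7 was built against Spark 2.0.x, where org.apache.spark.sql.execution.datasources.CaseInsensitiveMap exists, while Spark 2.1 appears to keep that class under org.apache.spark.sql.catalyst.util instead. A minimal probe from the Spark shell, as a hedged sketch:

    // Class name referenced by snappydata-0.7 (the Spark 2.0.x internal
    // location); on a Spark 2.1 driver this throws the same
    // ClassNotFoundException seen above.
    Class.forName("org.apache.spark.sql.execution.datasources.CaseInsensitiveMap")

    // Spark 2.1 location (an assumption based on the Spark 2.1 source tree):
    Class.forName("org.apache.spark.sql.catalyst.util.CaseInsensitiveMap")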

ymahajan commented 7 years ago

Hi rahulbsw,

SnappyData doesn't support Spark 2.1 yet. We are working on Spark 2.1 integration, and it should be available in an upcoming SnappyData release.

Please try with Spark 2.0.2.
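
For reference, a launch against a matching install might look like the sketch below; the exact spark-packages coordinate is an assumption based on the package name mentioned above.

    # Hedged sketch: Spark 2.0.2 shell with the matching SnappyData package
    $SPARK_HOME/bin/spark-shell --packages SnappyDataInc:snappydata:0.7-s_2.11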

rahulbsw commented 7 years ago

Hi @ymahajan, I tried with Spark 2.0.2 and I am getting the exception below:

NanoTimer::Problem loading library from URL path: /tmp/libgemfirexd64.so: java.lang.UnsatisfiedLinkError: no gemfirexd64 in java.library.path
NanoTimer::Problem loading library from URL path: /tmp/libgemfirexd64.so: java.lang.UnsatisfiedLinkError: no gemfirexd64 in java.library.path

[TRACE 2017/02/18 21:30:09.159 UTC GFXD:error:FabricServiceAPI <WRAPPER-ReplId-6fe37-6eed6-ca91c-3> tid=0xb5] XJ040 error occurred while starting server : java.sql.SQLException(XJ040): Failed to start database 'gemfirexd', see the cause for details.
java.sql.SQLException(XJ040): Failed to start database 'gemfirexd', see the cause for details.
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedSQLException.wrapStandardException(EmbedSQLException.java:158)
    at com.pivotal.gemfirexd.internal.impl.jdbc.TransactionResourceImpl.wrapInSQLException(TransactionResourceImpl.java:722)
    at com.pivotal.gemfirexd.internal.impl.jdbc.TransactionResourceImpl.handleException(TransactionResourceImpl.java:652)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection.handleException(EmbedConnection.java:2944)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection.<init>(EmbedConnection.java:701)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection30.<init>(EmbedConnection30.java:94)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection40.<init>(EmbedConnection40.java:75)
    at com.pivotal.gemfirexd.internal.jdbc.Driver40.getNewEmbedConnection(Driver40.java:95)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:351)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:219)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:195)
    at com.pivotal.gemfirexd.internal.jdbc.AutoloadedDriver.connect(AutoloadedDriver.java:141)
    at com.pivotal.gemfirexd.internal.engine.fabricservice.FabricServiceImpl.startImpl(FabricServiceImpl.java:290)
    at com.pivotal.gemfirexd.internal.engine.fabricservice.FabricServerImpl.start(FabricServerImpl.java:60)
    at io.snappydata.impl.ServerImpl.start(ServerImpl.scala:32)
    at io.snappydata.util.ServiceUtils$.invokeStartFabricServer(ServiceUtils.scala:69)
    at org.apache.spark.sql.SnappyContext$.invokeServices(SnappyContext.scala:1049)
    at org.apache.spark.sql.SnappyContext$.initGlobalSnappyContext(SnappyContext.scala:1013)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:123)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:73)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw$$iw$$iw.<init>(<console>:34)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw$$iw.<init>(<console>:57)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw.<init>(<console>:59)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw.<init>(<console>:61)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$eval$.$print$lzycompute(<console>:7)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$eval$.$print(<console>:6)
Caused by: ERROR XJ040: Failed to start database 'gemfirexd', see the cause for details.
    at com.pivotal.gemfirexd.internal.iapi.error.StandardException.newException(StandardException.java:473)
    at com.pivotal.gemfirexd.internal.engine.db.FabricDatabase.postCreate(FabricDatabase.java:585)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection.<init>(EmbedConnection.java:656)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection30.<init>(EmbedConnection30.java:94)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection40.<init>(EmbedConnection40.java:75)
    at com.pivotal.gemfirexd.internal.jdbc.Driver40.getNewEmbedConnection(Driver40.java:95)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:351)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:219)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:195)
    at com.pivotal.gemfirexd.internal.jdbc.AutoloadedDriver.connect(AutoloadedDriver.java:141)
    at com.pivotal.gemfirexd.internal.engine.fabricservice.FabricServiceImpl.startImpl(FabricServiceImpl.java:290)
    at com.pivotal.gemfirexd.internal.engine.fabricservice.FabricServerImpl.start(FabricServerImpl.java:60)
    at io.snappydata.impl.ServerImpl.start(ServerImpl.scala:32)
    at io.snappydata.util.ServiceUtils$.invokeStartFabricServer(ServiceUtils.scala:69)
    at org.apache.spark.sql.SnappyContext$.invokeServices(SnappyContext.scala:1049)
    at org.apache.spark.sql.SnappyContext$.initGlobalSnappyContext(SnappyContext.scala:1013)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:123)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:73)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw$$iw$$iw.<init>(<console>:34)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw$$iw.<init>(<console>:57)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw.<init>(<console>:59)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw.<init>(<console>:61)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$eval$.$print$lzycompute(<console>:7)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$eval$.$print(<console>:6)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: javax.jdo.JDODataStoreException: Required table missing : "DBS" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.autoCreateTables"
NestedThrowables:
org.datanucleus.store.rdbms.exceptions.MissingTableException: Required table missing : "DBS" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.autoCreateTables"
    at com.pivotal.gemfirexd.internal.engine.store.GemFireStore.initExternalCatalog(GemFireStore.java:2312)
    at com.pivotal.gemfirexd.internal.engine.db.FabricDatabase.postCreate(FabricDatabase.java:537)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection.<init>(EmbedConnection.java:656)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection30.<init>(EmbedConnection30.java:94)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection40.<init>(EmbedConnection40.java:75)
    at com.pivotal.gemfirexd.internal.jdbc.Driver40.getNewEmbedConnection(Driver40.java:95)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:351)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:219)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:195)
    at com.pivotal.gemfirexd.internal.jdbc.AutoloadedDriver.connect(AutoloadedDriver.java:141)
    at com.pivotal.gemfirexd.internal.engine.fabricservice.FabricServiceImpl.startImpl(FabricServiceImpl.java:290)
    at com.pivotal.gemfirexd.internal.engine.fabricservice.FabricServerImpl.start(FabricServerImpl.java:60)
    at io.snappydata.impl.ServerImpl.start(ServerImpl.scala:32)
    at io.snappydata.util.ServiceUtils$.invokeStartFabricServer(ServiceUtils.scala:69)
    at org.apache.spark.sql.SnappyContext$.invokeServices(SnappyContext.scala:1049)
    at org.apache.spark.sql.SnappyContext$.initGlobalSnappyContext(SnappyContext.scala:1013)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:123)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:73)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw$$iw$$iw.<init>(<console>:34)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw$$iw.<init>(<console>:57)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw.<init>(<console>:59)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw.<init>(<console>:61)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$eval$.$print$lzycompute(<console>:7)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$eval$.$print(<console>:6)
Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: javax.jdo.JDODataStoreException: Required table missing : "DBS" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.autoCreateTables"
NestedThrowables:
org.datanucleus.store.rdbms.exceptions.MissingTableException: Required table missing : "DBS" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.autoCreateTables"
    at io.snappydata.impl.SnappyHiveCatalog.<init>(SnappyHiveCatalog.java:82)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.Class.newInstance(Class.java:442)
    at com.pivotal.gemfirexd.internal.engine.store.GemFireStore.initExternalCatalog(GemFireStore.java:2296)
    at com.pivotal.gemfirexd.internal.engine.db.FabricDatabase.postCreate(FabricDatabase.java:537)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection.<init>(EmbedConnection.java:656)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection30.<init>(EmbedConnection30.java:94)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection40.<init>(EmbedConnection40.java:75)
    at com.pivotal.gemfirexd.internal.jdbc.Driver40.getNewEmbedConnection(Driver40.java:95)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:351)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:219)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:195)
    at com.pivotal.gemfirexd.internal.jdbc.AutoloadedDriver.connect(AutoloadedDriver.java:141)
    at com.pivotal.gemfirexd.internal.engine.fabricservice.FabricServiceImpl.startImpl(FabricServiceImpl.java:290)
    at com.pivotal.gemfirexd.internal.engine.fabricservice.FabricServerImpl.start(FabricServerImpl.java:60)
    at io.snappydata.impl.ServerImpl.start(ServerImpl.scala:32)
    at io.snappydata.util.ServiceUtils$.invokeStartFabricServer(ServiceUtils.scala:69)
    at org.apache.spark.sql.SnappyContext$.invokeServices(SnappyContext.scala:1049)
    at org.apache.spark.sql.SnappyContext$.initGlobalSnappyContext(SnappyContext.scala:1013)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:123)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:73)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw$$iw$$iw.<init>(<console>:34)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw$$iw.<init>(<console>:57)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw.<init>(<console>:59)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw.<init>(<console>:61)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$eval$.$print$lzycompute(<console>:7)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$eval$.$print(<console>:6)
Caused by: java.util.concurrent.ExecutionException: javax.jdo.JDODataStoreException: Required table missing : "DBS" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.autoCreateTables"
NestedThrowables:
org.datanucleus.store.rdbms.exceptions.MissingTableException: Required table missing : "DBS" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.autoCreateTables"
    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.util.concurrent.FutureTask.get(FutureTask.java:192)
    at io.snappydata.impl.SnappyHiveCatalog.<init>(SnappyHiveCatalog.java:80)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.Class.newInstance(Class.java:442)
    at com.pivotal.gemfirexd.internal.engine.store.GemFireStore.initExternalCatalog(GemFireStore.java:2296)
    at com.pivotal.gemfirexd.internal.engine.db.FabricDatabase.postCreate(FabricDatabase.java:537)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection.<init>(EmbedConnection.java:656)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection30.<init>(EmbedConnection30.java:94)
    at com.pivotal.gemfirexd.internal.impl.jdbc.EmbedConnection40.<init>(EmbedConnection40.java:75)
    at com.pivotal.gemfirexd.internal.jdbc.Driver40.getNewEmbedConnection(Driver40.java:95)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:351)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:219)
    at com.pivotal.gemfirexd.internal.jdbc.InternalDriver.connect(InternalDriver.java:195)
    at com.pivotal.gemfirexd.internal.jdbc.AutoloadedDriver.connect(AutoloadedDriver.java:141)
    at com.pivotal.gemfirexd.internal.engine.fabricservice.FabricServiceImpl.startImpl(FabricServiceImpl.java:290)
    at com.pivotal.gemfirexd.internal.engine.fabricservice.FabricServerImpl.start(FabricServerImpl.java:60)
    at io.snappydata.impl.ServerImpl.start(ServerImpl.scala:32)
    at io.snappydata.util.ServiceUtils$.invokeStartFabricServer(ServiceUtils.scala:69)
    at org.apache.spark.sql.SnappyContext$.invokeServices(SnappyContext.scala:1049)
    at org.apache.spark.sql.SnappyContext$.initGlobalSnappyContext(SnappyContext.scala:1013)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:123)
    at org.apache.spark.sql.SnappySession.<init>(SnappySession.scala:73)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw$$iw$$iw.<init>(<console>:34)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw$$iw.<init>(<console>:57)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw$$iw.<init>(<console>:59)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$read$$iw.<init>(<console>:61)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$eval$.$print$lzycompute(<console>:7)
    at line0f2aabcbfeba40dcb0496efc2693b6f525.$eval$.$print(<console>:6)
Caused by: javax.jdo.JDODataStoreException: Required table missing : "DBS" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.autoCreateTables"
NestedThrowables:
org.datanucleus.store.rdbms.exceptions.MissingTableException: Required table missing : "DBS" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.autoCreateTables"
    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:461)
    at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:732)
    at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
    at org.apache.hadoop.hive.metastore.ObjectStore.createDatabase(ObjectStore.java:521)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
    at com.sun.proxy.$Proxy25.createDatabase(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:604)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:181)
    at io.snappydata.impl.SnappyHiveCatalog$HMSQuery.initHMC(SnappyHiveCatalog.java:259)
    at io.snappydata.impl.SnappyHiveCatalog$HMSQuery.call(SnappyHiveCatalog.java:193)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.datanucleus.store.rdbms.exceptions.MissingTableException: Required table missing : "DBS" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.autoCreateTables"
    at org.datanucleus.store.rdbms.table.AbstractTable.exists(AbstractTable.java:485)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3380)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3190)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841)
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605)
    at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2045)
    at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1365)
    at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3827)
    at org.datanucleus.state.JDOStateManager.setIdentity(JDOStateManager.java:2571)
    at org.datanucleus.state.JDOStateManager.initialiseForPersistentNew(JDOStateManager.java:513)
    at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:232)
    at org.datanucleus.ExecutionContextImpl.newObjectProviderForPersistentNew(ExecutionContextImpl.java:1414)
    at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2218)
    at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:2065)
    at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1913)
    at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
    at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:727)
    at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
    at org.apache.hadoop.hive.metastore.ObjectStore.createDatabase(ObjectStore.java:521)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
    at com.sun.proxy.$Proxy25.createDatabase(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:604)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:181)
    at io.snappydata.impl.SnappyHiveCatalog$HMSQuery.initHMC(SnappyHiveCatalog.java:259)
    at io.snappydata.impl.SnappyHiveCatalog$HMSQuery.call(SnappyHiveCatalog.java:193)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
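
The innermost cause is an uninitialized Hive metastore schema: the "DBS" table the metastore expects is missing, and the DataNucleus message itself points at its datanucleus.autoCreateTables switch. In a plain Spark/Hive setup that property can be forwarded through the Spark conf as sketched below; whether SnappyData's embedded store honors it the same way is an assumption.

    // Hedged sketch: forward the DataNucleus auto-create flag to the Hive
    // metastore client via Spark's spark.hadoop.* configuration pass-through.
    val spark = org.apache.spark.sql.SparkSession.builder()
      .config("spark.hadoop.datanucleus.autoCreateTables", "true")
      .getOrCreate()
    val snappy = new org.apache.spark.sql.SnappySession(spark.sparkContext)
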
ymahajan commented 7 years ago

@rahulbsw Can you please describe the steps you tried, including the command used to launch spark-shell?

rahulbsw commented 7 years ago

@ymahajan I am using a Databricks Community notebook, similar to a Zeppelin/IPython notebook. All I need to do is create an EC2 Databricks cluster (where I can choose the Spark version and resources) and attach the required jar (in this case, the snappydata spark-package from the Maven repo). I have attached screenshots below.

Notebook screenshot: snappydata-databrick

Cluster screenshot