Closed. vishalpsheth closed this issue 8 years ago.
In the first case, I see that you are trying to load into lineitem as the actian user, but you say that table lineitem is owned by the sparktest user, not by actian.
-vu "actian" -vp "actian" -vh $(hostname -f) -vi $(iigetenv II_INSTALLATION) -vd $SEPPARAMDB -tt lineitem
@cbarca It's fine, the -vu and -vp parameters specify the user/password combination to use to connect to Vector. -tt lineitem and -tt sparktest.lineitem should both work in this scenario (and I have a patch that fixes it).
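Purely for illustration (this is not the actual patch, and `TableRef`/`parse` are made-up names), a loader could split an optionally schema-qualified `-tt` value such as `sparktest.lineitem` into an owner part and a table part before looking up metadata:

```scala
// Illustrative sketch only, not the spark_vector patch: split an
// optionally schema-qualified table reference ("owner.table" or just
// "table") into an optional owner and a table name.
object TableRef {
  def parse(ref: String): (Option[String], String) =
    ref.split("\\.", 2) match {
      case Array(owner, table) => (Some(owner), table)
      case Array(table)        => (None, table)
    }
}
```

With such a split, `-tt lineitem` and `-tt sparktest.lineitem` would both resolve to the same table name, with the owner applied only when present.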
Could you please let me know when this fix will be available for testing, and the patch number for it? Thanks.
It has been fixed for a while. Try to get the latest 0.1 .jars (as published by Christian).
# Description
Build number: 4.3_Hadoop /293
Hostname: uksl-indecomm-mapr-clu1.actian.com
User/password: actian/actian
Database testdb has two users, actian and sparktest. Table lineitem is owned by the sparktest user, and the actian user has been granted all privileges on sparktest.lineitem. While loading data into sparktest.lineitem as the actian user, the error below is received.
Command executed:
spark-submit --conf "spark.eventLog.dir=/mapr/my.cluster.com/user/actian/eventlog" --class com.actian.spark_vector.loader.Main --master $SPARK_MASTER $SPARK_LOADER_JAR load csv -sf "$_hadoopfile" -vu "actian" -vp "actian" -vh $(hostname -f) -vi $(iigetenv II_INSTALLATION) -vd $SEPPARAMDB -tt lineitem -sc "|" -h "l_orderkey int,l_partkey int,l_suppkey int,l_linenumber int,l_quantity double,l_extendedprice double,l_discount double,l_tax double, l_returnflag string, l_linestatus string, l_shipdate date, l_commitdate date,l_receiptdate date,l_shipinstruct string,l_shipmode string,l_comment string"
Error received
16/03/08 12:18:15 ERROR VectorJDBC: Unable to retrieve metadata for table '${tableName}'
java.sql.SQLSyntaxErrorException: Table 'lineitem' does not exist or is not owned by you.
at com.ingres.gcf.util.SqlExType.getSqlEx(SqlExType.java:117)
at com.ingres.gcf.util.SqlExFactory.get(SqlExFactory.java:96)
at com.ingres.gcf.jdbc.DrvObj.readError(DrvObj.java:855)
at com.ingres.gcf.jdbc.JdbcStmt.readError(JdbcStmt.java:2978)
at com.ingres.gcf.jdbc.DrvObj.readResults(DrvObj.java:639)
at com.ingres.gcf.jdbc.JdbcStmt.readResults(JdbcStmt.java:2875)
at com.ingres.gcf.jdbc.JdbcStmt.readResults(JdbcStmt.java:2826)
at com.ingres.gcf.jdbc.JdbcStmt.exec(JdbcStmt.java:1601)
at com.ingres.gcf.jdbc.JdbcStmt.executeQuery(JdbcStmt.java:487)
at com.actian.spark_vector.vector.VectorJDBC$$anonfun$5$$anonfun$apply$1.apply(VectorJDBC.scala:79)
at com.actian.spark_vector.vector.VectorJDBC$$anonfun$5$$anonfun$apply$1.apply(VectorJDBC.scala:79)
at resource.DefaultManagedResource.open(AbstractManagedResource.scala:106)
at resource.AbstractManagedResource.acquireFor(AbstractManagedResource.scala:85)
at resource.DeferredExtractableManagedResource.either(AbstractManagedResource.scala:29)
at com.actian.spark_vector.util.ResourceUtil$RichExtractableManagedResource$.resolve$extension(ResourceUtil.scala:30)
at com.actian.spark_vector.vector.VectorJDBC.executeQuery(VectorJDBC.scala:79)
at com.actian.spark_vector.vector.VectorJDBC.columnMetadata(VectorJDBC.scala:121)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$structType$1.apply(VectorRelation.scala:84)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$structType$1.apply(VectorRelation.scala:83)
at resource.AbstractManagedResource$$anonfun$5.apply(AbstractManagedResource.scala:86)
at scala.util.control.Exception$Catch$$anonfun$either$1.apply(Exception.scala:124)
at scala.util.control.Exception$Catch$$anonfun$either$1.apply(Exception.scala:124)
at scala.util.control.Exception$Catch.apply(Exception.scala:102)
at scala.util.control.Exception$Catch.either(Exception.scala:124)
at resource.AbstractManagedResource.acquireFor(AbstractManagedResource.scala:86)
at resource.DeferredExtractableManagedResource.either(AbstractManagedResource.scala:29)
at com.actian.spark_vector.util.ResourceUtil$RichExtractableManagedResource$.resolve$extension(ResourceUtil.scala:30)
at com.actian.spark_vector.vector.VectorJDBC$.withJDBC(VectorJDBC.scala:202)
at com.actian.spark_vector.sql.VectorRelation$.structType(VectorRelation.scala:83)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$schema$1.apply(VectorRelation.scala:31)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$schema$1.apply(VectorRelation.scala:31)
at scala.Option.getOrElse(Option.scala:120)
at com.actian.spark_vector.sql.VectorRelation.schema(VectorRelation.scala:31)
at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:97)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:69)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:140)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:138)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:949)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:949)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:129)
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:741)
at com.actian.spark_vector.loader.command.VectorTempTable$.register(VectorTempTable.scala:40)
at com.actian.spark_vector.loader.command.ConstructVector$.execute(ConstructVector.scala:44)
at com.actian.spark_vector.loader.Main$delayedInit$body.apply(Main.scala:29)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at com.actian.spark_vector.loader.Main$.main(Main.scala:25)
at com.actian.spark_vector.loader.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Exception in thread "main" com.actian.spark_vector.vector.VectorException: Unable to query target table 'lineitem': Table 'lineitem' does not exist or is not owned by you.
at com.actian.spark_vector.vector.VectorJDBC.columnMetadata(VectorJDBC.scala:135)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$structType$1.apply(VectorRelation.scala:84)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$structType$1.apply(VectorRelation.scala:83)
at resource.AbstractManagedResource$$anonfun$5.apply(AbstractManagedResource.scala:86)
at scala.util.control.Exception$Catch$$anonfun$either$1.apply(Exception.scala:124)
at scala.util.control.Exception$Catch$$anonfun$either$1.apply(Exception.scala:124)
at scala.util.control.Exception$Catch.apply(Exception.scala:102)
at scala.util.control.Exception$Catch.either(Exception.scala:124)
at resource.AbstractManagedResource.acquireFor(AbstractManagedResource.scala:86)
at resource.DeferredExtractableManagedResource.either(AbstractManagedResource.scala:29)
at com.actian.spark_vector.util.ResourceUtil$RichExtractableManagedResource$.resolve$extension(ResourceUtil.scala:30)
at com.actian.spark_vector.vector.VectorJDBC$.withJDBC(VectorJDBC.scala:202)
at com.actian.spark_vector.sql.VectorRelation$.structType(VectorRelation.scala:83)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$schema$1.apply(VectorRelation.scala:31)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$schema$1.apply(VectorRelation.scala:31)
at scala.Option.getOrElse(Option.scala:120)
at com.actian.spark_vector.sql.VectorRelation.schema(VectorRelation.scala:31)
at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:97)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:69)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:140)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:138)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:949)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:949)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:129)
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:741)
at com.actian.spark_vector.loader.command.VectorTempTable$.register(VectorTempTable.scala:40)
at com.actian.spark_vector.loader.command.ConstructVector$.execute(ConstructVector.scala:44)
at com.actian.spark_vector.loader.Main$delayedInit$body.apply(Main.scala:29)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at com.actian.spark_vector.loader.Main$.main(Main.scala:25)
at com.actian.spark_vector.loader.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
When I tried with -tt "sparktest.lineitem", I got the error below:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Specifying database name or other qualifiers are not allowed for temporary tables. If the table name has dots (.) in it, please quote the table name with backticks (`).;
at org.apache.spark.sql.catalyst.analysis.Catalog$class.checkTableIdentifier(Catalog.scala:97)
at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.checkTableIdentifier(Catalog.scala:104)
at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.registerTable(Catalog.scala:110)
at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:95)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:69)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:140)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:138)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:949)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:949)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:129)
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:741)
at com.actian.spark_vector.loader.command.CSVRead$.registerTempTable(CSVRead.scala:48)
at com.actian.spark_vector.loader.command.ConstructVector$.execute(ConstructVector.scala:39)
at com.actian.spark_vector.loader.Main$delayedInit$body.apply(Main.scala:29)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at com.actian.spark_vector.loader.Main$.main(Main.scala:25)
at com.actian.spark_vector.loader.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
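As the AnalysisException message itself suggests, Spark SQL parses a dot in a temporary-table name as a database qualifier unless the name is wrapped in backticks. A minimal sketch of that quoting, assuming a hypothetical helper `quoteName` (not part of spark_vector):

```scala
// Hypothetical helper, illustrative only: wrap an identifier containing
// dots in backticks so Spark SQL treats it as a single temporary-table
// name rather than a database-qualified reference.
def quoteName(name: String): String =
  if (name.contains(".")) s"`$name`" else name

// The temp-table registration could then be built as, e.g.:
//   s"CREATE TEMPORARY TABLE ${quoteName("sparktest.lineitem")} USING ..."
```

This only changes how Spark SQL tokenizes the name; the underlying Vector metadata lookup would still need to resolve the owner/table parts separately.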
# Steps To Reproduce:
loadSP
export PATH=/opt/mapr/spark/spark-1.5.2/bin/:$PATH
export SPARK_MASTER=yarn
export HDFS_TMP=/mapr/my.cluster.com/Actian/tmp
export SPARK_LOADER_JAR=/home/actian/vishal/Spark/spark-loader/spark_vector_loader-assembly-1.0-SNAPSHOT.jar
export SEPPARAMDB=testdb
export _hadoopfile=/mapr/my.cluster.com/Actian/tmp/parts/x01
spark-submit --conf "spark.eventLog.dir=/mapr/my.cluster.com/user/actian/eventlog" --class com.actian.spark_vector.loader.Main --master $SPARK_MASTER $SPARK_LOADER_JAR load csv -sf "$_hadoopfile" -vu "actian" -vp "actian" -vh $(hostname -f) -vi $(iigetenv II_INSTALLATION) -vd $SEPPARAMDB -tt lineitem -sc "|" -h "l_orderkey int,l_partkey int,l_suppkey int,l_linenumber int,l_quantity double,l_extendedprice double,l_discount double,l_tax double, l_returnflag string, l_linestatus string, l_shipdate date, l_commitdate date,l_receiptdate date,l_shipinstruct string,l_shipmode string,l_comment string"
# Additional Information
The same behaviour is observed inside the Spark shell when running the Scala command below for the same scenario:
sqlContext.sql("""CREATE TEMPORARY TABLE lineitem USING com.actian.spark_vector.sql.DefaultSource OPTIONS (host "uksl-indecomm-mapr-clu1.actian.com",instance "SP",database "testdb",table "sparktest.lineitem",user "actian",password "actian")""")
Error received:
java.sql.SQLSyntaxErrorException: Table 'sparktest.lineitem' does not exist or is not owned by you.
at com.ingres.gcf.util.SqlExType.getSqlEx(SqlExType.java:117)
at com.ingres.gcf.util.SqlExFactory.get(SqlExFactory.java:96)
at com.ingres.gcf.jdbc.DrvObj.readError(DrvObj.java:855)
at com.ingres.gcf.jdbc.JdbcStmt.readError(JdbcStmt.java:2978)
at com.ingres.gcf.jdbc.DrvObj.readResults(DrvObj.java:639)
at com.ingres.gcf.jdbc.JdbcStmt.readResults(JdbcStmt.java:2875)
at com.ingres.gcf.jdbc.JdbcStmt.readResults(JdbcStmt.java:2826)
at com.ingres.gcf.jdbc.JdbcStmt.exec(JdbcStmt.java:1601)
at com.ingres.gcf.jdbc.JdbcStmt.executeQuery(JdbcStmt.java:487)
at com.actian.spark_vector.vector.VectorJDBC$$anonfun$5$$anonfun$apply$1.apply(VectorJDBC.scala:79)
at com.actian.spark_vector.vector.VectorJDBC$$anonfun$5$$anonfun$apply$1.apply(VectorJDBC.scala:79)
at resource.DefaultManagedResource.open(AbstractManagedResource.scala:106)
at resource.AbstractManagedResource.acquireFor(AbstractManagedResource.scala:85)
at resource.DeferredExtractableManagedResource.either(AbstractManagedResource.scala:29)
at com.actian.spark_vector.util.ResourceUtil$RichExtractableManagedResource$.resolve$extension(ResourceUtil.scala:30)
at com.actian.spark_vector.vector.VectorJDBC.executeQuery(VectorJDBC.scala:79)
at com.actian.spark_vector.vector.VectorJDBC.columnMetadata(VectorJDBC.scala:121)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$structType$1.apply(VectorRelation.scala:84)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$structType$1.apply(VectorRelation.scala:83)
at resource.AbstractManagedResource$$anonfun$5.apply(AbstractManagedResource.scala:86)
at scala.util.control.Exception$Catch$$anonfun$either$1.apply(Exception.scala:124)
at scala.util.control.Exception$Catch$$anonfun$either$1.apply(Exception.scala:124)
at scala.util.control.Exception$Catch.apply(Exception.scala:102)
at scala.util.control.Exception$Catch.either(Exception.scala:124)
at resource.AbstractManagedResource.acquireFor(AbstractManagedResource.scala:86)
at resource.DeferredExtractableManagedResource.either(AbstractManagedResource.scala:29)
at com.actian.spark_vector.util.ResourceUtil$RichExtractableManagedResource$.resolve$extension(ResourceUtil.scala:30)
at com.actian.spark_vector.vector.VectorJDBC$.withJDBC(VectorJDBC.scala:202)
at com.actian.spark_vector.sql.VectorRelation$.structType(VectorRelation.scala:83)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$schema$1.apply(VectorRelation.scala:31)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$schema$1.apply(VectorRelation.scala:31)
at scala.Option.getOrElse(Option.scala:120)
at com.actian.spark_vector.sql.VectorRelation.schema(VectorRelation.scala:31)
at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:97)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:69)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:140)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:138)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:949)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:949)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:129)
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:741)
at $line15.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:20)
at $line15.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
at $line15.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
at $line15.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $line15.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
at $line15.$read$$iwC$$iwC$$iwC.<init>(<console>:33)
at $line15.$read$$iwC$$iwC.<init>(<console>:35)
at $line15.$read$$iwC.<init>(<console>:37)
at $line15.$read.<init>(<console>:39)
at $line15.$read$.<init>(<console>:43)
at $line15.$read$.<clinit>(<console>)
at $line15.$eval$.<init>(<console>:7)
at $line15.$eval$.<clinit>(<console>)
at $line15.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
com.actian.spark_vector.vector.VectorException: Unable to query target table 'sparktest.lineitem': Table 'sparktest.lineitem' does not exist or is not owned by you.
at com.actian.spark_vector.vector.VectorJDBC.columnMetadata(VectorJDBC.scala:135)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$structType$1.apply(VectorRelation.scala:84)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$structType$1.apply(VectorRelation.scala:83)
at resource.AbstractManagedResource$$anonfun$5.apply(AbstractManagedResource.scala:86)
at scala.util.control.Exception$Catch$$anonfun$either$1.apply(Exception.scala:124)
at scala.util.control.Exception$Catch$$anonfun$either$1.apply(Exception.scala:124)
at scala.util.control.Exception$Catch.apply(Exception.scala:102)
at scala.util.control.Exception$Catch.either(Exception.scala:124)
at resource.AbstractManagedResource.acquireFor(AbstractManagedResource.scala:86)
at resource.DeferredExtractableManagedResource.either(AbstractManagedResource.scala:29)
at com.actian.spark_vector.util.ResourceUtil$RichExtractableManagedResource$.resolve$extension(ResourceUtil.scala:30)
at com.actian.spark_vector.vector.VectorJDBC$.withJDBC(VectorJDBC.scala:202)
at com.actian.spark_vector.sql.VectorRelation$.structType(VectorRelation.scala:83)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$schema$1.apply(VectorRelation.scala:31)
at com.actian.spark_vector.sql.VectorRelation$$anonfun$schema$1.apply(VectorRelation.scala:31)
at scala.Option.getOrElse(Option.scala:120)
at com.actian.spark_vector.sql.VectorRelation.schema(VectorRelation.scala:31)
at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:97)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:69)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:140)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:138)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:949)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:949)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:129)
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:741)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:20)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
at $iwC$$iwC$$iwC.<init>(<console>:33)
at $iwC$$iwC.<init>(<console>:35)
at $iwC.<init>(<console>:37)
at <init>(<console>:39)
at .<init>(<console>:43)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)