Qihoo360 / XSQL

Unified SQL Analytics Engine Based on SparkSQL
https://qihoo360.github.io/XSQL/
Apache License 2.0

Configuring multiple data sources in xsql.conf fails to run #58

Closed dawsongzhao closed 5 years ago

dawsongzhao commented 5 years ago

1. With only the MySQL data source configured, it runs OK:

spark.xsql.datasources                     default
spark.xsql.default.database                linkis
spark.xsql.datasource.default.type         mysql
spark.xsql.datasource.default.url          jdbc:mysql://172.19.101.47:3306
spark.xsql.datasource.default.user         root
spark.xsql.datasource.default.password     Pwd@123456
spark.xsql.datasource.default.version      5.6.19
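
Each data source is described by a group of properties sharing the spark.xsql.datasource.<name>.* prefix, where <name> must also appear in the spark.xsql.datasources list. As a rough illustration of how that key layout groups per source (a sketch only, not XSQL's actual loader; values copied from the config above):

object XsqlKeyLayoutSketch {
  def main(args: Array[String]): Unit = {
    // Properties as they appear in the xsql.conf block above.
    val props = Map(
      "spark.xsql.datasources"             -> "default",
      "spark.xsql.datasource.default.type" -> "mysql",
      "spark.xsql.datasource.default.url"  -> "jdbc:mysql://172.19.101.47:3306",
      "spark.xsql.datasource.default.user" -> "root"
    )

    val prefix = "spark.xsql.datasource."
    // Group spark.xsql.datasource.<name>.<key> entries by <name>.
    val bySource: Map[String, Map[String, String]] = props
      .collect { case (k, v) if k.startsWith(prefix) =>
        val rest = k.stripPrefix(prefix)
        val dot  = rest.indexOf('.')
        (rest.take(dot), rest.drop(dot + 1), v)
      }
      .groupBy(_._1)
      .map { case (name, entries) =>
        name -> entries.map(e => e._2 -> e._3).toMap
      }

    println(bySource("default")("type")) // prints: mysql
  }
}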

2. With only the HDP 3.1.0 Hive data source configured, it runs OK:

spark.xsql.datasources default
spark.xsql.datasource.default.type             hive
spark.xsql.datasource.default.metastore.url   thrift://hdfs02-dev.yingzi.com:9083
spark.xsql.datasource.default.user             test
spark.xsql.datasource.default.password        test
spark.xsql.datasource.default.version         3.1.0

3. With the MySQL and Hive data sources configured together, execution fails:

spark.xsql.datasources                     default
spark.xsql.default.database                linkis
spark.xsql.datasource.default.type         mysql
spark.xsql.datasource.default.url          jdbc:mysql://172.19.101.47:3306
spark.xsql.datasource.default.user         root
spark.xsql.datasource.default.password     Pwd@123456
spark.xsql.datasource.default.version      5.6.19

spark.xsql.datasources defaulthive
spark.xsql.datasource.defaulthive.type             hive
spark.xsql.datasource.defaulthive.metastore.url   thrift://hdfs02-dev.yingzi.com:9083
spark.xsql.datasource.defaulthive.user             test
spark.xsql.datasource.defaulthive.password        test
spark.xsql.datasource.defaulthive.version         3.1.0

The error is as follows:


19/10/28 18:18:33 INFO SharedState: Warehouse path is 'file:/data/bigdata/xsql/xsql/spark-warehouse'.
19/10/28 18:18:34 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
19/10/28 18:18:34 INFO XSQLExternalCatalog: reading xsql configuration from /data/bigdata/xsql/xsql/conf/xsql.conf
Exception in thread "main" java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.xsql.XSQLExternalCatalog':
        at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:223)
        at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:104)
        at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:103)
        at org.apache.spark.sql.xsql.XSQLSessionStateBuilder.externalCatalog(XSQLSessionStateBuilder.scala:60)
        at org.apache.spark.sql.xsql.XSQLSessionStateBuilder.catalog$lzycompute(XSQLSessionStateBuilder.scala:73)
        at org.apache.spark.sql.xsql.XSQLSessionStateBuilder.catalog(XSQLSessionStateBuilder.scala:71)
        at org.apache.spark.sql.xsql.XSQLSessionStateBuilder.catalog(XSQLSessionStateBuilder.scala:57)
        at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$1.apply(BaseSessionStateBuilder.scala:291)
        at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$1.apply(BaseSessionStateBuilder.scala:291)
        at org.apache.spark.sql.internal.SessionState.catalog$lzycompute(SessionState.scala:77)
        at org.apache.spark.sql.internal.SessionState.catalog(SessionState.scala:77)
        at org.apache.spark.sql.xsql.ResolveScanSingleTable.<init>(XSQLStrategies.scala:69)
        at org.apache.spark.sql.xsql.shell.SparkXSQLShell$.main(SparkXSQLShell.scala:55)
        at org.apache.spark.sql.xsql.shell.SparkXSQLShell.main(SparkXSQLShell.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.spark.SparkException: default data source must configured!
        at org.apache.spark.sql.xsql.XSQLExternalCatalog.setupAndInitMetadata(XSQLExternalCatalog.scala:160)
        at org.apache.spark.sql.xsql.XSQLExternalCatalog.<init>(XSQLExternalCatalog.scala:119)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:214)
        ... 25 more
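
For anyone who hits the same stack trace: the likely root cause (my reading of the config above, not verified against XSQL's source) is that a properties file keeps only one value per key, so the second spark.xsql.datasources defaulthive line silently replaces the first spark.xsql.datasources default line. The default MySQL source is then never registered, and the catalog aborts with "default data source must configured!". Spark's own properties-file loading is backed by java.util.Properties, and assuming xsql.conf is read the same way, duplicate keys collapse to the last occurrence, as this minimal sketch shows:

import java.io.StringReader
import java.util.Properties

object DuplicateKeyDemo {
  def main(args: Array[String]): Unit = {
    // Two lines with the same key, as in the failing xsql.conf above.
    val conf =
      """spark.xsql.datasources default
        |spark.xsql.datasources defaulthive
        |""".stripMargin

    val props = new Properties()
    props.load(new StringReader(conf))

    // Duplicate keys are not merged; the later line wins.
    println(props.getProperty("spark.xsql.datasources")) // prints: defaulthive
  }
}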
dawsongzhao commented 5 years ago

Resolved. Checking the documentation showed my configuration was wrong: all data source names must be listed in a single comma-separated spark.xsql.datasources entry, not on two separate lines. The following change works:

spark.xsql.datasources                     default,defaulthive

spark.xsql.default.database                linkis
spark.xsql.datasource.default.type         mysql
spark.xsql.datasource.default.url          jdbc:mysql://172.19.101.47:3306
spark.xsql.datasource.default.user         root
spark.xsql.datasource.default.password     Pwd@123456
spark.xsql.datasource.default.version      5.6.19

spark.xsql.datasource.defaulthive.type             hive
spark.xsql.datasource.defaulthive.metastore.url   thrift://hdfs02-dev.yingzi.com:9083
spark.xsql.datasource.defaulthive.user             test
spark.xsql.datasource.defaulthive.password        test
spark.xsql.datasource.defaulthive.version         3.1.0
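
With both names in one spark.xsql.datasources list, each source should be addressable through XSQL's datasource.database.table naming. A hypothetical smoke test from a Scala Spark session (spark is the session provided by the XSQL shell; the table names below are placeholders, not tables that exist in this cluster):

// Databases from both registered sources should now be visible.
spark.sql("SHOW DATABASES").show()

// Placeholder table names, for illustration only.
spark.sql("SELECT * FROM default.linkis.some_table LIMIT 10").show()       // MySQL source
spark.sql("SELECT * FROM defaulthive.some_db.some_table LIMIT 10").show()  // Hive source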