Qihoo360 / XSQL

Unified SQL Analytics Engine Based on SparkSQL
https://qihoo360.github.io/XSQL/
Apache License 2.0
209 stars · 62 forks

Running spark-xsql after building the full distribution throws an exception #69

Closed yangbo3 closed 4 years ago

yangbo3 commented 4 years ago

Environment:
CentOS Linux release 7.4.1708 (Core)
git branch

Configuration (./conf/xsql.conf):

spark.xsql.datasources                   default
spark.xsql.default.database              test
spark.xsql.datasource.default.type       mysql
spark.xsql.datasource.default.url        jdbc:mysql://10.10.1.41:3306
spark.xsql.datasource.default.user       test
spark.xsql.datasource.default.password   test@123
spark.xsql.datasource.default.version    5.7.22

Execution output of ./spark-xsql:

20/01/08 21:20:10 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/01/08 21:20:11 INFO SparkContext: Running Spark version 2.4.3
20/01/08 21:20:11 INFO SparkContext: Submitted application: org.apache.spark.sql.xsql.shell.SparkXSQLShell
20/01/08 21:20:11 INFO SecurityManager: Changing view acls to: root
20/01/08 21:20:11 INFO SecurityManager: Changing modify acls to: root
20/01/08 21:20:11 INFO SecurityManager: Changing view acls groups to:
20/01/08 21:20:11 INFO SecurityManager: Changing modify acls groups to:
20/01/08 21:20:11 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
20/01/08 21:20:11 INFO Utils: Successfully started service 'sparkDriver' on port 38039.
20/01/08 21:20:11 INFO SparkEnv: Registering MapOutputTracker
20/01/08 21:20:11 INFO SparkEnv: Registering BlockManagerMaster
20/01/08 21:20:11 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/01/08 21:20:11 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/01/08 21:20:11 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-799b3235-6aa8-4068-b013-f5caa4614c90
20/01/08 21:20:11 INFO MemoryStore: MemoryStore started with capacity 2.5 GB
20/01/08 21:20:11 INFO SparkEnv: Registering OutputCommitCoordinator
20/01/08 21:20:11 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/01/08 21:20:11 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://hdp-client-02:4040
20/01/08 21:20:11 INFO SparkContext: Added JAR file:/data/tools/xsql-0.6.0/xsql-0.6.1-bin-spark-2.4.3/jars/xsql-shell_2.11-0.6.1-SNAPSHOT.jar at spark://hdp-client-02:38039/jars/xsql-shell_2.11-0.6.1-SNAPSHOT.jar with timestamp 1578489611959
20/01/08 21:20:12 INFO Executor: Starting executor ID driver on host localhost
20/01/08 21:20:12 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45532.
20/01/08 21:20:12 INFO NettyBlockTransferService: Server created on hdp-client-02:45532
20/01/08 21:20:12 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/01/08 21:20:12 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, hdp-client-02, 45532, None)
20/01/08 21:20:12 INFO BlockManagerMasterEndpoint: Registering block manager hdp-client-02:45532 with 2.5 GB RAM, BlockManagerId(driver, hdp-client-02, 45532, None)
20/01/08 21:20:12 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, hdp-client-02, 45532, None)
20/01/08 21:20:12 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, hdp-client-02, 45532, None)
20/01/08 21:20:12 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/data/tools/xsql-0.6.0/xsql-0.6.1-bin-spark-2.4.3/bin/spark-warehouse').
20/01/08 21:20:12 INFO SharedState: Warehouse path is 'file:/data/tools/xsql-0.6.0/xsql-0.6.1-bin-spark-2.4.3/bin/spark-warehouse'.
20/01/08 21:20:13 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
20/01/08 21:20:13 INFO XSQLExternalCatalog: reading xsql configuration from /data/tools/xsql-0.6.0/xsql-0.6.1-bin-spark-2.4.3/conf/xsql.conf
20/01/08 21:20:13 INFO XSQLExternalCatalog: parse data source default
Exception in thread "main" java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.xsql.XSQLExternalCatalog':
	at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:223)
	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:104)
	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:103)
	at org.apache.spark.sql.xsql.XSQLSessionStateBuilder.externalCatalog(XSQLSessionStateBuilder.scala:60)
	at org.apache.spark.sql.xsql.XSQLSessionStateBuilder.catalog$lzycompute(XSQLSessionStateBuilder.scala:73)
	at org.apache.spark.sql.xsql.XSQLSessionStateBuilder.catalog(XSQLSessionStateBuilder.scala:71)
	at org.apache.spark.sql.xsql.XSQLSessionStateBuilder.catalog(XSQLSessionStateBuilder.scala:57)
	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$1.apply(BaseSessionStateBuilder.scala:291)
	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$1.apply(BaseSessionStateBuilder.scala:291)
	at org.apache.spark.sql.internal.SessionState.catalog$lzycompute(SessionState.scala:77)
	at org.apache.spark.sql.internal.SessionState.catalog(SessionState.scala:77)
	at org.apache.spark.sql.xsql.ResolveScanSingleTable.<init>(XSQLStrategies.scala:69)
	at org.apache.spark.sql.xsql.shell.SparkXSQLShell$.main(SparkXSQLShell.scala:55)
	at org.apache.spark.sql.xsql.shell.SparkXSQLShell.main(SparkXSQLShell.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet successfully received from the server was 305 milliseconds ago. The last packet sent successfully to the server was 296 milliseconds ago.
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
	at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:990)
	at com.mysql.jdbc.ExportControlled.transformSocketToSSLSocket(ExportControlled.java:201)
	at com.mysql.jdbc.MysqlIO.negotiateSSLConnection(MysqlIO.java:4912)
	at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1663)
	at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1224)
	at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2190)
	at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2221)
	at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2016)
	at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:776)
	at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
	at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:386)
	at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:330)
	at java.sql.DriverManager.getConnection(DriverManager.java:664)
	at java.sql.DriverManager.getConnection(DriverManager.java:247)
	at org.apache.spark.sql.xsql.manager.MysqlManager.getConnect(MysqlManager.scala:75)
	at org.apache.spark.sql.xsql.manager.MysqlManager.cacheDatabase(MysqlManager.scala:152)
	at org.apache.spark.sql.xsql.DataSourceManager$class.parse(DataSourceManager.scala:202)
	at org.apache.spark.sql.xsql.manager.MysqlManager.parse(MysqlManager.scala:51)
	at org.apache.spark.sql.xsql.XSQLExternalCatalog.addDataSource(XSQLExternalCatalog.scala:439)
	at org.apache.spark.sql.xsql.XSQLExternalCatalog$$anonfun$setupAndInitMetadata$4.apply(XSQLExternalCatalog.scala:172)
	at org.apache.spark.sql.xsql.XSQLExternalCatalog$$anonfun$setupAndInitMetadata$4.apply(XSQLExternalCatalog.scala:162)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.sql.xsql.XSQLExternalCatalog.setupAndInitMetadata(XSQLExternalCatalog.scala:162)
	at org.apache.spark.sql.xsql.XSQLExternalCatalog.<init>(XSQLExternalCatalog.scala:119)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:214)
	... 25 more
Caused by: javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: java.security.cert.CertPathValidatorException: Path does not chain with any of the trust anchors
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
	at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1964)
	at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:328)
	at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:322)
	at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1614)
	at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:216)
	at sun.security.ssl.Handshaker.processLoop(Handshaker.java:1052)
	at sun.security.ssl.Handshaker.process_record(Handshaker.java:987)
	at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1072)
	at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1385)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1413)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1397)
	at com.mysql.jdbc.ExportControlled.transformSocketToSSLSocket(ExportControlled.java:186)
	... 58 more
Caused by: java.security.cert.CertificateException: java.security.cert.CertPathValidatorException: Path does not chain with any of the trust anchors
	at com.mysql.jdbc.ExportControlled$X509TrustManagerWrapper.checkServerTrusted(ExportControlled.java:302)
	at sun.security.ssl.AbstractTrustManagerWrapper.checkServerTrusted(SSLContextImpl.java:992)
	at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1596)
	... 66 more
Caused by: java.security.cert.CertPathValidatorException: Path does not chain with any of the trust anchors
	at sun.security.provider.certpath.PKIXCertPathValidator.validate(PKIXCertPathValidator.java:154)
	at sun.security.provider.certpath.PKIXCertPathValidator.engineValidate(PKIXCertPathValidator.java:80)
	at java.security.cert.CertPathValidator.validate(CertPathValidator.java:292)
	at com.mysql.jdbc.ExportControlled$X509TrustManagerWrapper.checkServerTrusted(ExportControlled.java:295)
	... 68 more
20/01/08 21:20:13 INFO SparkContext: Invoking stop() from shutdown hook
20/01/08 21:20:13 INFO SparkUI: Stopped Spark web UI at http://hdp-client-02:4040
20/01/08 21:20:13 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/01/08 21:20:13 INFO MemoryStore: MemoryStore cleared
20/01/08 21:20:13 INFO BlockManager: BlockManager stopped
20/01/08 21:20:13 INFO BlockManagerMaster: BlockManagerMaster stopped
20/01/08 21:20:14 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/01/08 21:20:14 INFO SparkContext: Successfully stopped SparkContext
20/01/08 21:20:14 INFO ShutdownHookManager: Shutdown hook called
20/01/08 21:20:14 INFO ShutdownHookManager: Deleting directory /tmp/spark-7f618cfa-d370-4a88-a6c5-8effc8f275e2
20/01/08 21:20:14 INFO ShutdownHookManager: Deleting directory /tmp/spark-a7925d7e-4b42-444d-bcc0-be1019ea63c7
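For context: the `Caused by` chain shows Connector/J upgrading the socket to SSL (`negotiateSSLConnection` → `transformSocketToSSLSocket`) and then rejecting the MySQL server's certificate because it does not chain to any trusted anchor. When cert validation is not needed, a common workaround is to disable SSL via the JDBC URL. A minimal, hypothetical helper for that rewrite (`JdbcUrlFix` and `withUseSslDisabled` are our names, not part of XSQL):

```java
// Hypothetical helper (not part of XSQL): append useSSL=false to a
// Connector/J 5.x JDBC URL unless a useSSL setting is already present.
public class JdbcUrlFix {
    static String withUseSslDisabled(String url) {
        if (url.contains("useSSL=")) {
            return url; // the caller already made an explicit SSL choice
        }
        // Start a query string if there is none yet, otherwise extend it.
        return url + (url.contains("?") ? "&" : "?") + "useSSL=false";
    }

    public static void main(String[] args) {
        // URL taken from the xsql.conf in the report above.
        System.out.println(withUseSslDisabled("jdbc:mysql://10.10.1.41:3306"));
        // prints jdbc:mysql://10.10.1.41:3306?useSSL=false
    }
}
```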

beliefer commented 4 years ago

Could you check if the MySQL server is OK?

yangbo3 commented 4 years ago

The MySQL server is fine; mysql-client can connect to it.

beliefer commented 4 years ago

Please add useSSL=false to the JDBC URL.
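With the xsql.conf from the report above, that suggestion would change the URL line to the following (a sketch assuming Connector/J 5.x, where useSSL is the relevant connection property):

```properties
spark.xsql.datasource.default.url    jdbc:mysql://10.10.1.41:3306?useSSL=false
```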

yangbo3 commented 4 years ago

Great! Thank you.