Exchangis is a lightweight, highly extensible data exchange platform that supports data transmission between structured and unstructured heterogeneous data sources.
When using Sqoop to import MySQL data into Hive, the data is first imported from MySQL into HDFS successfully, but loading it from HDFS into Hive fails.
It appears that the connection IP and port of HiveServer2 cannot be found, yet what is configured in Exchangis is the connection IP and port of the Metastore. How can this be resolved?
The log is as follows:
22/04/08 16:11:12 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE hive_temporary.id_name
( id INT, name STRING, time STRING) COMMENT 'id_name' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
22/04/08 16:11:12 DEBUG hive.TableDefWriter: Load statement: LOAD DATA INPATH 'hdfs://mycluster/bd-os/qb_cetc54_fhc/hive/hive_temporary/id_name' OVERWRITE INTO TABLE hive_temporary.id_name
22/04/08 16:11:12 INFO hive.HiveImport: Loading uploaded data into Hive
22/04/08 16:11:12 DEBUG hive.HiveImport: url:jdbc:hive2://null,username:null
22/04/08 16:11:12 INFO jdbc.Utils: Supplied authorities: null
22/04/08 16:11:12 INFO jdbc.Utils: Resolved authority: null:-1
22/04/08 16:11:12 WARN jdbc.HiveConnection: Failed to connect to null:-1
java.sql.SQLException: Some error occurs in url
at dm.jdbc.dbaccess.DBError.throwSQLException(DBError.java:57)
at dm.jdbc.driver.DmDriver_bs.parseUrlBeforeConnect(DmDriver_bs.java:81)
at dm.jdbc.driver.DmDriver.connect(DmDriver.java:103)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.sqoop.hive.HiveImport.getConnection(HiveImport.java:467)
at org.apache.sqoop.hive.HiveImport.getConnectionByRequest(HiveImport.java:461)
at org.apache.sqoop.hive.HiveImport.executeHiveJdbc(HiveImport.java:405)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:242)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:568)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:503)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:800)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
22/04/08 16:11:12 ERROR hive.HiveImport: Error sqlexception
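The `url:jdbc:hive2://null` DEBUG line shows that this Sqoop build performs the load step over a HiveServer2 JDBC connection, so a Metastore address alone does not supply the host and port it needs (hence `Resolved authority: null:-1`). For comparison, a well-formed HiveServer2 JDBC URL looks like the following; the host is a placeholder, and 10000 is HiveServer2's default Thrift port:

```
# Hypothetical HiveServer2 connection URL (host is a placeholder):
jdbc:hive2://<hiveserver2-host>:10000/hive_temporary
```

In the failing run, the authority part (`<hiveserver2-host>:10000`) is `null`, which is why the DM JDBC driver rejects the URL before any connection attempt.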