vonkonyoung / myspark


metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException #6

Open vonkonyoung opened 3 years ago

vonkonyoung commented 3 years ago

21/02/09 18:50:04 INFO session.SessionState: Created local directory: /tmp/e60f9be1-1a82-4a77-856e-8a44b6d71476_resources
21/02/09 18:50:04 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/e60f9be1-1a82-4a77-856e-8a44b6d71476
21/02/09 18:50:04 INFO session.SessionState: Created local directory: /tmp/root/e60f9be1-1a82-4a77-856e-8a44b6d71476
21/02/09 18:50:04 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/e60f9be1-1a82-4a77-856e-8a44b6d71476/_tmp_space.db
21/02/09 18:50:04 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is file:/root/bigdata/spark-2.2.2-bin-hadoop2.7/bin/spark-warehouse
21/02/09 18:50:05 INFO metastore.HiveMetaStore: 0: get_database: default
21/02/09 18:50:05 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_database: default
21/02/09 18:50:05 INFO session.SessionState: Created local directory: /tmp/7586ae84-27e4-4aa1-8846-f31e75a236f9_resources
21/02/09 18:50:05 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/7586ae84-27e4-4aa1-8846-f31e75a236f9
21/02/09 18:50:05 INFO session.SessionState: Created local directory: /tmp/root/7586ae84-27e4-4aa1-8846-f31e75a236f9
21/02/09 18:50:05 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/7586ae84-27e4-4aa1-8846-f31e75a236f9/_tmp_space.db
21/02/09 18:50:05 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is file:/root/bigdata/spark-2.2.2-bin-hadoop2.7/bin/spark-warehouse
21/02/09 18:50:06 INFO metastore.HiveMetaStore: 0: get_database: global_temp
21/02/09 18:50:06 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_database: global_temp
21/02/09 18:50:06 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
21/02/09 18:50:06 INFO session.SessionState: Created local directory: /tmp/ef06bb29-db43-415f-99c8-887a87910810_resources
21/02/09 18:50:06 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/ef06bb29-db43-415f-99c8-887a87910810
21/02/09 18:50:06 INFO session.SessionState: Created local directory: /tmp/root/ef06bb29-db43-415f-99c8-887a87910810
21/02/09 18:50:06 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/ef06bb29-db43-415f-99c8-887a87910810/_tmp_space.db
21/02/09 18:50:06 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is file:/root/bigdata/spark-2.2.2-bin-hadoop2.7/bin/spark-warehouse
21/02/09 18:50:06 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
spark-sql>

vonkonyoung commented 3 years ago

The default Derby database is most likely still being used here, so Spark never connects to the external metastore database to fetch metadata.

vonkonyoung commented 3 years ago

Why does adding this one line to the program fix it?

.config("hive.metastore.uris", "thrift://hadoop-master:9083")

You can see that, without the configuration above, a derby.log file is generated in the directory where the program runs. Opening it shows the following:

Wed Feb 10 09:54:16 CST 2021: Booting Derby version The Apache Software Foundation - Apache Derby - 10.14.2.0 - (1828579): instance a816c00e-0177-89a5-6b7f-000007641908
on database directory D:\working\code\myspark\metastore_db with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@2fc2a205
Loaded from file:/D:/working/repository/org/apache/derby/derby/10.14.2.0/derby-10.14.2.0.jar
java.vendor=Oracle Corporation
java.runtime.version=1.8.0_211-b12
user.dir=D:\working\code\myspark
os.name=Windows 10
os.arch=amd64
os.version=10.0
derby.system.home=null
Database Class Loader started - derby.database.classpath=''

From this you can see there is a metastore_db directory inside the project directory. It turns out Derby had been used as the metastore all along, which is why none of the tables I created could be found.

So how do you solve this? The first method is to add the config line above in the program. The second method is to add the following configuration to the hive-site.xml file:

<property>
  <name>hive.metastore.local</name>
  <value>false</value>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://hadoop-master:9083</value>
</property>
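
The first method above can be sketched as a minimal Spark program (assuming Spark 2.x with Hive support on the classpath; the object and app names are placeholders, and `thrift://hadoop-master:9083` must match your own metastore service endpoint):

```scala
import org.apache.spark.sql.SparkSession

object MetastoreExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("myspark")
      // Point Spark at the remote Hive metastore service instead of
      // letting it spin up an embedded Derby metastore_db locally.
      .config("hive.metastore.uris", "thrift://hadoop-master:9083")
      // Required so the session uses the Hive catalog at all.
      .enableHiveSupport()
      .getOrCreate()

    // Tables created through Hive should now be visible.
    spark.sql("show databases").show()
    spark.stop()
  }
}
```

Without `enableHiveSupport()` (or with no reachable metastore URI), Spark falls back to the in-process Derby catalog, which is exactly what the derby.log above shows.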

PeterLiuY commented 1 year ago

Even after adding it, I still can't access it.

vonkonyoung commented 1 year ago

Hello, your email has been received. Thank you for your cooperation!