rychu151 closed this issue 5 months ago
Hi, have you found a solution yet? I have the same problem when using Hive 4.0 with MinIO:
pyspark.errors.exceptions.captured.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Failed to create external path s3a://wba/warehouse/wba.db for database wba. This may result in access not being allowed if the StorageBasedAuthorizationProvider is enabled: null)
I gave up on using the Hive metastore. Nessie has no compatibility issues.
Hi @rychu151, I saw you set 'fs.s3a.endpoint' to localhost. I don't think it should be localhost, because Hive and MinIO run in different containers. Did you try setting 'http://minio:9000' as the fs.s3a.endpoint parameter?
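For reference, the relevant entries might look roughly like this in spark-defaults.conf (a sketch only; the service name `minio` and the path-style/SSL settings are assumptions about the compose setup):

```
spark.hadoop.fs.s3a.endpoint                http://minio:9000
spark.hadoop.fs.s3a.path.style.access       true
spark.hadoop.fs.s3a.connection.ssl.enabled  false
```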
I had the same issue and solved it by adding this config in metastore-site.xml:
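Roughly these S3A properties (the endpoint and credentials below are placeholders, adjust them for your own setup):

```xml
<property>
  <name>fs.s3a.endpoint</name>
  <value>http://minio:9000</value>
</property>
<property>
  <name>fs.s3a.path.style.access</name>
  <value>true</value>
</property>
<property>
  <name>fs.s3a.access.key</name>
  <value>minioadmin</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>minioadmin</value>
</property>
```

With these in place the metastore itself can create the external path on MinIO, which is what the `Failed to create external path s3a://...` error is about.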
Query engine
Spark
Question
I'm trying to set up a local development environment for testing purposes using Docker.
The goal is to save a DataFrame in Iceberg format with Hive metadata.
Here is my current docker-compose:
spark-defaults.conf:
and hive-site.xml
Using the MinIO UI I have created a bucket called `warehouse` and set it to public access. The goal is to save the DataFrame in Iceberg format with Hive metadata so I will be able to browse this data using Apache Druid.
In order to create a table I use PySpark:
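The session is created with an Iceberg catalog named `hive_prod`, roughly like the sketch below (the metastore URI, warehouse path, and credentials are placeholders):

```python
from pyspark.sql import SparkSession

# Sketch of the session setup; URIs, paths, and credentials are placeholders.
spark = (
    SparkSession.builder
    .appName("iceberg-hive-test")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    # Iceberg catalog "hive_prod" backed by the Hive metastore
    .config("spark.sql.catalog.hive_prod", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.hive_prod.type", "hive")
    .config("spark.sql.catalog.hive_prod.uri", "thrift://hive-metastore:9083")
    .config("spark.sql.catalog.hive_prod.warehouse", "s3a://warehouse/")
    # S3A settings so Spark can reach MinIO
    .config("spark.hadoop.fs.s3a.endpoint", "http://minio:9000")
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .config("spark.hadoop.fs.s3a.access.key", "minioadmin")
    .config("spark.hadoop.fs.s3a.secret.key", "minioadmin")
    .getOrCreate()
)
```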
spark.sql("SHOW DATABASES ").show() prints only
default
databasewhen i try to create a database like below:
spark.sql('CREATE DATABASE IF NOT EXISTS hive_prod.testing')
I get the following error:
Does anyone understand why?