trinodb / trino

Official repository of Trino, the distributed SQL query engine for big data, formerly known as PrestoSQL (https://trino.io)
Apache License 2.0

How to configure multiple S3 storage accounts in a single catalog #24087

Closed luhea closed 1 week ago

luhea commented 1 week ago

Trino: 464, Hive Metastore: 3.1.3. I want to use the Hive metastore to read data from multiple S3 endpoints in a single catalog. In the new version, `hive.s3.security-mapping.config-file` can be configured in the Hive catalog.

My configuration: [screenshot attached]
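Since the screenshot is not available, here is a minimal sketch of what such a security mapping file could look like, assuming the legacy S3 file system's JSON mapping format with per-prefix credentials and a per-mapping `endpoint` override. The endpoint URLs and key values are placeholders, not taken from the issue; only the bucket names come from the DDLs below:

```json
{
  "mappings": [
    {
      "prefix": "s3://flinktest/",
      "endpoint": "https://endpoint1.example.com",
      "accessKey": "ACCESS_KEY_1",
      "secretKey": "SECRET_KEY_1"
    },
    {
      "prefix": "s3://kafka2hudi-ceph-s3-test/",
      "endpoint": "https://endpoint2.example.com",
      "accessKey": "ACCESS_KEY_2",
      "secretKey": "SECRET_KEY_2"
    }
  ]
}
```

The `InvalidAccessKeyId` error on the second endpoint suggests the mapping for that prefix is not matching, so requests fall back to the first (or default) credentials.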

Selecting from table1 on endpoint1 works.

But selecting from table2 on endpoint2 fails with:

```
Query 20241109_094613_00001_45w7b failed: null (Service: Amazon S3; Status Code: 403; Error Code: InvalidAccessKeyId; Request ID: tx00000bde4e6dd0650a2da-00672f2f66-8265-test_bd_zone; S3 Extended Request ID: 8265-test_bd_zone-test_bd_zonegroup; Proxy: null)
com.amazonaws.services.s3.model.AmazonS3Exception: null (Service: Amazon S3; Status Code: 403; Error Code: InvalidAccessKeyId; Request ID: tx00000bde4e6dd0650a2da-00672f2f66-8265-test_bd_zone; S3 Extended Request ID: 8265-test_bd_zone-test_bd_zonegroup; Proxy: null)
```

table1 DDL (a closing parenthesis was missing before `WITH` in the original):

```sql
CREATE TABLE hivetest.cephk8stest.s3_test (
    _hoodie_commit_time varchar,
    _hoodie_partition_path varchar,
    _hoodie_record_key varchar,
    _hoodie_commit_seqno varchar,
    _hoodie_file_name varchar
) WITH (
    external_location = 's3a://flinktest/table1/',
    format = 'PARQUET'
)
```

table2 DDL:

```sql
CREATE TABLE cephk8stest.s3_test_tb2 (
    _hoodie_commit_time varchar,
    _hoodie_partition_path varchar,
    _hoodie_record_key varchar,
    _hoodie_commit_seqno varchar,
    _hoodie_file_name varchar
) WITH (
    format = 'PARQUET',
    external_location = 's3a://kafka2hudi-ceph-s3-test/table2/'
)
```