dbeaver / dbeaver

Free universal database tool and SQL client
https://dbeaver.io
Apache License 2.0

Connect Cloud Store to a third-party endpoint URL #22803

Open andrescolodrero opened 7 months ago

andrescolodrero commented 7 months ago

Hi, is it possible to configure DBeaver to use an S3-type bucket storage account? We have our own internal S3-compatible storage. For example, I can access it from the command line using an endpoint URL instead of an AWS region:

aws s3 ls s3://bucket--xxxxxxxxxxxxxxxxxxxxxxxx --endpoint-url=https://s3storagecluster.mydomain.com --profile my-profile

I'd like DBeaver to connect to that storage account and be able to query DuckDB files stored there.
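For context, the `--endpoint-url` flag above corresponds to an explicit endpoint override when the same request is made through an AWS SDK. Below is a minimal sketch using the AWS SDK for Java v2; the endpoint, profile, and bucket names are placeholders taken from the command above, and this is only an illustration of the override, not anything DBeaver-specific:

```java
import java.net.URI;

import software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.ListObjectsV2Request;

public class S3CompatibleList {
    public static void main(String[] args) {
        // Point the client at the internal S3-compatible cluster instead of an AWS regional endpoint.
        try (S3Client s3 = S3Client.builder()
                .endpointOverride(URI.create("https://s3storagecluster.mydomain.com")) // placeholder endpoint
                .region(Region.US_EAST_1) // a region is still required for request signing, even with an override
                .credentialsProvider(ProfileCredentialsProvider.create("my-profile")) // placeholder profile
                .build()) {

            ListObjectsV2Request request = ListObjectsV2Request.builder()
                    .bucket("bucket--xxxxxxxxxxxxxxxxxxxxxxxx") // placeholder bucket name
                    .build();

            s3.listObjectsV2(request).contents()
                    .forEach(obj -> System.out.println(obj.key() + " (" + obj.size() + " bytes)"));
        }
    }
}
```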

LonwoLonwo commented 7 months ago

Hello @andrescolodrero

It looks like DBeaver's cloud explorer can be helpful for you: https://github.com/dbeaver/dbeaver/wiki/Cloud-File-Explorer.

andrescolodrero commented 7 months ago

Hi @LonwoLonwo, I saw that, but I can't choose an endpoint URL to connect to an S3-"like" storage; the dialog assumes I'm connecting to AWS. I can only choose "Regions", which I guess are predefined endpoints.

andrescolodrero commented 7 months ago

Hi again, I set up the AWS CLI on my Windows machine and configured the default profile like this:

[default]
ca_bundle = c:/mycert
ignore_configure_endpoint_urls = true
endpoint_url = https://my_s3_storage
s3 =
  endpoint_url = https://my_s3_storage

I tested the setup with "aws s3 ls" and it works correctly.

From DBeaver, I set up the latest AWS client and I get this error:

The AWS Access Key Id you provided does not exist in our records. (Service: S3, Status Code: 403, Request ID: G7HMFXA0Q5ZBD162, Extended Request ID: Xmu1eX769HsFtDr9/fxDEgZ1hUGLQdQ9iVTH0mcSWS79m7YkqQJZBPUD0FBtWT2+TjJ7Lfdxgjk=)

I can't find more logs or figure out how to debug this, but I think DBeaver is ignoring the endpoint URL.
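For what it's worth, that 403 is consistent with the request going to the public AWS endpoint rather than the internal one: a client built from the profile's credentials alone resolves the regional amazonaws.com endpoint unless the endpoint is overridden, and AWS then rejects credentials it has never issued. A rough sketch of the difference, again using the AWS SDK for Java v2 with placeholder names, and not a claim about how DBeaver actually builds its client:

```java
import java.net.URI;

import software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;

public class EndpointComparison {
    public static void main(String[] args) {
        ProfileCredentialsProvider profile = ProfileCredentialsProvider.create("default");

        // Without an override the SDK resolves the public regional endpoint
        // (https://s3.<region>.amazonaws.com), so internal credentials are rejected with
        // "The AWS Access Key Id you provided does not exist in our records".
        S3Client publicAws = S3Client.builder()
                .region(Region.EU_WEST_1)
                .credentialsProvider(profile)
                .build();

        // With an explicit override, the same credentials are sent to the internal cluster instead.
        S3Client internal = S3Client.builder()
                .endpointOverride(URI.create("https://my_s3_storage")) // placeholder endpoint from the profile above
                .region(Region.EU_WEST_1)
                .credentialsProvider(profile)
                .build();

        publicAws.close();
        internal.close();
    }
}
```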

LonwoLonwo commented 7 months ago

But how exactly are you trying to add these parameters in DBeaver?

Do you use this dialog, or something else?

(screenshot: Cloud Explorer connection dialog)

andrescolodrero commented 7 months ago

Just like this, using the same profile as the AWS CLI.

(screenshot attached)

LonwoLonwo commented 7 months ago

Does enabling SSO also not help?

andrescolodrero commented 7 months ago

Same result. The first time I hadn't installed the AWS CLI at all; does DBeaver come with its own AWS client? Is there somewhere I can check debug logs? It seems to be ignoring the endpoint-url settings and forcing a region, but I'm not sure; I guess I could see that in the debug logs.

andrescolodrero commented 7 months ago

I found this config in the workspace and some logs:

aws-clouds.json

[
  {
    "defaultRegions": [],
    "enabledServices": [
      "rds",
      "redshift",
      "athena",
      "dynamodb",
      "documentdb",
      "keyspaces"
    ],
    "govRegionsEnabled": false,
    "isoRegionsEnabled": false,
    "s3Disabled": false,
    "autoRegistrationEnabled": false,
    "federatedAccessEnabled": false,
    "credentials": {
      "profileName": "default",
      "defaultAwsCredentials": false,
      "sessionCredentials": false,
      "ssoOverCli": false,
      "crossAccountAccess": false
    },
    "cloudId": "test",
    "cloudName": "test"
  }
]

LonwoLonwo commented 6 months ago

You can find instructions about logs here: https://dbeaver.com/docs/wiki/Log-files/.

LonwoLonwo commented 6 months ago

OK. Probably, for now, DBeaver could handle the embedded databases in that storage this way: we would need to create some kind of local file copy and then connect to it (similar to what we did for SQLite). So, thanks for the feature request.
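As an illustration of the copy-then-connect idea described above (purely a sketch, not DBeaver's implementation; the endpoint, bucket, and object key are placeholders), the flow would roughly be: download the .duckdb file from the S3-compatible endpoint to a local temporary file, then open that local copy with the DuckDB JDBC driver:

```java
import java.net.URI;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;

public class CopyThenConnect {
    public static void main(String[] args) throws Exception {
        Path localCopy = Files.createTempFile("remote", ".duckdb");
        Files.delete(localCopy); // getObject requires that the destination file does not exist yet

        // 1. Copy the database file from the S3-compatible storage to the local machine.
        try (S3Client s3 = S3Client.builder()
                .endpointOverride(URI.create("https://s3storagecluster.mydomain.com")) // placeholder endpoint
                .region(Region.US_EAST_1)
                .credentialsProvider(ProfileCredentialsProvider.create("my-profile")) // placeholder profile
                .build()) {
            s3.getObject(GetObjectRequest.builder()
                    .bucket("bucket--xxxxxxxxxxxxxxxxxxxxxxxx") // placeholder bucket
                    .key("analytics/my_database.duckdb")        // placeholder object key
                    .build(), localCopy);
        }

        // 2. Connect to the local copy through the DuckDB JDBC driver (duckdb_jdbc on the classpath).
        try (Connection conn = DriverManager.getConnection("jdbc:duckdb:" + localCopy);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 42")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}
```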