aws / amazon-redshift-jdbc-driver

Redshift JDBC Driver. It supports the JDBC 4.2 specification.
Apache License 2.0

Connecting to an external schema whose name contains spaces does not list the tables inside it #106

Closed paoliniluis closed 7 months ago

paoliniluis commented 8 months ago

Driver version

Tested with the latest driver

Redshift version

PostgreSQL 8.0.2 on i686-pc-linux-gnu, compiled by GCC gcc (GCC) 3.4.2 20041017 (Red Hat 3.4.2-6.fc3), Redshift 1.0.62312

Client Operating System

Using Metabase and DataGrip

JAVA/JVM version

11

Table schema

CREATE SCHEMA "a schema with spaces"

Problem description

1) Create a schema with spaces in its name on RDS.
2) Create an external schema in Redshift that connects to the schema in RDS (see the sketch below).
3) You will see the schema, but not the tables inside it.
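
For anyone trying to reproduce this, here is a minimal Java sketch of the setup described above, driven through the JDBC driver. The endpoint, credentials, IAM role, and secret ARN are all placeholders, and the exact CREATE EXTERNAL SCHEMA options depend on your federated-query setup:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class ReproSetup {
    public static void main(String[] args) throws Exception {
        // Placeholder cluster endpoint and credentials.
        String url = "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev";
        try (Connection conn = DriverManager.getConnection(url, "awsuser", "example-password");
             Statement stmt = conn.createStatement()) {
            // Step 2: an external schema in Redshift pointing at the RDS
            // PostgreSQL schema whose name contains spaces (created in step 1).
            stmt.execute(
                "CREATE EXTERNAL SCHEMA federated_sample "
              + "FROM POSTGRES "
              + "DATABASE 'rds_db' SCHEMA 'a schema with spaces' "
              + "URI 'example-rds.abc123.us-east-1.rds.amazonaws.com' PORT 5432 "
              + "IAM_ROLE 'arn:aws:iam::123456789012:role/example-role' "
              + "SECRET_ARN 'arn:aws:secretsmanager:us-east-1:123456789012:secret:example'");
        }
    }
}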

JDBC trace logs

NA

Reproduction code

NA

bhvkshah commented 8 months ago

@paoliniluis thanks for opening the issue. I'll take a look and get back to you once I have an update.

bhvkshah commented 8 months ago

The title says it lists the tables, but the description says it does not. Can you please clarify?

paoliniluis commented 8 months ago

@bhvkshah my bad, it will NOT list the tables inside the schema that was named with spaces in RDS

bhvkshah commented 8 months ago

Gotcha. Are you setting the JDBC connection property databaseMetadataCurrentDbOnly=false?

Also, can you please send over DEBUG-level driver logs? Instructions on how to configure logging can be found here: https://docs.aws.amazon.com/redshift/latest/mgmt/jdbc20-configuration-options.html
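
For reference, a sketch of passing both options as connection properties. The endpoint and credentials are placeholders, and the exact accepted logLevel values are listed on the configuration page linked above:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class ConnectWithDebugLogging {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("user", "awsuser");               // placeholder
        props.setProperty("password", "example-password");  // placeholder
        // Report metadata across databases, not just the current one.
        props.setProperty("databaseMetadataCurrentDbOnly", "false");
        // Driver-side DEBUG logging; see the linked configuration options
        // page for the accepted values and defaults.
        props.setProperty("logLevel", "DEBUG");
        props.setProperty("logPath", "/tmp/redshift-jdbc-logs");
        String url = "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev";
        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected as: " + conn.getMetaData().getUserName());
        }
    }
}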

paoliniluis commented 8 months ago

yup, tested that.

If you look at the screenshot, the federated_sample3 schema in Redshift connects to the "a third schema but with spaces" schema in RDS. (screenshot attached)
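
One way to narrow this down (a hypothetical diagnostic; the table name is a placeholder): if a direct query against the external schema succeeds while DatabaseMetaData.getTables returns no rows for it, the problem is likely in the driver's metadata queries rather than on the server side. Metabase and DataGrip both rely on the getTables-style metadata calls:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class MetadataDiagnostic {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev";
        try (Connection conn = DriverManager.getConnection(url, "awsuser", "example-password")) {
            // Direct query: succeeds if the federated table is reachable at all.
            try (Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT COUNT(*) FROM \"federated_sample3\".\"example_table\"")) {
                if (rs.next()) {
                    System.out.println("Direct query row count: " + rs.getLong(1));
                }
            }
            // Metadata listing: this is what the client tools display.
            try (ResultSet tables = conn.getMetaData()
                     .getTables(null, "federated_sample3", "%", null)) {
                while (tables.next()) {
                    System.out.println("getTables: " + tables.getString("TABLE_NAME"));
                }
            }
        }
    }
}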

bhvkshah commented 8 months ago

Looks like the tables are visible. Is the problem resolved?

bhvkshah commented 7 months ago

@paoliniluis following up, is the issue resolved?

paoliniluis commented 7 months ago

Hi @bhvkshah, thanks for taking the time. I deleted the cluster I had with that specific use case, so you can close this if needed.

bhvkshah commented 7 months ago

Glad the issue was resolved, @paoliniluis . Please feel free to reach out if you have any further issues in the future!