Closed mattppal closed 1 month ago
This issue has been marked as Stale because it has been open for 180 days with no activity. If you would like the issue to remain open, please comment on the issue or else it will be closed in 7 days.
Bump
```yaml
row_format: serde 'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
file_format: parquet
```
The above works for me. No need for `stored_as`.
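For context, these two keys belong under a source table's `external:` block in a dbt-external-tables `sources.yml`. A minimal sketch of where they sit (the table name, column, and S3 location below are placeholders, not taken from this issue):

```yaml
# Hypothetical sources.yml sketch for dbt-external-tables on Redshift Spectrum.
# Table name, column, and S3 location are placeholders.
version: 2

sources:
  - name: spectrum
    schema: spectrum
    tables:
      - name: abc
        external:
          location: "s3://my-bucket/abc/"  # placeholder path
          row_format: serde 'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
          file_format: parquet             # per the comment above, no stored_as needed
        columns:
          - name: id
            data_type: int
```

Running `dbt run-operation stage_external_sources` would then generate the `CREATE EXTERNAL TABLE` DDL from this config.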
Although we are closing this issue as stale, it's not gone forever. Issues can be reopened if there is renewed community interest. Just add a comment to notify the maintainers.
Describe the bug
When defining external tables in Redshift Spectrum stored as parquet, the expected DDL is not returned by dbt-external-tables, rendering the external table unreadable.

Steps to reproduce
Config:
Expected results
SHOW EXTERNAL TABLE spectrum.abc
should yield the DDL below, since this is what is output when I run:
Actual results
The above command returns:
System information
Which database are you using dbt with?
The output of `dbt --version`:
The operating system you're using:
Python 3.9.0
Additional context