Closed: TPreece101 closed this issue 1 year ago
I have realised my mistake here - I thought the library was picking up the partitions from S3 but it looks like they're generated based on date ranges.
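For anyone landing here later: in dbt-external-tables, Redshift partition values are declared (or generated) in the source YAML rather than discovered from S3. A rough sketch of what a date-range partition definition can look like — the source/column names and dates below are made up, and the exact macro and argument names should be checked against the package README:

```yaml
sources:
  - name: my_external_source        # hypothetical names
    tables:
      - name: my_events
        external:
          location: "s3://my-bucket/events/"
          partitions:
            - name: event_date
              data_type: date
              vals:                 # values generated from a date range,
                macro: dbt.dates_in_range   # not read from S3
                args:
                  start_date: "2022-01-01"
                  end_date: "2022-01-31"
              path_macro: dbt_external_tables.year_month_day
```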
I've also realised that, for my use case, since I'm unloading a query to S3 and then creating the table, I can actually get the partition values from the query itself, so I'll give that a try.
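To illustrate the "generated based on date ranges" behaviour mentioned above, here is a minimal Python sketch of what a date-range partition-value generator does conceptually (this is not the package's actual macro, just an illustration of the idea):

```python
from datetime import date, timedelta

def dates_in_range(start: date, end: date, fmt: str = "%Y-%m-%d") -> list[str]:
    """Return formatted date strings from start to end inclusive,
    roughly what a date-range partition macro would produce."""
    days = (end - start).days
    return [(start + timedelta(days=i)).strftime(fmt) for i in range(days + 1)]

# Each generated value becomes one partition to register on the external table.
vals = dates_in_range(date(2022, 1, 1), date(2022, 1, 3))
print(vals)  # ['2022-01-01', '2022-01-02', '2022-01-03']
```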
Describe the bug
I have managed to set up some unpartitioned sources just fine, however when I try to set up a partitioned source I get the following error:
It looks like it's failing here: https://github.com/dbt-labs/dbt-external-tables/blob/main/macros/plugins/redshift/refresh_external_table.sql#L17
What kind of object is it meant to be, if not a dictionary? It looks like it just parses the `partitions` key in my `sources.yml`. I'm also interested, since I'm trying to write some custom materialisations for external tables (which I'm thinking of contributing once they're a bit more battle-tested): how does the package find the partition paths and values? I've not found a way in Redshift to get these.
Steps to reproduce
Sample `sources.yml`:

I then run `dbt run-operation stage_external_sources` to get my error message.

System information
The contents of your `packages.yml` file:

Which database are you using dbt with? Redshift
The output of `dbt --version`:

The operating system you're using: Debian GNU/Linux 10 (buster)
The output of `python --version`: Python 3.8.12

Let me know if you need any more information. I'm looking forward to getting to grips with this library better 😊