Closed sogunsemi closed 5 months ago
The error makes it seem like the problem is happening because you're trying to add a varchar and a bigint, but I don't see where that's happening in the query you posted (i.e., there's no `+` operator). So I'm a bit puzzled -- there's nowhere in the `example_stg` model where you're adding something?
So in that example, `table1` is a model that contains the columns `col1` and `col2`, which are both of type `bigint`. I'm attempting to combine those two columns into a `map`. The goal is to create a new column called `mapped` of type `map(varchar, bigint)`.
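To make that concrete, here is a minimal sketch of what such a staging model might look like in DuckDB SQL. The model and column names are just the placeholders from the example above, and this uses DuckDB's `map(keys, values)` function, which takes a list of keys and a list of values:

```sql
-- Hypothetical dbt staging model: build a map(varchar, bigint)
-- column from two bigint columns using DuckDB's map() function.
select
    col1,
    col2,
    map(['col1', 'col2'], [col1, col2]) as mapped
from {{ ref('table1') }}
```

The resulting `mapped` column has type `MAP(VARCHAR, BIGINT)`, since the keys are string literals and the values are the two `bigint` columns.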
Yeah, I understand the goal -- I'm confused because the error is about the addition operator, `'+(VARCHAR, BIGINT)'`, and yet I don't see any `+` signs in the query you posted that was throwing that error. Did you post the whole query, or simplify it somewhat to remove some stuff that might have included a `+` sign?
Ah right, sorry -- I hadn't posted the actual query, just an example of it. I thought the error had to do with the `map` function, but it turns out it does not.

The real issue was that in another model referenced in `example_stg`, a column that should have been a `bigint` was created as a `string`. This `string` column was then added to another `bigint` column (`col1 + col2`), which caused the error. I didn't understand that's what the error message was referring to.
This was a user error, I'll close this, thanks!
I'm trying to replicate a transformation I have in my `dbt-databricks` setup. The data I'm reading is from a parquet file:

Here is what I'm trying with `dbt-duckdb`:

This gives me an error that seems to imply I can't create a map of `(VARCHAR, BIGINT)`. I ran the query in the DuckDB CLI and it worked just fine. Could there be an issue with how the dbt-duckdb library handles the map type? Here is the error: