Closed: chris-okorodudu closed this issue 1 month ago.
Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise PR to address this issue please do so, no need to wait for approval.
@chris-okorodudu You can add bindings as a keyword argument within self._hook.execute_query.
Here is an example:
"bindings": {
"1": {
"type": "FIXED",
"value": "123"
}
}
For more details on the correct format, please refer to the following article: sql-api-bind-variables. Please be mindful that Snowflake does not currently support bindings in multi-statement SQL requests.
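For illustration, here is a minimal sketch of calling the hook with bindings, assuming SnowflakeSqlApiHook.execute_query accepts the bindings keyword argument described above; the connection id, table, and bind value are placeholders:

from airflow.providers.snowflake.hooks.snowflake_sql_api import SnowflakeSqlApiHook

# Placeholder connection id; "?" is the SQL API positional bind marker.
hook = SnowflakeSqlApiHook(snowflake_conn_id="my_snowflake_conn")
query_ids = hook.execute_query(
    "SELECT * FROM my_table WHERE id = ?",
    statement_count=1,
    bindings={"1": {"type": "FIXED", "value": "123"}},
)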
While not entirely fixed, it seems that this is on the Snowflake side, and #42719 at least provides an explanation.
Apache Airflow version
2.9.3
If "Other Airflow 2 version" selected, which one?
No response
What happened?
The SnowflakeSqlApiOperator does not resolve parameters in SQL despite accepting a parameters argument.
This is due to the fact that it executes by initializing a SnowflakeSqlApiHook and then executing the queries without ever passing the parameters. This means that parameters passed in and referenced as they would be in other Snowflake operators, i.e. %(param)s, will not be resolved, and the execution will fail.
What you think should happen instead?
The parameters should be resolved either before the SQL is passed to the SnowflakeSqlApiHook, or as part of the SnowflakeSqlApiHook itself. One possible shape of that resolution is sketched below.
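As an illustration only (a hypothetical helper, not the provider's actual code), such resolution could rewrite pyformat placeholders into SQL API positional bindings before the statement reaches the hook:

import re

def to_sql_api_bindings(sql: str, parameters: dict) -> tuple[str, dict]:
    # Rewrite %(name)s placeholders into "?" markers and collect a
    # SQL API bindings payload keyed by 1-based position.
    bindings: dict[str, dict] = {}

    def replace(match: re.Match) -> str:
        bindings[str(len(bindings) + 1)] = {
            "type": "TEXT",
            "value": str(parameters[match.group(1)]),
        }
        return "?"

    return re.sub(r"%\((\w+)\)s", replace, sql), bindings

# Example result: ("SELECT * FROM my_table WHERE id = ?",
#                  {"1": {"type": "TEXT", "value": "123"}})
rewritten_sql, bindings = to_sql_api_bindings(
    "SELECT * FROM my_table WHERE id = %(my_id)s", {"my_id": 123}
)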
How to reproduce
To reproduce, try passing any parameter and referencing it in your SQL via the %(param)s syntax.
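For example, a minimal task along these lines (connection id, table, and parameter name are placeholders) fails because %(my_id)s is never substituted:

from airflow.providers.snowflake.operators.snowflake import SnowflakeSqlApiOperator

SnowflakeSqlApiOperator(
    task_id="snowflake_sql_api_query",
    snowflake_conn_id="my_snowflake_conn",
    sql="SELECT * FROM my_table WHERE id = %(my_id)s",
    parameters={"my_id": 123},
    statement_count=1,
)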
Operating System
all
Versions of Apache Airflow Providers
Tested with multiple versions, most recently