Closed · Altons closed this issue 3 years ago
As a data engineer, I would like to be able to pass parameters to a pipeline in BigQuery, e.g.:

```bash
dataform run --tags countryReport --vars {country:"UK", department:"marketing"}
```

which would inject the parameters into my SQLX as:

```sqlx
select * from ${ref("report_mart")} where country = ${vars.country} and department = ${vars.department}
```

and the compiled version would look like this:

```sql
select * from `my_project.my_dataset.report_mart` where country = "UK" and department = "marketing"
```

This would enable creating or updating tables while reusing the same logic.
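For context, a full SQLX file using these parameters might look like the sketch below. The config block, the `countryReport` tag wiring, and the `vars` accessor are assumptions about how the proposal could surface in a definition file, not an existing Dataform API.

```sqlx
config {
  // Picked up by `dataform run --tags countryReport`
  type: "table",
  tags: ["countryReport"]
}

-- `vars.country` / `vars.department` are the hypothetical CLI-injected
-- parameters from the proposal above, not an existing Dataform feature.
select *
from ${ref("report_mart")}
where country = ${vars.country}
  and department = ${vars.department}
```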
I don't love the JSON blob model because bash makes it a bit of a pain to use. I'd suggest:

```
--var foo=bar --var baz=qux --vars=foos=bars,bazs=quxs
```
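To illustrate the quoting issue: a JSON blob has to be escaped or quoted so the shell doesn't mangle the braces and quotes, whereas repeated flags or a comma-separated list pass through untouched. Both invocations below use the proposed, not-yet-existing syntax.

```bash
# JSON blob style: the whole argument must be protected from bash
# (hypothetical --vars syntax from the issue above)
dataform run --tags countryReport --vars '{"country": "UK", "department": "marketing"}'

# Repeated-flag / key=value style: no quoting gymnastics
# (hypothetical --var / --vars syntax from this comment)
dataform run --tags countryReport --var country=UK --var department=marketing
dataform run --tags countryReport --vars=country=UK,department=marketing
```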