Closed: ArunT317 closed this issue 1 year ago.
Hi @ArunT317! The dbt deps command is not supported by dbt Cloud. Or rather, it is always included in the job steps, and dbt Cloud doesn't allow you to add it there explicitly. Can you check whether dbt-cloud job run --steps-override '["dbt compile"]' works?
I also made some fixes to the --steps-override argument parsing in the latest 0.7.3 release. Remember to update your dbt-cloud-cli package using pip install --upgrade dbt-cloud-cli
Hi @stumelius - I am able to pass one statement in the --steps-override argument, and that works as expected. But my requirement is to pass two statements in the --steps-override argument: ["dbt run-operation", "dbt run -m sample"]. This works as expected through a Python call to the API, but fails when called through the dbt-cloud CLI job run command.
Thanks for the update. I will try it and keep you posted.
@stumelius - Still no luck; the command below is failing:
dbt-cloud job run --api-token $token --account-id $account --job-id $jobID --steps-override '["dbt compile", "dbt ls --models source:test+"]' --wait;
@ArunT317 Can you try running the command in DEBUG mode?
LOG_LEVEL=DEBUG dbt-cloud job run --api-token $token --account-id $account --job-id $jobID --steps-override '["dbt compile", "dbt ls --models source:test+"]' --wait
@ArunT317 Circling back to this. Did you manage to get this working?
Sorry @stumelius, I did not get a chance to try the DEBUG log level command. But I achieved the required functionality by using the API directly from Python code.
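For anyone landing here: a minimal sketch of what "calling the API directly from Python" can look like. This is not the dbt-cloud-cli implementation; the endpoint shape follows the dbt Cloud Administrative API v2 job-trigger route, and the `trigger_job`/`build_payload` helpers and the "Triggered via API" cause string are names invented for this example.

```python
import json
import urllib.request

def build_payload(steps):
    # steps is a plain Python list of dbt commands; no shell quoting is involved
    return {"cause": "Triggered via API", "steps_override": list(steps)}

def trigger_job(api_token, account_id, job_id, steps):
    """Trigger a dbt Cloud job run with overridden steps (hypothetical helper)."""
    url = (f"https://cloud.getdbt.com/api/v2/accounts/{account_id}"
           f"/jobs/{job_id}/run/")
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(steps)).encode(),
        headers={"Authorization": f"Token {api_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    # urlopen raises on HTTP errors, so a failed request surfaces immediately
    # instead of as a KeyError when indexing into the response body
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"]["id"]
```

Because the steps are passed as a real Python list, the bash quoting problem from the thread never arises.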
@ArunT317 Got it. I'm closing this ticket now, but if you ever get back to this and still run into issues, let me know.
I am trying to invoke a dbt job with multiple dbt commands but am unable to do so. I am facing the error below:
["dbt deps", "dbt compile"]
Traceback (most recent call last):
  File "C:\Users\ArunT\Anaconda3\lib\runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\ArunT\Anaconda3\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "C:\Users\ArunT\Anaconda3\Scripts\dbt-cloud.exe\__main__.py", line 7, in <module>
  File "C:\Users\ArunT\Anaconda3\lib\site-packages\click\core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "C:\Users\ArunT\Anaconda3\lib\site-packages\click\core.py", line 782, in main
    rv = self.invoke(ctx)
  File "C:\Users\ArunT\Anaconda3\lib\site-packages\click\core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "C:\Users\ArunT\Anaconda3\lib\site-packages\click\core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "C:\Users\ArunT\Anaconda3\lib\site-packages\click\core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "C:\Users\ArunT\Anaconda3\lib\site-packages\click\core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "C:\Users\ArunT\Anaconda3\lib\site-packages\dbt_cloud\cli.py", line 106, in run
    run_id = response.json()["data"]["id"]
KeyError: 'id'
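A likely reading of that traceback (an assumption, not confirmed in the thread): the API rejected the request, so the JSON body's "data" object has no "id" key, and the CLI's unguarded `response.json()["data"]["id"]` raises KeyError instead of showing the real error. A small sketch with a hypothetical error payload:

```python
import json

# Hypothetical payload a rejected request might return; the real body may differ.
error_body = '{"status": {"code": 400, "is_success": false}, "data": {}}'
payload = json.loads(error_body)

# Defensive lookup: yields None instead of crashing with KeyError: 'id'
run_id = payload.get("data", {}).get("id")
```

Running with LOG_LEVEL=DEBUG, as suggested above, is the way to see the actual response body and the real rejection reason.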
I am using a Bash terminal on Windows, with Python version 3.8.8.
I have checked the Python code, and it expects a Python list of strings, so I tried all the ways I could think of to send a Bash array into that list, but could not get it working.
Is this supported? If so, what is the appropriate way to send multiple dbt commands from Bash to the dbt-cloud job run command?
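Assuming the CLI parses --steps-override as a JSON array (which the commands that do work in this thread suggest), the bash single quotes exist only to deliver that JSON string intact, inner double quotes included. A minimal sketch of what must arrive on the Python side:

```python
import json

# What '["dbt deps", "dbt compile"]' delivers after bash strips the single quotes:
raw = '["dbt deps", "dbt compile"]'
steps = json.loads(raw)  # a Python list of strings, as the CLI code expects

# A common quoting mistake: losing the inner double quotes yields invalid JSON.
bad = False
try:
    json.loads("[dbt deps, dbt compile]")
except json.JSONDecodeError:
    bad = True  # this is the failure mode to avoid
```

So the fix is usually in the shell quoting, not in building a Bash array: pass one single-quoted JSON string containing all the steps.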