deepuak opened 1 year ago
To add to the above, I am currently facing some issues when using brickflow to deploy from my Windows machine.
1. Deploy: I am using `brickflow projects deploy --project hello-world-brickflow -e local` and got the below error:
```
Starting upload of bundle files
Uploaded bundle files at /Users/drg.devops@clarivate.com/.brickflow_bundles/hello-world-brickflow/local/files!
Starting resource deployment
Error: terraform apply: exit status 1

Error: cannot create pipeline: storage path must be absolute

  with databricks_pipeline.test_hello_world,
  on bundle.tf.json line 349, in resource.databricks_pipeline.test_hello_world:
 349: }

Error: Command '['.databricks/bin/cli\0.203.0\databricks', 'bundle', 'deploy', '-e', 'hello-world-brickflow-local']' returned non-zero exit status 1.
```
Please note that the Databricks CLI is configured and the cluster_id is specified.
2. For now the DLT spark_script is just a placeholder file, and when I try to add the dlt module, I see that I am not able to import it as part of brickflow. What would be the right approach here?
@deepuak can you please search for "bundle.tf.json" in your local repository and share line 349? My hunch is that, for the DLT task, the path for the notebook is not being resolved properly. Can you also print your repo structure and the path you gave for the DLT task?
Hi @asingamaneni,
please find the relevant "bundle.tf.json" section below:
"databricks_pipeline": {
"test_hello_world": {
"channel": "current",
"development": true,
"edition": "advanced",
"name": "drg_devops_hello world",
"storage": "123",
"library": [
{
"notebook": {
"path": "/Users/drg.devops@clarivate.com/.brickflow_bundles/hello-world-brickflow/local/files/scripts/spark_script_2"
}
}
]
}
}
Also, please find the repo structure below:
Also, I am not able to resolve the dlt module. I was trying it from brickflow as below. Do I need to install dlt separately?
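(The screenshot of the attempt did not survive in this thread; presumably it was along these lines:)

```python
# Attempted locally, e.g. in the IDE alongside brickflow:
import dlt  # fails to resolve locally; dlt is not shipped with brickflow
```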
@deepuak The below worked for me.
We don't have `dlt` in brickflow. Brickflow only supports the deployment of DLT pipelines: it takes the source code as notebooks and deploys them.
The below is sample code for a DLT pipeline in my local setup:
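(The original snippet was not captured here; below is a minimal sketch of a DLT script with placeholder table names and a hypothetical source path. The `dlt` module itself is provided by the Databricks DLT runtime, not by brickflow, which is why it cannot be imported locally.)

```python
# scripts/dlt_script.py  (placeholder filename)
import dlt
from pyspark.sql.functions import col

# `spark` is injected by the DLT pipeline runtime at execution time.

@dlt.table(comment="Raw events ingested from a source location")
def raw_events():
    return spark.read.format("json").load("/mnt/raw/events")  # hypothetical path

@dlt.table(comment="Raw events with null ids filtered out")
def clean_events():
    return dlt.read("raw_events").filter(col("event_id").isNotNull())  # hypothetical column
```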
The below is sample code in my workflow:
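(Also not captured; a sketch of the workflow side, assuming brickflow's `@wf.dlt_task` decorator and `DLTPipeline` class, with placeholder names and cluster id. Check the brickflow docs for the exact parameter list.)

```python
from brickflow import Workflow, Cluster, DLTPipeline

wf = Workflow(
    "hello-world-brickflow",
    default_cluster=Cluster.from_existing_cluster("your-existing-cluster-id"),  # placeholder id
)

@wf.dlt_task
def test_hello_world():
    return DLTPipeline(
        name="hello world",
        notebook_path="scripts/dlt_script.py",  # relative to the project root
        target="hello_world_db",  # a target schema, not a storage path (see the note below)
    )
```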
In the DLTPipeline configuration, please provide `target` instead of `storage`.
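(In the bundle above, `"storage": "123"` is what terraform rejects, since `storage` must be an absolute path. Roughly, the change looks like this; the target schema name is a placeholder:)

```python
from brickflow import DLTPipeline

# Before (fails with: "storage path must be absolute"):
#   DLTPipeline(name="hello world", notebook_path="scripts/spark_script_2.py", storage="123")
# After, with a target schema instead of a storage path:
DLTPipeline(
    name="hello world",
    notebook_path="scripts/spark_script_2.py",
    target="hello_world_db",
)
```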
@deepuak is the issue resolved?
Hi @asingamaneni, I will check.
**Is your feature request related to a problem? Please describe.**
I would like to invoke Delta Live Tables from brickflow.
**Describe the solution you'd like**
Currently, DLT is deployed in Databricks as a wheel file. I would like to deploy the same DLT wheel file using brickflow.