Write code once, deploy to lakehouses everywhere...
Click here to watch the talk on Databricks Asset Bundles at Data & AI Summit 2023.
_The slides from the presentation are available here._
Databricks Asset Bundles, also known simply as bundles, let you programmatically validate, deploy, and run your Databricks projects via the Databricks CLI. A bundle is a collection of one or more related files that contain:
- Any local artifacts (such as source code) to deploy to a remote Databricks workspace before running any related Databricks workflows.
- The declarations and settings for the Databricks jobs, Delta Live Tables pipelines, or MLOps Stacks that act on the artifacts deployed into the workspace.
For more information on bundles, please see the Databricks documentation.
In this repo you'll find a simple example project. The data assets it deploys are defined in the `bundle.yml` file in the project root directory.
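For orientation, here is a minimal sketch of what a `bundle.yml` can look like. The field names follow the general Databricks Asset Bundles schema; the bundle name, workspace URL, and notebook path below are placeholders rather than the actual contents of this repo's file, and the exact schema may differ between the preview and current releases of the Databricks CLI.

```yaml
# Illustrative sketch only -- not the actual bundle.yml in this repo.
bundle:
  name: fe_medium_metrics_example        # hypothetical bundle name

workspace:
  host: https://<your-workspace>.cloud.databricks.com   # workspace to deploy to

resources:
  jobs:
    fe_medium_metrics:                   # job key referenced by `databricks bundle run`
      name: fe_medium_metrics
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/fe_medium_metrics.py   # placeholder path
```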
Make sure you have the Databricks CLI installed so that you can use the `databricks bundle` commands. You'll also want to edit `bundle.yml` and specify the Databricks workspace that you plan to deploy to. Once that's sorted out, you can deploy and run the project using the following commands:
```bash
databricks bundle deploy
databricks bundle run fe_medium_metrics
```
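Before deploying, you can also check that the bundle configuration is well formed with the CLI's validate command:

```bash
# Checks the bundle configuration and reports any errors before you deploy
databricks bundle validate
```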
Please email dabs-preview@databricks.com if you have questions about DABs. If you have questions about the code in this repo, please email rafi.kurlansik@databricks.com.