databricks / databricks-asset-bundles-dais2023


Databricks Asset Bundles

Write code once, deploy to lakehouses everywhere...

Click here to watch the talk on Databricks Asset Bundles at Data & AI Summit 2023.

_The slides from the presentation are available here._

Introduction

Databricks Asset Bundles, also known simply as bundles, enable you to programmatically validate, deploy, and run the projects you are working on in Databricks via the Databricks CLI. A bundle is a collection of one or more related files that contain:

For more information on bundles, please see the following pages in Databricks documentation:

Tutorials

Reference

Analyzing Databricks Medium Posts from Field Engineering

In this repo you'll find a simple project consisting of:

  1. A CSV containing URLs of Medium.com blogs written by Field Engineers at Databricks.
  2. A Delta Live Tables pipeline to ingest and process that data, including logic to scrape Medium.com for the number of claps and reading time.
  3. A notebook report that reads the processed data and visualizes it.
  4. A Databricks Workflow with two tasks - the first to refresh the DLT pipeline and the second to execute the notebook report.
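The scraping step in item 2 can be sketched as a small parsing function. This is a minimal illustration, not the repo's actual pipeline code: it assumes clap counts and reading time appear in the page source as `"clapCount"` and `"readingTime"` fields (Medium's real markup changes often, so a production pipeline would need to adapt).

```python
import re

def parse_medium_stats(html: str) -> dict:
    """Extract clap count and reading time (minutes) from a Medium post's
    page source. The field names below are assumptions for illustration."""
    stats = {"claps": None, "reading_time_min": None}

    # e.g. '"clapCount":1234' embedded in the page's JSON state
    claps = re.search(r'"clapCount"\s*:\s*(\d+)', html)
    if claps:
        stats["claps"] = int(claps.group(1))

    # e.g. '"readingTime":6.2' in the same JSON state
    reading = re.search(r'"readingTime"\s*:\s*([\d.]+)', html)
    if reading:
        stats["reading_time_min"] = float(reading.group(1))

    return stats

# Example with a synthetic page fragment
sample = '{"clapCount":1234,"readingTime":6.2}'
print(parse_medium_stats(sample))  # → {'claps': 1234, 'reading_time_min': 6.2}
```

In the DLT pipeline, a function like this would be applied to the HTML fetched for each URL in the CSV before the results are written to a Delta table.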

These data assets are represented in the bundle.yml file in the project root directory.
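To give a sense of the shape of that file, here is a minimal bundle configuration sketch for a project like this one. The resource names, paths, and host value are illustrative assumptions, not the repo's actual bundle.yml:

```yaml
bundle:
  name: fe_medium_metrics

workspace:
  host: https://<your-workspace>.cloud.databricks.com  # edit before deploying

resources:
  pipelines:
    medium_metrics_pipeline:            # hypothetical DLT pipeline resource
      name: "Medium Metrics DLT Pipeline"
      libraries:
        - notebook:
            path: ./pipeline/ingest_medium.py   # illustrative path
  jobs:
    fe_medium_metrics:                  # hypothetical workflow with two tasks
      name: "FE Medium Metrics"
      tasks:
        - task_key: refresh_pipeline
          pipeline_task:
            pipeline_id: ${resources.pipelines.medium_metrics_pipeline.id}
        - task_key: notebook_report
          depends_on:
            - task_key: refresh_pipeline
          notebook_task:
            notebook_path: ./report     # illustrative path
```

Defining the pipeline and the workflow in one file is what lets a single `databricks bundle deploy` create or update both resources together.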

Deploying and running this repo

Make sure you have the Databricks CLI installed so that the `databricks bundle` commands are available. You'll also want to edit bundle.yml and specify the Databricks workspace that you plan to deploy to. Once that's done, you can deploy and run the project using the following commands:

```
databricks bundle deploy
databricks bundle run fe_medium_metrics
```

Questions?

If you have questions about DABs, please email dabs-preview@databricks.com. If you have questions about the code in this repo, please email rafi.kurlansik@databricks.com.