
Apache Airflow Plugin for Microsoft Fabric 🚀

Introduction

A Python package that helps data and analytics engineers trigger on-demand runs of Microsoft Fabric job items from Apache Airflow DAGs.

Microsoft Fabric is an end-to-end analytics and data platform designed for enterprises that require a unified solution. It encompasses data movement, processing, ingestion, transformation, real-time event routing, and report building. It offers a comprehensive suite of services including Data Engineering, Data Factory, Data Science, Real-Time Analytics, Data Warehouse, and Databases.

How to Use

Install the Plugin

PyPI package: https://pypi.org/project/apache-airflow-microsoft-fabric-plugin/

pip install apache-airflow-microsoft-fabric-plugin

Prerequisites

Before diving in, note that custom connection forms aren't feasible in Apache Airflow plugins, so you can use the Generic connection type instead. Here's what you need to store:

  1. Connection Id: A name for the connection.
  2. Connection Type: Generic
  3. Login: The Client ID of your service principal.
  4. Password: The refresh token fetched using Microsoft OAuth.
  5. Extra: { "tenantId": "The Tenant Id of your service principal", "clientSecret": "(optional) The Client Secret for your Entra ID App", "scopes": "(optional) Scopes you used to fetch the refresh token" }

    NOTE: Default scopes applied are: https://api.fabric.microsoft.com/Item.Execute.All, https://api.fabric.microsoft.com/Item.ReadWrite.All, offline_access, openid, profile
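
For example, you can create this connection with the Airflow CLI. This is a minimal sketch; the connection id fabric_conn_id and the placeholder values are assumptions, so substitute your own:

airflow connections add "fabric_conn_id" \
    --conn-type "generic" \
    --conn-login "<client_id>" \
    --conn-password "<refresh_token>" \
    --conn-extra '{"tenantId": "<tenant_id>", "clientSecret": "<client_secret>", "scopes": "<scopes>"}'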

Operators

FabricRunItemOperator

This operator encapsulates the core logic of the plugin. It triggers a run of a Fabric item and pushes the run details to XCom. As shown in the sample DAG below, it accepts parameters such as workspace_id, item_id, fabric_conn_id, job_type, wait_for_termination, and deferrable.

Sample DAG

Ready to give it a spin? Check out the sample DAG code below:

from __future__ import annotations

from airflow import DAG
from airflow.utils.dates import days_ago

from apache_airflow_microsoft_fabric_plugin.operators.fabric import FabricRunItemOperator

default_args = {
    "owner": "airflow",
    "start_date": days_ago(1),
}

with DAG(
    dag_id="fabric_items_dag",
    default_args=default_args,
    schedule_interval="@daily",
    catchup=False,
) as dag:

    # Trigger an on-demand run of a Fabric notebook item.
    run_notebook = FabricRunItemOperator(
        task_id="run_fabric_notebook",
        workspace_id="<workspace_id>",    # ID of the Fabric workspace
        item_id="<item_id>",              # ID of the item to run
        fabric_conn_id="fabric_conn_id",  # Airflow connection set up above
        job_type="RunNotebook",
        wait_for_termination=True,        # block until the run completes
        deferrable=True,                  # release the worker slot while waiting
    )

    run_notebook

Feel free to tweak and tailor this DAG to suit your needs!
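
For instance, if you need to act on the outcome downstream, you can read the run details the operator pushes to XCom. Below is a minimal sketch of a follow-up task added inside the with DAG block above; the XCom keys run_id and run_status are assumptions, so check the plugin source for the exact keys FabricRunItemOperator pushes:

from airflow.operators.python import PythonOperator

def inspect_run(ti):
    # "run_id" and "run_status" are assumed XCom key names (hypothetical);
    # verify them against what FabricRunItemOperator actually pushes.
    run_id = ti.xcom_pull(task_ids="run_fabric_notebook", key="run_id")
    run_status = ti.xcom_pull(task_ids="run_fabric_notebook", key="run_status")
    print(f"Fabric item run {run_id} finished with status: {run_status}")

inspect_run_task = PythonOperator(
    task_id="inspect_run",
    python_callable=inspect_run,
)

run_notebook >> inspect_run_task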

Contributing

We welcome any contributions!