The UniversalTransferOperator
simplifies how users transfer data from a source to a destination using Apache Airflow. It offers a consistent, provider-agnostic interface, so users do not need to work with provider-specific operators explicitly.
At the moment, it supports transferring data between file locations and databases (in both directions) and cross-database transfers.
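For a first look at the interface, here is a minimal sketch of a transfer task. The import paths, connection IDs, and bucket name are assumptions modeled on the project's example DAGs; refer to the example_dags folder for authoritative usage.

```python
# Minimal sketch: copy a CSV file from S3 into a Snowflake table with a single task.
# Import paths follow the project's example DAGs; the bucket, file name, and
# connection IDs are placeholders to replace with your own.
from datetime import datetime

from airflow import DAG
from universal_transfer_operator.datasets.file.base import File
from universal_transfer_operator.datasets.table import Table
from universal_transfer_operator.universal_transfer_operator import UniversalTransferOperator

with DAG(
    dag_id="example_s3_to_snowflake",
    schedule=None,
    start_date=datetime(2023, 1, 1),
    catchup=False,
):
    UniversalTransferOperator(
        task_id="transfer_s3_to_snowflake",
        source_dataset=File(path="s3://<your-bucket>/uto/example.csv", conn_id="aws_default"),
        destination_dataset=Table(name="UTO_EXAMPLE_TABLE", conn_id="snowflake_conn"),
    )
```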
This project is maintained by Astronomer.
The apache-airflow-provider-transfers package is available on PyPI. Use the standard Python installation tools.
To install a cloud-agnostic version of the apache-airflow-provider-transfers, run:
pip install apache-airflow-provider-transfers
You can also install dependencies for using the UniversalTransferOperator
with popular cloud providers:
pip install apache-airflow-provider-transfers[amazon,google,snowflake]
Users can get started quickly with either of the following two approaches:
UniversalTransferOperator using Astro
Create an Astro project:
mkdir <your-astro-project-name>
cd <your-astro-project-name>
Run the following Astro CLI command to initialize an Astro project in the directory:
astro dev init
This command generates the following files in the directory:
.
├── .env                        # Local environment variables
├── dags                        # Where your DAGs go
│   ├── example-dag-basic.py    # Example DAG that showcases a simple ETL data pipeline
│   └── example-dag-advanced.py # Example DAG that showcases more advanced Airflow features, such as the TaskFlow API
├── Dockerfile                  # For the Astro Runtime Docker image, environment variables, and overrides
├── include                     # For any other files you'd like to include
├── plugins                     # For any custom or community Airflow plugins
│   └── example-plugin.py
├── tests                       # For any DAG unit test files to be run with pytest
│   └── test_dag_integrity.py   # Test that checks for basic errors in your DAGs
├── airflow_settings.yaml       # For your Airflow connections, variables and pools (local only)
├── packages.txt                # For OS-level packages
└── requirements.txt            # For Python packages (add apache-airflow-provider-transfers here)
Add the following to requirements.txt:
apache-airflow-provider-transfers[all]
Copy the files example_transfer_and_return_files.py and example_snowflake_transfers.py into the dags directory of your Astro project. Alternatively, download them with:
curl -O https://raw.githubusercontent.com/astronomer/apache-airflow-provider-transfers/main/example_dags/example_transfer_and_return_files.py
curl -O https://raw.githubusercontent.com/astronomer/apache-airflow-provider-transfers/main/example_dags/example_snowflake_transfers.py
Run your project in a local Airflow environment:
astro dev start
Create Airflow connections for Amazon, Google, and Snowflake in the Airflow UI, following the documentation linked below:

| Cloud Provider | Connection name      | Documentation link       |
|----------------|----------------------|--------------------------|
| Amazon         | aws_default          | AWS connection           |
| Google         | google_cloud_default | Google Cloud connection  |
| Snowflake      | snowflake_conn       | Snowflake connection     |
UniversalTransferOperator using vanilla Airflow and Python
Install Airflow and set up a project by following this documentation.
Ensure that your Airflow environment is set up correctly by running the following commands:
export AIRFLOW_HOME=`pwd`
airflow db init
Add the following to requirements.txt:
apache-airflow-provider-transfers[all]
Copy the files example_transfer_and_return_files.py and example_snowflake_transfers.py into the dags directory of your Airflow project. Alternatively, download them with:
curl -O https://raw.githubusercontent.com/astronomer/apache-airflow-provider-transfers/main/example_dags/example_transfer_and_return_files.py
curl -O https://raw.githubusercontent.com/astronomer/apache-airflow-provider-transfers/main/example_dags/example_snowflake_transfers.py
Create the environment variables for the AWS bucket, Google Cloud bucket, and Snowflake settings used by the example transfers.
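For illustration, the example DAGs typically read such settings when the DAG file is parsed; a sketch is shown below. The variable names are placeholders, not necessarily those expected by the example DAGs.

```python
# Sketch: reading bucket settings from the environment inside a DAG file.
# The variable names below are placeholders; check the example DAGs for the
# exact names they expect.
import os

s3_bucket = os.getenv("S3_BUCKET", "s3://<your-aws-bucket>")
gcs_bucket = os.getenv("GCS_BUCKET", "gs://<your-gcs-bucket>")
```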
Run your project in a local Airflow environment.
After your project starts successfully, open the Airflow UI in your web browser at https://localhost:8080/. The DAGs from your dags directory appear in the Airflow UI.
Create Airflow connections for Snowflake, Google, and Amazon using the Airflow UI, as per the documentation linked below (a programmatic alternative is sketched after the table):
| Cloud Provider | Connection name      | Documentation link       |
|----------------|----------------------|--------------------------|
| Amazon         | aws_default          | AWS connection           |
| Google         | google_cloud_default | Google Cloud connection  |
| Snowflake      | snowflake_conn       | Snowflake connection     |
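If you prefer to script this step instead of using the UI, the sketch below registers the three connections through Airflow's metadata database. The connection IDs match the table above; the credentials and extras are placeholders, and this snippet is not part of the provider itself.

```python
# Sketch (not from the provider docs): create the connections programmatically
# instead of via the Airflow UI. All credentials below are placeholders.
from airflow.models import Connection
from airflow.settings import Session

connections = [
    Connection(conn_id="aws_default", conn_type="aws",
               login="<AWS_ACCESS_KEY_ID>", password="<AWS_SECRET_ACCESS_KEY>"),
    Connection(conn_id="google_cloud_default", conn_type="google_cloud_platform",
               extra='{"key_path": "/path/to/service_account.json"}'),
    Connection(conn_id="snowflake_conn", conn_type="snowflake",
               host="<account>.snowflakecomputing.com", login="<user>", password="<password>",
               schema="<schema>",
               extra='{"account": "<account>", "warehouse": "<warehouse>", "database": "<database>"}'),
]

session = Session()
for conn in connections:
    # Skip connections that already exist so the script stays idempotent.
    if not session.query(Connection).filter(Connection.conn_id == conn.conn_id).first():
        session.add(conn)
session.commit()
```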
Trigger the DAGs and validate the transfers.
Check out the example_dags folder for examples of how the UniversalTransferOperator
can be used.
With UniversalTransferOperator
, users can perform data transfers using the following transfer modes:
Non-native transfers rely on transferring the data through the Airflow worker node. Chunking is applied where possible. This method can be suitable for datasets smaller than 2GB, depending on the source and target. The performance of this method is highly dependent upon the worker's memory, disk, processor and network configuration.
Internally, the steps involved are:
- Retrieve the dataset in chunks from the source storage to the Airflow worker node.
- Send the data from the worker node on to the destination dataset.
The following is an example of a non-native transfer between Google Cloud Storage and SQLite:
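The sketch below is modeled on the project's example DAGs; treat the import paths, bucket path, and connection IDs as assumptions to adapt to your environment.

```python
# Sketch of a non-native GCS -> SQLite transfer, modeled on the project's example DAGs.
# The bucket path and connection IDs are placeholders.
from universal_transfer_operator.datasets.file.base import File
from universal_transfer_operator.datasets.table import Table
from universal_transfer_operator.universal_transfer_operator import UniversalTransferOperator

transfer_non_native_gs_to_sqlite = UniversalTransferOperator(
    task_id="transfer_non_native_gs_to_sqlite",
    source_dataset=File(path="gs://<your-bucket>/uto/example.csv", conn_id="google_cloud_default"),
    destination_dataset=Table(name="uto_gs_to_sqlite_table", conn_id="sqlite_default"),
)
```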
An alternative to the non-native transfer method is the native method. Native transfers rely on mechanisms and tools offered by the data source or data target providers. In the case of moving from object storage to a Snowflake database, for instance, a native transfer consists of using the built-in COPY INTO
command. When loading data from S3 to BigQuery, the Universal Transfer Operator uses the GCP Storage Transfer Service.
The benefit of native transfers is that they will likely perform better for larger datasets (over 2 GB) and do not rely on the Airflow worker node hardware configuration. With this approach, the Airflow worker nodes are used as orchestrators and do not perform the transfer. The speed depends exclusively on the service being used and the bandwidth between the source and destination.
Steps:
- The Airflow worker node issues the transfer request to the source or destination service (for example, Snowflake's COPY INTO or the GCP Storage Transfer Service).
- The service moves the data directly between source and destination while the worker node only orchestrates and monitors the transfer.
NOTE: The Native method implementation is in progress and will be available in future releases.
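Once native transfers are released, declaring one is expected to look much like the non-native case, with an explicit transfer mode. The sketch below assumes a TransferMode.NATIVE constant and a transfer_mode parameter, as used in the project's example DAGs:

```python
# Sketch of a native S3 -> Snowflake transfer (COPY INTO under the hood).
# TransferMode.NATIVE and the transfer_mode parameter are assumptions based on
# the project's example DAGs; connection IDs and paths are placeholders.
from universal_transfer_operator.constants import TransferMode
from universal_transfer_operator.datasets.file.base import File
from universal_transfer_operator.datasets.table import Table
from universal_transfer_operator.universal_transfer_operator import UniversalTransferOperator

transfer_native_s3_to_snowflake = UniversalTransferOperator(
    task_id="transfer_native_s3_to_snowflake",
    source_dataset=File(path="s3://<your-bucket>/uto/example.csv", conn_id="aws_default"),
    destination_dataset=Table(name="UTO_S3_TO_SNOWFLAKE_TABLE", conn_id="snowflake_conn"),
    transfer_mode=TransferMode.NATIVE,
)
```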
The UniversalTransferOperator
can also offer an interface to generic third-party services that transfer data, similar to Fivetran.
Here is an example of how to use Fivetran for transfers:
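The sketch below assumes a TransferMode.THIRDPARTY constant and a FiveTranOptions class with the import path shown; verify both against the example_dags folder before using them.

```python
# Sketch of a third-party (Fivetran) transfer. FiveTranOptions, its import path,
# TransferMode.THIRDPARTY, and the transfer_params argument are assumptions based
# on the project's example DAGs; the connector ID and paths are placeholders.
from universal_transfer_operator.constants import TransferMode
from universal_transfer_operator.datasets.file.base import File
from universal_transfer_operator.datasets.table import Table
from universal_transfer_operator.integrations.fivetran import FiveTranOptions
from universal_transfer_operator.universal_transfer_operator import UniversalTransferOperator

transfer_fivetran_with_connector_id = UniversalTransferOperator(
    task_id="transfer_fivetran_with_connector_id",
    source_dataset=File(path="s3://<your-bucket>/uto/", conn_id="aws_default"),
    destination_dataset=Table(name="FIVETRAN_TEST", conn_id="snowflake_conn"),
    transfer_mode=TransferMode.THIRDPARTY,
    transfer_params=FiveTranOptions(conn_id="fivetran_default", connector_id="<your-connector-id>"),
)
```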
Databases supported:
File stores supported:
The documentation is a work in progress -- we aim to follow the Diátaxis system.
Reference guide: Commands, modules, classes and methods
Getting Started Tutorial: A hands-on introduction to the Universal Transfer Operator
The Universal Transfer Operator follows semantic versioning for releases. Check the changelog for the latest changes.
See Managing Releases to learn more about our release philosophy and steps.
All contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome.
Read the Contribution Guideline for a detailed overview of how to contribute.
Contributors and maintainers should abide by the Contributor Code of Conduct.