Added a Scala notebook that migrates a Snowflake schema to a Synapse dedicated SQL pool.
ADF and Synapse pipelines require multiple steps to get data from a Snowflake table into a table in a Synapse dedicated SQL pool: you first copy the data to Blob storage, and then you load the file from Blob storage into the Synapse table.
This notebook takes a simpler approach: it reads the data from the Snowflake tables in a given schema into a Spark DataFrame and writes that DataFrame into a Synapse dedicated SQL pool table. It takes the schema name as a parameter, which makes it possible to move all tables from the given schema to the corresponding dedicated SQL pool schema. The stored procedure that creates the required database objects is provided in the comments.
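The read-and-write loop described above can be sketched roughly as follows. This is an illustrative outline, not the notebook's actual code: it assumes the Spark Snowflake connector (`spark-snowflake_2.12`) and the Synapse dedicated SQL pool connector (`synapsesql`), and the placeholder values (`<account>`, `<workspace>`, `<pool_db>`, etc.) would need to be filled in; verify option and method names against the connector versions pinned in the notebook.

```scala
// Illustrative sketch only -- connector option names and synapsesql signature
// should be checked against the Snowflake Spark connector and Synapse
// dedicated SQL pool connector documentation for the pinned versions.
import com.microsoft.spark.sqlanalytics.utils.Constants
import org.apache.spark.sql.SqlAnalyticsConnector._

val sfschema = "<snowflake schema>"        // schema name passed as a parameter

val sfOptions = Map(
  "sfURL"      -> "<account>.snowflakecomputing.com",
  "sfUser"     -> "<user>",
  "sfPassword" -> "<password>",            // fetch from Key Vault in practice
  "sfDatabase" -> "<database>",
  "sfSchema"   -> sfschema
)

// Enumerate the tables in the source schema ...
val tables = spark.read
  .format("snowflake")
  .options(sfOptions)
  .option("query",
    s"SELECT table_name FROM information_schema.tables WHERE table_schema = '$sfschema'")
  .load()
  .collect()
  .map(_.getString(0))

// ... then copy each table into the dedicated SQL pool.
tables.foreach { table =>
  val df = spark.read
    .format("snowflake")
    .options(sfOptions)
    .option("dbtable", table)
    .load()

  df.write
    .option(Constants.SERVER, "<workspace>.sql.azuresynapse.net")
    .synapsesql(s"<pool_db>.$sfschema.$table", Constants.INTERNAL)
}
```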
Hi @mlevin19, thanks for sending the PR. Here are some suggestions to improve it:
Cell 1: Please add the links to download spark-snowflake_2.12-2.9.0-spark_3.1.jar and snowflake-jdbc-3.13.6.jar.
Cell 1: Please add a link to the instructions for adding custom JARs to the cluster/session packages.
Cell 2: Please change the schema value to a parameter the user needs to specify, e.g. `val sfschema = ""`.
Cell 3: Please add a link to the instructions for configuring Azure Key Vault.
Cell 3: Please change the getSecret inputs to parameters the user needs to specify, e.g. `mssparkutils.credentials.getSecret("azure key vault name","secret name","linked service name")`.
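Taken together, the parameterization suggested above could look something like the cell below. This is a hedged sketch: the variable names are illustrative, the placeholder strings are values the user must fill in, and `mssparkutils.credentials.getSecret` is the Synapse utility named in the review (it requires a Synapse Spark session, so this cell is not runnable standalone).

```scala
// Sketch of a parameter cell per the review suggestions.
// All placeholder values below must be supplied by the user.
val sfschema = ""                                  // Snowflake schema to migrate

val keyVaultName  = "<azure key vault name>"       // Azure Key Vault name
val secretName    = "<secret name>"                // name of the stored secret
val linkedService = "<linked service name>"        // Synapse linked service

// Resolve the Snowflake password from Key Vault instead of hardcoding it.
val sfPassword = mssparkutils.credentials.getSecret(keyVaultName, secretName, linkedService)
```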