Closed Limp-monkey closed 5 years ago
@hvermis @zhangyd2015 would you mind taking a look at this issue?
This is because SqlServerSource has not been released in the PowerShell SDK yet.
As a workaround you can use New-AzResource instead of the ADF resource creation cmdlets, or the ARM template deployment cmdlets. Here's a sample for New-AzResource:

```powershell
$dataset = Get-AzResource -ResourceType "Microsoft.DataFactory/factories/datasets" -ResourceGroupName ans-data-factory -Name "ans-prod-v2/DelimitedText1" -ApiVersion "2018-06-01"
# Note: ConvertTo-Json defaults to -Depth 2, so deeply nested dataset properties may need a larger -Depth
$propertiesJson = ConvertTo-Json $dataset.Properties
New-AzResource -ResourceType "Microsoft.DataFactory/factories/datasets" -ResourceGroupName ans-data-factory -Name "ans-prod-v2/DelimitedText1" -ApiVersion "2018-06-01" -Properties (ConvertFrom-Json $propertiesJson)
```
Doc links:
https://docs.microsoft.com/en-us/powershell/module/az.resources/new-azresource?view=azps-2.2.0
https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-template-deploy
thanks, I think I got it to work for the pipelines using those datasets too. Any chance of a timeline for SqlServerSource?
Also, this brings me to another question: we're trying to use PowerShell to automate as much provisioning as possible, but it seems we always run into version issues with things like:

```powershell
Get-Content $pipeline_list | ForEach-Object {
    $pipeline = $_
    $pipeline_json = Get-Content "$path_json\pipeline\$pipeline.json" | ConvertFrom-Json
    New-AzResource -ResourceType "Microsoft.DataFactory/factories/pipelines" -ResourceGroupName $env:PIPELINE_TARGET_RESOURCE_GROUP_NAME -Name "$env:PIPELINE_TARGET_DATA_FACTORY_NAME/$pipeline" -ApiVersion "2018-06-01" -Properties $pipeline_json -IsFullObject -Force
}
```

Do you have any recommendation on which API is the safest to use? Is it New-AzResource?
We are working on releasing the new types; they will be in either the upcoming release or the next one. AzureRM no longer gets updates, only the Az modules do, so you should ask your DevOps team to provide VMs with the Az modules installed. There is always a chance that the PowerShell module will lag behind our backend in features, since we develop the backend first. The generic New-AzResource does not have that problem, so you could use it; however, our recommended way to do CI/CD is git integration: https://docs.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment#use-custom-parameters-with-the-resource-manager-template
thanks again! We've looked into that, but gave up because of this unsupported feature: "You can't publish individual resources, because data factory entities depend on each other." During development we sometimes need to schedule-trigger a pipeline, and the only way to do that is once the pipeline has been published. However, published doesn't mean it's ready to be deployed, so we can't have it exported via git integration. Hence we needed a way of publishing particular pipelines; the best approach we found was DevOps Pipelines reading from the git master branch. Anyhow, thank you for the support and tips!
Description
The issue starts with the fact that in Azure Data Factory it's possible to create a SQL Server Managed Instance dataset that doesn't point to any table. This can be done in the portal, but not with PowerShell scripts, which I believe is a bug, as reported here: https://github.com/Azure/azure-powershell/issues/9261
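To make the failing case concrete, here is a minimal sketch of what such a dataset definition looks like; the dataset and linked service names are hypothetical, and the property layout is assumed from the 2018-06-01 ADF JSON schema. The point is the empty `typeProperties`: no table is specified.

```json
{
  "name": "Dataset_A",
  "properties": {
    "type": "SqlServerTable",
    "linkedServiceName": {
      "referenceName": "SqlMiLinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {}
  }
}
```

The portal accepts this definition, while (per the linked bug) the PowerShell cmdlets reject it.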
However, when creating a pipeline in a new ADF from a pipeline that references such a dataset, the creation will succeed, but the pipeline will have invalid JSON, even if the dataset has been corrected in the target data factory. So suppose we have in ADF_1:
- Dataset_A: a SQL Server Managed Instance dataset without any table
- Pipeline_B: using a Lookup activity against Dataset_A
When running New-AzDataFactoryV2Dataset on Dataset_A to copy it to ADF_2 based on the JSON from ADF_1, the call fails as described in the bug above. However, if I create the dataset manually in ADF_2 exactly as it is in ADF_1 and then run New-AzDataFactoryV2Pipeline to move Pipeline_B to ADF_2, the call succeeds, but the Pipeline_B JSON in ADF_2 is corrupt. The reason is that the source type inside the Lookup changes from SqlServerSource to CopySource. If I manually change that value back to SqlServerSource, the pipeline validates.
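For illustration, the corrupted Lookup activity in the deployed pipeline would look roughly like the following sketch (activity and dataset names are hypothetical); the only broken part is the `source.type` value, which the deployment rewrote to the generic `CopySource`:

```json
{
  "name": "Lookup1",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "CopySource"
    },
    "dataset": {
      "referenceName": "Dataset_A",
      "type": "DatasetReference"
    }
  }
}
```

Editing `"type": "CopySource"` back to `"type": "SqlServerSource"` makes the pipeline validate again.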
Steps to reproduce
Environment data
Module versions
Debug output
Error output