Azure-Player / azure.datafactory.tools

Tools for deploying Data Factory (v2) in Microsoft Azure
https://azureplayer.net/adftools
MIT License

Salesforce connection - copy task - "Read behavior" setting parametrized causes "Unable to deserialize the response" #393

Closed: casualuser2 closed this issue 3 months ago

casualuser2 commented 6 months ago

When the cleanup parameter is set for the deployment and the deploy reaches the cleanup step, it executes: Get-AzDataFactoryV2Pipeline -ResourceGroupName "$ResourceGroupName" -DataFactoryName "$FactoryName" | ToArray. This fails when one of the pipelines contains a copy task based on a Salesforce connection in which the "Read behavior" setting is parametrized. I think it might be caused by an older Az.DataFactory module used in the code, because when I run Get-AzDataFactoryV2Pipeline from the Azure Cloud Shell it works.
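For reference, a minimal sketch of the call that fails during the cleanup step, together with a check of which Az.DataFactory versions the agent has installed (the resource group and factory names below are placeholders, not from the original environment):

  $ResourceGroupName = 'rg-adf-test'   # placeholder
  $FactoryName       = 'adf-test'      # placeholder
  Get-Module -ListAvailable Az.DataFactory | Select-Object Name, Version            # list the Az.DataFactory versions installed on the agent
  Get-AzDataFactoryV2Pipeline -ResourceGroupName "$ResourceGroupName" -DataFactoryName "$FactoryName"   # fails with SerializationException on the agent, works in Cloud Shell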

Debug trace:

STEP: Deleting objects not in source ... Azure Data Factory (instance) loaded. DataSets: 30 object(s) loaded. IntegrationRuntimes: 4 object(s) loaded. LinkedServices: 20 object(s) loaded.

[debug]Agent environment resources - Disk: D:\ Available 12429.00 MB out of 14333.00 MB, Memory: Used 1994.00 MB out of 7167.00 MB, CPU: Usage 2.67%

[debug]Leaving D:\a_tasks\PublishADFTask_1af843b5-35a0-411f-9a18-9eb7a59fb8b8\1.32.1423\PublishADF.ps1.

[debug]Caught exception from task script.

[debug]Error record:

[debug]Get-AzDataFactoryV2Pipeline : Unable to deserialize the response.

[debug]At D:\a_tasks\PublishADFTask_1af843b5-35a0-411f-9a18-9eb7a59fb8b8\1.32.1423\ps_modules\azure.datafactory.tools\public\Get-AdfFromService.ps1:46 char:22

[debug]+ ... Pipelines = Get-AzDataFactoryV2Pipeline -ResourceGroupName "$Resource ...

[debug]+ ~~~~~~~~~~~~~

[debug] + CategoryInfo : CloseError: (:) [Get-AzDataFactoryV2Pipeline], SerializationException

[debug] + FullyQualifiedErrorId : Microsoft.Azure.Commands.DataFactoryV2.GetAzureDataFactoryPipelineCommand

[debug]

[debug]Script stack trace:

[debug]at Get-AdfFromService, D:\a_tasks\PublishADFTask_1af843b5-35a0-411f-9a18-9eb7a59fb8b8\1.32.1423\ps_modules\azure.datafactory.tools\public\Get-AdfFromService.ps1: line 46

[debug]at Publish-AdfV2FromJson, D:\a_tasks\PublishADFTask_1af843b5-35a0-411f-9a18-9eb7a59fb8b8\1.32.1423\ps_modules\azure.datafactory.tools\public\Publish-AdfV2FromJson.ps1: line 274

[debug]at , D:\a_tasks\PublishADFTask_1af843b5-35a0-411f-9a18-9eb7a59fb8b8\1.32.1423\PublishADF.ps1: line 137

[debug]at , : line 1

[debug]at , : line 22

[debug]at , : line 18

[debug]at , : line 1

[debug]Exception:

[debug]Microsoft.Rest.SerializationException: Unable to deserialize the response. ---> Newtonsoft.Json.JsonReaderException: Error reading string. Unexpected token: StartObject. Path 'readBehavior', line 13506, position 33.

[debug] at Newtonsoft.Json.JsonReader.ReadAsString()

[debug] at Newtonsoft.Json.JsonReader.ReadForType(JsonContract contract, Boolean hasConverter)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent)

[debug] at Newtonsoft.Json.Linq.JToken.ToObject(Type objectType, JsonSerializer jsonSerializer)

[debug] at Microsoft.Rest.Serialization.PolymorphicDeserializeJsonConverter`1.ReadJson(JsonReader reader, Type objectType, Object existingValue, JsonSerializer serializer)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.DeserializeConvertable(JsonConverter converter, JsonReader reader, Type objectType, Object existingValue)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent)

[debug] at Newtonsoft.Json.Linq.JToken.ToObject(Type objectType, JsonSerializer jsonSerializer)

[debug] at Microsoft.Rest.Serialization.PolymorphicDeserializeJsonConverter`1.ReadJson(JsonReader reader, Type objectType, Object existingValue, JsonSerializer serializer)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.DeserializeConvertable(JsonConverter converter, JsonReader reader, Type objectType, Object existingValue)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.PopulateList(IList list, JsonReader reader, JsonArrayContract contract, JsonProperty containerProperty, String id)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateList(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, Object existingValue, String id)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateValueInternal(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent)

[debug] at Newtonsoft.Json.Linq.JToken.ToObject(Type objectType, JsonSerializer jsonSerializer)

[debug] at Microsoft.Rest.Serialization.TransformationJsonConverter.ReadJson(JsonReader reader, Type objectType, Object existingValue, JsonSerializer serializer)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.DeserializeConvertable(JsonConverter converter, JsonReader reader, Type objectType, Object existingValue)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.PopulateList(IList list, JsonReader reader, JsonArrayContract contract, JsonProperty containerProperty, String id)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateList(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, Object existingValue, String id)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateValueInternal(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.SetPropertyValue(JsonProperty property, JsonConverter propertyConverter, JsonContainerContract containerContract, JsonProperty containerProperty, JsonReader reader, Object target)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.PopulateObject(Object newObject, JsonReader reader, JsonObjectContract contract, JsonProperty member, String id)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateObject(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateValueInternal(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)

[debug] at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent)

[debug] at Newtonsoft.Json.JsonSerializer.DeserializeInternal(JsonReader reader, Type objectType)

[debug] at Microsoft.Rest.Serialization.SafeJsonConvert.DeserializeObject[T](String json, JsonSerializerSettings settings)

[debug] at Microsoft.Azure.Management.DataFactory.PipelinesOperations.d__5.MoveNext()

[debug] --- End of inner exception stack trace ---

[debug] at Microsoft.Azure.Management.DataFactory.PipelinesOperations.d__5.MoveNext()

[debug]--- End of stack trace from previous location where exception was thrown ---

[debug] at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()

[debug] at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)

[debug] at Microsoft.Azure.Management.DataFactory.PipelinesOperationsExtensions.d__1.MoveNext()

[debug]--- End of stack trace from previous location where exception was thrown ---

[debug] at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()

[debug] at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)

[debug] at Microsoft.Azure.Management.DataFactory.PipelinesOperationsExtensions.ListByFactory(IPipelinesOperations operations, String resourceGroupName, String factoryName)

[debug] at Microsoft.Azure.Commands.DataFactoryV2.DataFactoryClient.ListPipelines(AdfEntityFilterOptions filterOptions)

[debug] at Microsoft.Azure.Commands.DataFactoryV2.DataFactoryClient.FilterPSPipelines(AdfEntityFilterOptions filterOptions)

[debug] at Microsoft.Azure.Commands.DataFactoryV2.GetAzureDataFactoryPipelineCommand.ExecuteCmdlet()

[debug] at Microsoft.WindowsAzure.Commands.Utilities.Common.AzurePSCmdlet.ProcessRecord()

[error]Unable to deserialize the response.

[debug]Processed: ##vso[task.logissue type=error]Unable to deserialize the response.

[debug]Processed: ##vso[task.complete result=Failed]

NowinskiK commented 6 months ago

The problem lies with the Get-AzDataFactoryV2Pipeline cmdlet, which comes with the Az.DataFactory module. Can you run this command with the latest version of that module and check whether the issue still exists?
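A hedged sketch of the suggested check, assuming the module can be updated in the session that runs the deployment (the resource group and factory names are placeholders):

  Update-Module -Name Az.DataFactory -Force                   # pull the latest Az.DataFactory from the PowerShell Gallery
  Import-Module Az.DataFactory -Force                         # reload so the newest installed version is active
  Get-Module Az.DataFactory | Select-Object Name, Version     # confirm which version is loaded
  Get-AzDataFactoryV2Pipeline -ResourceGroupName 'rg-adf-test' -DataFactoryName 'adf-test'   # re-run the failing call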

casualuser2 commented 6 months ago

When I run Get-AzDataFactoryV2Pipeline from the Azure Cloud Shell (which I assume uses the latest Az.DataFactory), it works fine. Azure-Player probably needs to load the latest Az.DataFactory.

NowinskiK commented 6 months ago

I need to replicate the issue first - can you attach the file of the object?

casualuser2 commented 6 months ago

The ADF object? I'm afraid that's difficult given the professional context, but it's quick to set up a test case:
1) Create a linked service to a Salesforce environment.
2) Create a dataset using the Salesforce linked service.
3) Create a pipeline, add a pipeline parameter, add a copy task with the Salesforce dataset as source, let the "Read behavior" setting reference the pipeline parameter, and set the sink to anything that won't give validation errors (sample JSON below).
4) Save and publish.
5) Run your deploy process.

{ "name": "pipeline1", "properties": { "activities": [ { "name": "Copy data1", "type": "Copy", "dependsOn": [], "policy": { "timeout": "0.12:00:00", "retry": 0, "retryIntervalInSeconds": 30, "secureOutput": false, "secureInput": false }, "userProperties": [], "typeProperties": { "source": { "type": "SalesforceSource", "readBehavior": { "value": "@pipeline().parameters.readBehavior", "type": "Expression" } }, "sink": { "type": "ParquetSink", "storeSettings": { "type": "AzureBlobFSWriteSettings" }, "formatSettings": { "type": "ParquetWriteSettings" } }, "enableStaging": false, "translator": { "type": "TabularTranslator", "typeConversion": true, "typeConversionSettings": { "allowDataTruncation": true, "treatBooleanAsNumber": false } } }, "inputs": [ { "referenceName": "ds_salesforce", "type": "DatasetReference", "parameters": { "objectApiName": "\"User\"" } } ], "outputs": [ { "referenceName": "ds_parquet", "type": "DatasetReference", "parameters": { "compressionCodec": "snappy" } } ] } ], "parameters": { "readBehavior": { "type": "string" } }, "annotations": [] } }

chalabaj commented 4 months ago

I think this is related to https://github.com/Azure/azure-powershell/issues/24316#issuecomment-2023077171. The error appeared with Az.DataFactory 1.18.2. We resolved it by using 1.18.0. Most likely it has been fixed in a newer release, but I'm not sure.
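A hedged sketch of the workaround described above, pinning the module to the known-good version before the deploy runs (the version number is as reported in the comment; this is not an official fix):

  Install-Module -Name Az.DataFactory -RequiredVersion 1.18.0 -Force -Scope CurrentUser   # install the known-good version
  Import-Module Az.DataFactory -RequiredVersion 1.18.0                                    # make sure this version is the one loaded
  Get-Module Az.DataFactory | Select-Object Name, Version                                 # confirm 1.18.0 is active in the session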

NowinskiK commented 3 months ago

@chalabaj please test it with the latest version of Az.DataFactory and let me know here. Meanwhile, I'm closing the issue.