dariemcarlosdev / VPKFILEPROCESSORAPP

Blazor Server app that takes a CSV file as input, runs a background job to execute an SSIS package in Azure, and stores the output CSV file in Azure Blob Storage for download.
GNU Affero General Public License v3.0

Fix error executing Data Flow. The pipeline parameters are not being passed correctly to the pipeline. #16

Closed dariemcarlosdev closed 1 day ago

dariemcarlosdev commented 1 day ago

Azure Data Factory was treating the contents of the inputFileName and outputFileName parameters passed by Azure Function #3 as numeric values instead of strings, possibly due to an implicit conversion or misinterpretation. An error of type `'i' is an invalid end of a number. Expected a delimiter` was being thrown by the Azure Function while logging output. As a result, the ADF pipeline was receiving empty parameter values.
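
The message itself is a System.Text.Json parsing error, and it hints at the root cause: the file names begin with digits, so a JSON parser starts reading them as a number and then trips over the letter `i`. A quick standalone repro (illustrative only, not code from this project) produces the same kind of failure:

```csharp
using System;
using System.Text.Json;

// The raw file name starts with digits, so the JSON parser reads a number
// until it reaches the letter 'i' in "input", then throws the same kind
// of error seen in the Azure Function logs.
try
{
    JsonDocument.Parse("20241103205428inputinput.csv");
}
catch (JsonException ex)
{
    Console.WriteLine(ex.Message);
    // e.g. 'i' is an invalid end of a number. Expected a delimiter.
}
```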

To address this, I focused on ensuring that the values are interpreted strictly as strings in the Azure Data Factory pipeline. Here are targeted steps and adjustments that could help:

1. Explicitly Confirm Data Types in Azure Data Factory

Double-check the parameter definitions in your Data Factory pipeline: ensure that both inputFileName and outputFileName are explicitly declared as String data types in your pipeline activity or data flow. Any other type might cause Data Factory to attempt parsing the value as JSON or a number.
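
For reference, a minimal sketch of how the parameters block should look in the pipeline's JSON definition (open the pipeline in the ADF authoring UI and view its code; the pipeline name here is a placeholder):

```json
{
  "name": "YourPipelineName",
  "properties": {
    "parameters": {
      "inputFileName":  { "type": "String" },
      "outputFileName": { "type": "String" }
    }
  }
}
```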

2. Confirm BinaryData Usage

Since BinaryData can sometimes be handled in unexpected ways, you can attempt to simplify by using a plain string dictionary to see if that resolves the issue:

```csharp
using System;
using System.Collections.Generic;

var testParameters = new Dictionary<string, string>
{
    { "inputFileName",  "20241103205428inputinput.csv" },
    { "outputFileName", "0241103205428inputinputout.csv" }
};

// Convert each plain string value to BinaryData, serializing to a JSON
// object only if it proves necessary.
var binaryDataParams = new Dictionary<string, BinaryData>();
foreach (var kvp in testParameters)
{
    binaryDataParams[kvp.Key] = BinaryData.FromString(kvp.Value);
}

var runResponse = await dataFactoryPipelineResource.CreateRunAsync(binaryDataParams);
```

This approach should help ensure that the values remain strings throughout the process.

3. Workaround: Wrapping Strings as JSON Literal

If CreateRunAsync still misinterprets the values, you could try adding an extra layer of JSON encoding:

```csharp
using System.Collections.Generic;
using System.Text.Json;

var jsonEncodedParameters = new Dictionary<string, BinaryData>
{
    { "inputFileName",  BinaryData.FromString(JsonSerializer.Serialize("20241103205428inputinput.csv")) },
    { "outputFileName", BinaryData.FromString(JsonSerializer.Serialize("0241103205428inputinputout.csv")) }
};
```

Wrapping each parameter value in JsonSerializer.Serialize ensures that each string parameter is passed as a quoted JSON string literal rather than raw text that might be parsed as a number.
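
To see what that extra layer actually produces, here is a quick standalone check (illustrative only):

```csharp
using System;
using System.Text.Json;

// JsonSerializer.Serialize on a plain string adds the surrounding quotes,
// turning the raw value into a valid JSON string literal.
string encoded = JsonSerializer.Serialize("20241103205428inputinput.csv");
Console.WriteLine(encoded); // prints the file name wrapped in quotes

// Unlike the raw value, the encoded form parses cleanly as JSON.
using JsonDocument doc = JsonDocument.Parse(encoded);
Console.WriteLine(doc.RootElement.GetString()); // back to the original file name
```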

4. Run a Minimal Test with Hardcoded Values

Run the pipeline with hardcoded string literals directly in the Azure portal to confirm that the parameters work without any code involved. If they work in the portal but not via code, that indicates a serialization issue with BinaryData.
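
For the code side of that comparison, a minimal sketch (assuming the same Azure.ResourceManager.DataFactory SDK as above, reusing dataFactoryPipelineResource from step 2, and pre-quoting the values as in step 3; the run ID lets you inspect the run in the Monitor view) might look like:

```csharp
using System;
using System.Collections.Generic;

// Hardcoded literals mirror the portal test; if this run also arrives with
// empty parameters while the portal run succeeds, the BinaryData
// serialization path is the culprit.
var hardcoded = new Dictionary<string, BinaryData>
{
    { "inputFileName",  BinaryData.FromString("\"20241103205428inputinput.csv\"") },
    { "outputFileName", BinaryData.FromString("\"0241103205428inputinputout.csv\"") }
};

var response = await dataFactoryPipelineResource.CreateRunAsync(hardcoded);
Console.WriteLine($"Test run ID: {response.Value.RunId}");
```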