Azure-Player / azure.datafactory.tools

Tools for deploying Data Factory (v2) in Microsoft Azure
https://azureplayer.net/adftools
MIT License

ADF pipelines cannot be executed due to the large size of the global parameter adftools_deployment_state #377

Closed MartaUch closed 8 months ago

MartaUch commented 8 months ago

Hello,

Today we enabled the "Incremental Deployment" feature and were unable to run our pipelines due to the large size of the adftools_deployment_state global parameter. When we removed that parameter, all the pipelines ran successfully.

The value of this global parameter is not used anywhere. Are there any workarounds for such big ADF instances? Are there any limitations we should keep in mind?

It's confusing, because I was able to recreate and publish such a global parameter manually without any issue. I have already created a ticket with Microsoft, but wanted to raise it here as well, since someone else may run into a similar issue.
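(Editor's note, not from the original report: for anyone hitting the same wall, a minimal sketch of how to gauge how large the parameter has grown, by reading it back through the ADF management REST API. It assumes Az.Accounts is installed, Connect-AzAccount has been run, and the placeholder IDs are filled in.)

```powershell
# Sketch: measure the serialized size of the adftools_deployment_state
# global parameter via the Data Factory management REST API.
$sub = '<subscription-id>'; $rg = '<resource-group>'; $df = '<factory-name>'
$path = "/subscriptions/$sub/resourceGroups/$rg/providers/Microsoft.DataFactory" +
        "/factories/$df/globalParameters/default?api-version=2018-06-01"
$gp   = (Invoke-AzRestMethod -Path $path -Method GET).Content | ConvertFrom-Json
$json = $gp.properties.adftools_deployment_state.value | ConvertTo-Json -Depth 50 -Compress
'adftools_deployment_state is {0:N0} bytes' -f [Text.Encoding]::UTF8.GetByteCount($json)
```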

Thank you for your support!

MartaUch commented 8 months ago

Hi Kamil, just for your information: I've implemented a temporary workaround for this issue until your solution allows keeping the value of this global parameter somewhere else.

I've created a small table in a SQL database and save/update the value of this global parameter there. When we run CD, we first read this value and deploy it to ADF, then we run the Publish task that comes from your solution, and afterwards we update the value in the SQL table and remove the parameter from ADF. It's not a perfect solution, but it works for now. I tried a storage account first, but we allow access only from particular IP addresses and my pipeline runs from a different IP every time, so the fastest solution for me was a SQL table, where credentials already work.
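(Editor's note: a minimal PowerShell sketch of that flow, not Marta's actual script. The table dbo.AdfDeploymentState and its columns are hypothetical, the IncrementalDeployment option name should be verified against the module version in use, and it assumes Connect-AzAccount has been run with the SqlServer and azure.datafactory.tools modules installed.)

```powershell
$sub = '<subscription-id>'; $rg = '<resource-group>'; $df = '<factory-name>'
$cs  = '<sql-connection-string>'
$gpPath = "/subscriptions/$sub/resourceGroups/$rg/providers/Microsoft.DataFactory" +
          "/factories/$df/globalParameters/default?api-version=2018-06-01"

# 1) Read the saved deployment state from SQL and restore it as a global parameter.
$state = (Invoke-Sqlcmd -ConnectionString $cs -Query `
    "SELECT StateJson FROM dbo.AdfDeploymentState WHERE FactoryName = '$df'").StateJson
$gp = (Invoke-AzRestMethod -Path $gpPath -Method GET).Content | ConvertFrom-Json
$gp.properties | Add-Member -NotePropertyName 'adftools_deployment_state' -NotePropertyValue @{
    type = 'Object'; value = ($state | ConvertFrom-Json) } -Force
Invoke-AzRestMethod -Path $gpPath -Method PUT `
    -Payload (@{ properties = $gp.properties } | ConvertTo-Json -Depth 50) | Out-Null

# 2) Publish with Incremental Deployment enabled (option name per the module docs).
$opt = New-AdfPublishOption
$opt.IncrementalDeployment = $true
Publish-AdfV2FromJson -RootFolder '<adf-json-root>' -ResourceGroupName $rg `
    -DataFactoryName $df -Location '<region>' -Option $opt

# 3) Read the updated state back, persist it to SQL, then remove it from the
#    factory so pipeline runs are not affected by the oversized parameter.
$gp = (Invoke-AzRestMethod -Path $gpPath -Method GET).Content | ConvertFrom-Json
$new = ($gp.properties.adftools_deployment_state.value |
        ConvertTo-Json -Depth 50 -Compress).Replace("'", "''")  # naive escaping, sketch only
Invoke-Sqlcmd -ConnectionString $cs -Query `
    "UPDATE dbo.AdfDeploymentState SET StateJson = '$new' WHERE FactoryName = '$df'"
$gp.properties.PSObject.Properties.Remove('adftools_deployment_state')
Invoke-AzRestMethod -Path $gpPath -Method PUT `
    -Payload (@{ properties = $gp.properties } | ConvertTo-Json -Depth 50) | Out-Null
```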

Looking forward to your solution gaining the ability to save this global parameter in a different location :)

NowinskiK commented 8 months ago

Thanks Marta. Closing this as a duplicate of #374.