d365collaborative / d365fo.tools

Tools used for Dynamics 365 Finance and Operations

Select-PSFObject in Azure AutomationAccount runbook #743

Closed jakobschroeber closed 1 year ago

jakobschroeber commented 1 year ago

Hey,

I have a problem getting the return objects from Get-D365LcsEnvironmentMetadata and Invoke-D365LcsDatabaseRefresh, both of which I call in a .ps1 script that uses their return objects for further logic. The script basically looks up the EnvironmentId for an input parameter EnvironmentName, refreshes the database of a dedicated tier-2 UAT environment with data from that EnvironmentId, makes an OData action request (d365fo.integrations) to update connection parameters in the refreshed database, and finally triggers an export of that refreshed database.
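For reference, here is a simplified sketch of that flow (not my actual script; error handling and the OData call are stripped out, and the parameter names are written from memory of the d365fo.tools docs, so please double-check them with Get-Help):

```powershell
# Simplified sketch of the flow described above - parameter names from memory,
# verify against Get-Help before use.
param (
    [string] $EnvironmentName
)

$uatEnvironmentId = '<GUID of the dedicated tier-2 UAT environment>'   # placeholder

# 1. Resolve the source EnvironmentId from the environment name
$sourceEnv = Get-D365LcsEnvironmentMetadata -EnvironmentName $EnvironmentName

# 2. Refresh the tier-2 UAT database from that source environment
$refresh = Invoke-D365LcsDatabaseRefresh -SourceEnvironmentId $sourceEnv.EnvironmentId `
    -TargetEnvironmentId $uatEnvironmentId

# 3. (after completion) call the custom OData action via d365fo.integrations
#    to update the connection parameters in the refreshed database

# 4. Trigger the export (bacpac) of the refreshed database
Invoke-D365LcsDatabaseExport -SourceEnvironmentId $uatEnvironmentId -BackupName "uat-refresh"
```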

PS runtime version is 5.1
d365fo.tools module version is 0.6.79
d365fo.integrations module version is 0.4.38
PSFramework module version is 1.8.289
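To rule out version drift inside the Automation sandbox, the runbook can print the module versions it actually loads; this is just plain Get-Module, nothing specific to these modules:

```powershell
# List the module versions visible to the current (runbook) session
Get-Module -ListAvailable d365fo.tools, d365fo.integrations, PSFramework |
    Select-Object Name, Version |
    Sort-Object Name
```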

When I ran the script in my cloud-hosted environment, it worked fine. But when I put the code of my script inside an Azure Automation Account runbook of type 'PowerShell', the return objects of Get-D365LcsEnvironmentMetadata and Invoke-D365LcsDatabaseRefresh are null, and the job reports the following error messages:

Select-PSFObject : Cannot validate argument on parameter 'Property'. Object reference not set to an instance of an object.
At line:16184 char:33
+             $res | Select-PSFObject * -TypeName "D365FO.TOOLS.LCS.Environ ...
+                                 ~
    + CategoryInfo          : InvalidData: (:) [Select-PSFObject], ParameterBindingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationError,PSFramework.Commands.SelectPSFObjectCommand

(for Get-D365LcsEnvironmentMetadata)

Select-PSFObject : Cannot validate argument on parameter 'Property'. Object reference not set to an instance of an object.
At line:24906 char:36
+ ... t-PSFObject *, "OperationActivityId as ActivityId", "EnvironmentId fr ...
+                    ~~~~~~~~~~~~~
    + CategoryInfo          : InvalidData: (:) [Select-PSFObject], ParameterBindingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationError,PSFramework.Commands.SelectPSFObjectCommand

(for Invoke-D365LcsDatabaseRefresh)

Those messages are unfortunately shortened. I'm rather sure, though, that they correspond to lines near the end of each of the two functions:

$res | Select-PSFObject * -TypeName "D365FO.TOOLS.LCS.Environment.Metadata"

(for Get-D365LcsEnvironmentMetadata)

$refreshJob | Select-PSFObject *, "OperationActivityId as ActivityId", "EnvironmentId from temp as EnvironmentId", "OperationStatus from temp as OperationStatus", "ProjectId from temp as ProjectId" -TypeName "D365FO.TOOLS.LCS.Database.Operation.Status"

(for Invoke-D365LcsDatabaseRefresh)
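As I understand the Select-PSFObject syntax, those string expressions are PSFramework's extended property notation: "OperationActivityId as ActivityId" copies a property under a new name, and the "... from temp" expressions appear to pull values from a separate variable called temp. A tiny dummy example of the rename part (my own illustration, not code from the module):

```powershell
# Illustration of the "X as Y" rename syntax in Select-PSFObject (PSFramework)
$dummy = [PSCustomObject]@{ OperationActivityId = 42; OperationStatus = 'InProgress' }
$dummy | Select-PSFObject OperationStatus, "OperationActivityId as ActivityId"
```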

The verbose output looks exactly like it does in the cloud-hosted environment, where the script runs fine:

Timestamp            FunctionName                   Line  Level   TargetObject         Message                                                                                                
---------            ------------                   ----  -----   ------------         -------                                                                                                
7/12/2023 4:07:11 PM Get-LcsEnvironmentMetadata     2816  Verbose                      Invoke LCS request.                                                                                    
7/12/2023 4:07:14 PM Get-LcsEnvironmentMetadata     5615  Verbose 00:00:03.4062369     Total time spent inside the function was 00:00:03.4062369                                              
7/12/2023 4:07:14 PM Get-D365LcsEnvironmentMetadata 5615  Verbose 00:00:03.4531532     Total time spent inside the function was 00:00:03.4531532                                              

(for Get-D365LcsEnvironmentMetadata)

On the other hand, when I try Get-D365LcsDatabaseBackups it returns its objects just fine - but that function apparently uses the built-in Select-Object cmdlet instead of Select-PSFObject.
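Just to illustrate the difference: the failing projection could in principle be expressed with built-in cmdlets only, roughly like this (my own sketch, not code from d365fo.tools, and obviously not a fix for the module itself):

```powershell
# Rough built-in equivalent of:
#   $res | Select-PSFObject * -TypeName "D365FO.TOOLS.LCS.Environment.Metadata"
$metadata = $res | Select-Object -Property *
foreach ($item in $metadata) {
    $item.PSObject.TypeNames.Insert(0, 'D365FO.TOOLS.LCS.Environment.Metadata')
}
$metadata
```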

I wanted to use the chance to politely ask whether anybody has encountered this before and can maybe suggest a solution. Thank you very much in advance.

Splaxi commented 1 year ago

Thank you for the very detailed introduction to the issue that you are facing.

I can see that there was a new release of PSFramework, but the changelog doesn't contain anything that would indicate improvements for your issue. I know that the maintainer behind PSFramework is very keen on making things work, so we could ask him whether he has ever encountered the same error, or we could put together a simple sample script that makes it possible for him to reproduce it.

With all that - would you be able to share on a high level what you need to solve? Just a high-level list of steps, and what tools you would utilize.

I have a gut feeling that we could solve all of this directly from inside Azure DevOps pipelines - as I'm doing for almost every customer project.

I can trigger a database refresh from PROD -> Tier2, wait for it to complete (less than 6 hours), then execute an OData endpoint (action), and then start the bacpac export. Just so you know what is already possible from inside Azure DevOps and the pipelines šŸ˜‰
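The "wait for it to complete" part is just a polling loop around the LCS status cmdlet in d365fo.tools. A rough sketch - cmdlet and parameter names from memory, so verify them with Get-Help before relying on this:

```powershell
# Rough polling sketch - verify cmdlet/parameter names and status values with Get-Help.
# $prodEnvId / $tier2EnvId are the environment GUIDs from LCS.
$refresh = Invoke-D365LcsDatabaseRefresh -SourceEnvironmentId $prodEnvId `
    -TargetEnvironmentId $tier2EnvId

do {
    Start-Sleep -Seconds 300   # poll every 5 minutes
    $status = Get-D365LcsDatabaseOperationStatus -OperationActivityId $refresh.ActivityId `
        -EnvironmentId $tier2EnvId
} until ($status.OperationStatus -notin 'NotStarted', 'InProgress')   # status strings are indicative

# ...then call the OData action and kick off the bacpac export
```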

jakobschroeber commented 1 year ago

Thanks for your quick response. The need is to automate the whole chain of steps in order to provide and share a ready-to-use copy of the production (or another tier-2 UAT) database for development purposes.

I have already achieved this using only d365fo.tools and d365fo.integrations functions, SQL Server PowerShell, and a custom OData action method for the update of the connection parameters that I developed myself. The outcome is two scripts:

  1. Script 1 drives the LCS side: resolve the source environment, trigger the database refresh of the tier-2 UAT environment, call the OData action and start the bacpac export.
  2. Script 2 runs where sqlpackage and SQL Server are available: import the exported bacpac and turn it into a .bak file.

I was now looking for a way to trigger script 1, and was therefore looking into Azure Automation Accounts. So my problem is really about putting this into practice and making it work in an everyday scenario: a new .bak file should become available at least once per week, if not every day.

My considerations:

  1. Since the refresh process potentially slows down performance in the source environment of the db refresh, it should be triggered at night.
  2. The refresh should not conflict with the nightly deployments, so it would have to be triggered after deployments are finished.

Thank you for pointing me towards DevOps Pipelines. If I could just run script 1 at the end of a release pipeline and not have these issues with PSFramework, that would solve my problem already.

Btw, if you have a recommendation on a good way to trigger script 2, which uses sqlpackage and SQL Server and will therefore most likely run in a cloud-hosted environment, please let me know. Execution of the whole script might take up to 18 hours due to the time-consuming bacpac import. I was thinking of running script 2 remotely using run-command with the timeoutInSeconds parameter (even though the author recommends VM applications for a repeating scenario), but I'm not sure whether that is well-suited for such a long execution time. Maybe you have some experience with that scenario too.

Thanks again

Splaxi commented 1 year ago

Thanks for sharing more details.

I would recommend that you move towards Azure DevOps and the ecosystem that is there.

Script1 - It will 100% work directly from a Microsoft-hosted (Azure DevOps) agent / pipeline

OR

You will need to run a self-hosted agent anyway, to obtain the capabilities in terms of SQL Server, so you can import the bacpac file and data-clean the database (if needed) before creating a bak (classic SQL backup) and uploading it to Azure Blob Storage.

Script2 - it will 100% work if you deploy a Tier1 (CHE) via LCS and afterwards install the Azure DevOps agent. From there you can have the pipeline execute on that machine and gain unlimited task execution time - as you're running it yourself.
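To give an idea of what that Tier1-based script usually boils down to, here is an illustrative outline using d365fo.tools plus the SqlServer module - a sketch, not a drop-in script, so treat names, paths and parameters as placeholders and check them against Get-Help:

```powershell
# Illustrative outline only - adjust servers, paths and credentials to your setup.

# 1. Import the bacpac exported from LCS into the local SQL Server on the Tier1 box
Import-D365Bacpac -ImportModeTier1 -BacpacFile "C:\Temp\uat-refresh.bacpac" `
    -NewDatabaseName "AxDB_Source"

# 2. Optional data cleanup via plain T-SQL (scrub secrets, clear logs, etc.)
Invoke-Sqlcmd -ServerInstance "localhost" -Database "AxDB_Source" `
    -Query "EXEC dbo.usp_DataCleanup"   # placeholder for your own cleanup logic

# 3. Create a classic .bak from the imported database (SqlServer module)
Backup-SqlDatabase -ServerInstance "localhost" -Database "AxDB_Source" `
    -BackupFile "C:\Temp\AxDB_Source.bak"

# 4. Upload the .bak to Azure Blob Storage with d365fo.tools
Invoke-D365AzureStorageUpload -AccountId "mystorageaccount" -SAS "<sas-token>" `
    -Container "backups" -Filepath "C:\Temp\AxDB_Source.bak"
```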

I know from personal experience that the above just works. We are doing the same for several customers as we speak.

So - ditch the Azure Automation Account / Runbook, embrace Azure DevOps and never look back šŸ˜‰

FH-Inway commented 1 year ago

:+1: for the approach with Azure DevOps Pipelines, we are doing the same. I would also recommend using self-hosted agents due to the long runtime. Depending on your needs, both scripts can be handled by the same pipeline (maybe split into two stages) or by separate pipelines. The pipeline schedule functionality lets you easily schedule a run during the night (I recommend using the schedule settings in the Azure DevOps UI rather than the cron setting in the pipeline's YAML file; you are much more flexible that way).

Here is a view of the stages of a pipeline in one of my projects that exports the bacpac from a T2-5 environment, converts it to a bak and uploads it to an Azure storage account.

[screenshot: pipeline stages]

jakobschroeber commented 1 year ago

Thanks a lot @Splaxi and @FH-Inway. I will continue with Azure Pipelines and install an additional self-hosted agent for the environment where the sqlpackage and SQL Server operations are supposed to take place.