d365collaborative / d365fo.tools

Tools used for Dynamics 365 Finance and Operations
MIT License

Question: restoring DB to dev box #481

Closed smholvoet closed 3 years ago

smholvoet commented 3 years ago

What would be the correct sequence of commands to achieve the following?

I'm trying to automate the steps below on a freshly deployed development machine, meaning I haven't run Set-D365LcsApiConfig yet.

However, I ran into a couple of issues.

Configure LCS API

Running the command below:

Get-D365LcsApiToken -Username "..." -Password "..." | Set-D365LcsApiConfig

⚠️ Error: Cannot bind argument to parameter 'ClientId' because it is an empty string.

This probably means I need an Azure app registration with a clientId and a secret? Does the app registration require special API permissions, or do I need to run additional commands?

Switch to the EU LCS API; I guess this might clear out the clientId and bearer token?

Set-D365LcsApiConfig -LcsApiUri "https://lcsapi.eu.lcs.dynamics.com/"

DB restore

List all database backup files:

Get-D365LcsDatabaseBackups

Get a specific DB backup. Is there an option to override the Latest parameter and point to a specific backup?

Get-D365LcsDatabaseBackups -Latest

Install SqlPackage:

Invoke-D365InstallSqlPackage

Restore the .bak... Is there a d365fo.tools command which covers this?

Sync & start

Get-D365Database
Invoke-D365DbSync -ShowOriginalProgress
Start-D365Environment

Thanks.

Splaxi commented 3 years ago

Hi @smholvoet

It is a bit unclear what you are after, but let me see if I can answer your questions anyway.

Prerequisites for using the tools and working against the LCS API:

As you correctly state, you will need to run Invoke-D365InstallSqlPackage and Invoke-D365InstallAzCopy to help you with all of these things.

For the Registered Application and general high quality information about all of these things, I would recommend that you read the entire blog post from Áriste: https://ariste.info/en/msdyn365-azure-devops-alm but specifically the section about the registered application: https://ariste.info/en/msdyn365-azure-devops-alm/#AAD_app_creation

Note: You will have to set the registered application to Allow public client flows, as shown in the picture below.

image

With all that in place, we can start talking about what cmdlets can help you do some of the heavy lifting.

As you learned, Get-D365LcsApiToken requires you to fill in a ClientId (the ApplicationId from the Registered Application). So to get a valid token for the LCS API, you need 3 things: Username, Password and ClientId.

When that works, you will be using Set-D365LcsApiConfig to save the details, so you don't need to fill them out every single time you want to invoke the cmdlets. You can call it with a single parameter or multiple parameters at a time; it will only override the parameters that you specify. So Set-D365LcsApiConfig -Username "safsdf" will not override any detail other than the username.
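
For example, a first-time setup could look like this (a minimal sketch; the ClientId and ProjectId values are placeholders for your own registered application and LCS project):

# Request a token with the three required values, then persist everything for later calls
Get-D365LcsApiToken -ClientId "00000000-0000-0000-0000-000000000000" `
    -Username "serviceaccount@contoso.com" -Password "..." `
    -LcsApiUri "https://lcsapi.eu.lcs.dynamics.com" |
    Set-D365LcsApiConfig -ProjectId 123456789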

Back to the flow of operations you want to do. On a high level, I believe this is what you want:

  1. Start data refresh from PROD to Tier2
    • This is done with Invoke-D365LcsDatabaseRefresh
  2. Export bacpac from Tier2
    • This is done with Invoke-D365LcsDatabaseExport
  3. Download the latest bacpac from LCS
    • This is done with a combination of Get-D365LcsDatabaseBackups and Invoke-D365AzCopyTransfer.
    • We actually have a full blown guide on this here on the wiki: https://github.com/d365collaborative/d365fo.tools/wiki/How-To-Download-Latest-Bacpac-From-Lcs
    • If you don't want the latest bacpac file, just skip the -Latest parameter and store the output from Get-D365LcsDatabaseBackups in a variable. Then you can filter the output and grab the single bacpac you really need (see the sketch below).
  4. Import the bacpac file on a Tier1
    • This is done with Import-D365Bacpac
  5. Run DB sync on the newly imported database, from the bacpac file.
    • This is done with Invoke-D365DbSync. It should be possible to run the dbsync against any database, as long as you specify the database name for the Invoke-D365DbSync cmdlet.
  6. Stop all D365FO related services
    • This is done with Stop-D365Environment
  7. Switch the current AXDB out with the newly imported database
    • This is done with Switch-D365ActiveDatabase. It will name the "old" database "AXDB_original" if you don't provide another name.
  8. Start all D365FO related services
    • This is done with Start-D365Environment

One could argue that you just want to switch out the old AXDB as soon as possible; then you would reorder steps 5-8 into 6 (Stop), 7 (Switch), 5 (DbSync), 8 (Start) instead.

This is the end-to-end flow that needs to happen if you want to get the latest bacpac file from PROD onto a Tier1.
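
Pieced together, a rough and untested sketch of the whole flow could look like the following; the environment ids, file paths, database names and the FileName filter property are placeholders / assumptions, and the LCS API details are assumed to be saved already via Set-D365LcsApiConfig:

# 1 + 2: refresh the Tier2 database from PROD, then export a bacpac from Tier2
Invoke-D365LcsDatabaseRefresh -SourceEnvironmentId "GUID-PROD" -TargetEnvironmentId "GUID-TIER2"
Invoke-D365LcsDatabaseExport -SourceEnvironmentId "GUID-TIER2" -BackupName "ReadyForTier1"

# 3: download the newest bacpac from the LCS Asset Library
$backup = Get-D365LcsDatabaseBackups -Latest
# ...or drop -Latest and filter for a specific one (property name assumed):
# $backup = Get-D365LcsDatabaseBackups | Where-Object FileName -like "*ReadyForTier1*"
Invoke-D365AzCopyTransfer -SourceUri $backup.FileLocation -DestinationUri "C:\Temp\AxDB.bacpac"

# 4: import the bacpac on the Tier1 box
Import-D365Bacpac -ImportModeTier1 -BacpacFile "C:\Temp\AxDB.bacpac" -NewDatabaseName "ImportedDatabase"

# 6 + 7 + 5 + 8, using the reordering mentioned above
Stop-D365Environment -All
Switch-D365ActiveDatabase -NewDatabaseName "ImportedDatabase"
Invoke-D365DbSync -ShowOriginalProgress
Start-D365Environment -All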

Let me know if you have further questions or if I missed something from your requirements.

The next step would be to do a classic SQL backup, which gives you a bak file that you can then share with all the other Tier1 environments for a faster restore / import process. That would be solved with the dbatools powershell module, in combination with our Invoke-D365AzCopyTransfer cmdlet.
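
For example (a rough sketch; the destination SAS URI is a placeholder for your own storage account):

# dbatools takes a compressed full backup of the freshly imported database...
Backup-DbaDatabase -SqlInstance localhost -Database "AxDB" -FilePath "C:\Temp\AxDB.bak" -Type Full -CompressBackup

# ...and AzCopy pushes it to a storage account the other Tier1 boxes can reach
Invoke-D365AzCopyTransfer -SourceUri "C:\Temp\AxDB.bak" -DestinationUri "https://mystorage.blob.core.windows.net/backups/AxDB.bak?sv=..."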

smholvoet commented 3 years ago

Thx for the detailed reply @Splaxi 👍

User account without MFA

Looks like this is the culprit as it results in:

[11:36:57][Invoke-Authorization] Something went wrong while working against Azure Active Directory (AAD) | The remote server returned an error: (400) Bad Request.
WARNING: [11:36:57][Invoke-Authorization] Stopping because of errors

I'll have to look into creating a separate service account without 2FA and add said account to our LCS project.

Regarding the steps which you've mentioned above, I'm trying to do something similar, but on our cloud-hosted environments (not T1) in case they have just been redeployed and still have the default AxDB. So I'm guessing I'll need:

Splaxi commented 3 years ago

When you say:

Regarding the steps which you've mentioned above, I'm trying to do something similar, but on our cloud-hosted environments (not T1) in case they have just been redeployed and still have the default AxDB. So I'm guessing I'll need:

Are you talking about Tier2?

The only thing that the LCS API will allow you to download is bacpac files. If you really have classic SQL Server bak files, from a SQL Server backup, you have other options.

Run your own Azure Storage Account, where you put all your bak files. Use Get-D365AzureStorageFile and Invoke-D365AzCopyTransfer to download the file. From there you need the magic from dbatools (I can provide that).
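
A possible sketch of that route (account, container and SAS values are placeholders; Invoke-D365AzureStorageDownload is used here as a simple alternative for the actual download):

# Save and activate the storage account details once
Set-D365AzureStorageConfig -Name "Backups" -AccountId "mystorageaccount" -Container "backups" -SAS "sv=..."
Set-D365ActiveAzureStorageConfig -Name "Backups"

# List the bak files, then pull down the newest one
Get-D365AzureStorageFile -Name "*.bak"
Invoke-D365AzureStorageDownload -Path "C:\Temp" -GetLatest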

smholvoet commented 3 years ago

I'm not talking about T1/T2 environments but cloud-hosted dev boxes:

image

We have regular backups of our test environment (.bak) which we upload to the LCS library. Uploading them to a dedicated storage account might indeed make things easier when it comes to automating this process.

Could you share a sample of the relevant dbatools commands to restore a .bak which then replaces the current AxDB? This could be really helpful.

Splaxi commented 3 years ago

What threw me off was your statement about Tier1; a Tier1 can be 3 different things. For a cloud-hosted dev box, something like this should do it:

# Stop all D365FO services so the restore can take an exclusive lock on the database
Stop-D365Environment -All

# Restore the bak on top of the existing AxDB, then align the logical file names with the database name
Restore-DbaDatabase -SqlInstance localhost -DatabaseName "AxDB" -ReplaceDbNameInFile -WithReplace -Path "C:\Temp\AXDB.bak"
Rename-DbaDatabase -SqlInstance localhost -Database "AxDB" -LogicalName "<DBN>_<FT>"

kgmadsen commented 3 years ago

Maybe something like this, feel free to change it:

# Load assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null

$Script:ScriptDir = Split-Path -Parent $PSCommandPath
$SqlFileName = Join-Path $Script:ScriptDir '_TSqlDbScript.sql'

Set-Location $Script:ScriptDir

# Placeholders: point these at your bak file location, database name and SQL instance
$Script:FilePathRestore = '\YOURPATHTOTHEBAKFILE'
$Script:BakFile = Get-ChildItem -Path $Script:FilePathRestore -Filter '*.bak' | Sort-Object LastAccessTime -Descending | Select-Object -First 1
$Script:SqlDatabaseName = 'YOURSQLDBNAME'
$Script:SqlServerInstance = 'localhost'

Write-Output "Creating SQL Script..."
try {
    Remove-Item -Path $SqlFileName -Force -Verbose -ErrorAction SilentlyContinue -WarningAction SilentlyContinue
    New-Item $SqlFileName -ItemType file -Force -Verbose

    # Restore the backup: build a RESTORE statement for the newest bak file
    Add-Content $SqlFileName ("RESTORE DATABASE [{0}] FROM DISK = N'{1}' WITH REPLACE, RECOVERY" -f $Script:SqlDatabaseName, $Script:BakFile.FullName)

    Add-Content $SqlFileName ("GO")
    Add-Content $SqlFileName ("USE [{0}]" -f $Script:SqlDatabaseName)
    Add-Content $SqlFileName ("ALTER DATABASE [{0}] SET RECOVERY SIMPLE" -f $Script:SqlDatabaseName)
    Add-Content $SqlFileName ("GO")
    Add-Content $SqlFileName ("IF NOT EXISTS (SELECT name FROM [sys].[database_principals] WHERE [type] = 'U' AND name = N'YOURSERVICEACCOUNT')")
    Add-Content $SqlFileName ("BEGIN")
    Add-Content $SqlFileName ("    CREATE USER [YOURSERVICEACCOUNT] FOR LOGIN [YOURSERVICEACCOUNT] WITH DEFAULT_SCHEMA=[dbo]")
    Add-Content $SqlFileName ("    ALTER AUTHORIZATION ON SCHEMA::[db_owner] TO [YOURSERVICEACCOUNT]")
    Add-Content $SqlFileName ("    ALTER ROLE [db_owner] ADD MEMBER [YOURSERVICEACCOUNT]")
    Add-Content $SqlFileName ("END")

    try {
        Invoke-Sqlcmd -ServerInstance $Script:SqlServerInstance -InputFile $SqlFileName -QueryTimeout 0 -SuppressProviderContextWarning -Verbose -ErrorAction Stop
    }
    catch {
        Write-Output "Sql error ! ->"
        $ErrorMessage = $_.Exception.Message
        $FailedItem = $_.Exception.ItemName
        Write-Warning ("{0} - {1}" -f $FailedItem, $ErrorMessage)
        exit
    }
}
catch {
    Write-Warning $_.Exception.Message
}

Splaxi commented 3 years ago

Thx Kenneth.

I believe that, for the untrained, it will feel simpler to use dbatools.io for the restore, if you don't need to go crazy with rollback and stuff like that.

But good to know 👍

smholvoet commented 3 years ago

Not quite there yet...

Get-D365LcsDatabaseBackups -Latest does find the .bak file I need, but doesn't let me use the blob URL via (Get-D365LcsDatabaseBackups -Latest).FileLocation. I manually had to grab the URL via LCS (like in one of your Wiki examples), which is kinda silly but obviously a restriction of LCS.

Once I finally had the .bak I ran:

Invoke-D365AzCopyTransfer -SourceUri $backup -DestinationUri "c:\temp\downloads\AxDB.bak"

However, restoring the DB failed because Stop-D365Environment -All didn't seem to have stopped all services 🤔

Server             DisplayName                                                  Status     StartType  Name
------             -----------                                                  ------     ---------  ----
xxx                Microsoft Dynamics 365 Unified Operations: Batch Manageme... Running    Automatic  DynamicsAxBatch
xxx                Microsoft Dynamics 365 Unified Operations: Data Import Ex... Running    Automatic  Microsoft.Dynamics....
xxx                Management Reporter 2012 Process Service                     Stopped    Manual     MR2012ProcessService
xxx                World Wide Web Publishing Service                            Running    Automatic  w3svc
WARNING: [17:34:08][Restore-DbaDatabase] Failure | System.Data.SqlClient.SqlError: Exclusive access could not be obtained because the database is in use.

Splaxi commented 3 years ago

Did you remember to run the powershell session with RunAsAdministrator?

Normally it just works, but there are times where the batch service stalls. There are two ways to fix that: hardcore kill the service via powershell / classic kill, or stop / start the sql service along with the batch service.

The batch service might be configured to retry / restart on failure, so you might want to disable it prior to running your scripts.
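
Something along these lines (a minimal sketch; "DynamicsAxBatch" is the service name from the output above):

# Keep the batch service from restarting on its own while the restore runs
Set-Service -Name "DynamicsAxBatch" -StartupType Disabled
Stop-Service -Name "DynamicsAxBatch" -Force

# ... restore the database ...

# Re-enable and start it again afterwards
Set-Service -Name "DynamicsAxBatch" -StartupType Automatic
Start-Service -Name "DynamicsAxBatch"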

smholvoet commented 3 years ago

Did you remember to run the powershell session with RunAsAdministrator?

bingo

Ok now I feel silly, thanks 😄.

image

I'll report back once I've managed to get through my entire "flow". I feel this might be useful for other people, and this scenario seems related to #385

Splaxi commented 3 years ago

When it works, you should totally post it on your blog.

Personally I would recommend that you go for the solution where you run the storage account yourself.

Splaxi commented 3 years ago

We would love to have a PR for the wiki, so we can share it with the community. But if you feel like it, just put it on your blog and we'll point to it from our wiki.

kgmadsen commented 3 years ago

Thx Kenneth.

I believe that, for the untrained, it will feel simpler to use dbatools.io for the restore, if you don't need to go crazy with rollback and stuff like that.

But good to know 👍

Roll back..??? I only go forward.. ;)

(Just kidding, simple is good)

Splaxi commented 3 years ago

@smholvoet Any news to share?

smholvoet commented 3 years ago

@Splaxi Still testing stuff on my dev machine, but getting close to what I want to achieve. Thinking of throwing in a couple of d365fo.integrations cmdlets as well 👀.

Aiming to publish a blog post in one of the coming weeks ✍️

Splaxi commented 3 years ago

Let me know if you need a hand with anything, I'm more than happy to help you out with the blog post.

valerymoskalenko commented 3 years ago

Maybe something like this, feel free to change it:

Please find just another example of my scripts to restore BAK files here: https://github.com/valerymoskalenko/D365FFO-PowerShell-scripts/blob/master/Invoke-D365FFOAxDBRestoreFromBAK.ps1

Splaxi commented 3 years ago

Hi @valerymoskalenko

I'm impressed, that seems like a very solid set of scripts. One suggestion: if you want to be sure the batch service does not automatically restart, you will need to disable it while restoring the database.

If you have issues stopping the batch service, just restart the SQL Server database engine, or execute a single-user mode query with ROLLBACK IMMEDIATE.
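
A minimal sketch of that last option (instance and database names assumed):

# Kick every session out of AxDB, including a stuck batch service, before restoring
Invoke-Sqlcmd -ServerInstance localhost -Query "ALTER DATABASE [AxDB] SET SINGLE_USER WITH ROLLBACK IMMEDIATE"

# ... restore the database ...

# Open the database up for normal connections again
Invoke-Sqlcmd -ServerInstance localhost -Query "ALTER DATABASE [AxDB] SET MULTI_USER"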