smholvoet closed this issue 3 years ago
Hi @smholvoet
It is a bit unclear what you are after, but let me see if I can answer your questions anyway.
Prerequisites for using the tools and working against the LCS API:
As you correctly state, you will need to run Invoke-D365InstallSqlPackage and Invoke-D365InstallAzCopy to help you with all of these things.
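As a minimal sketch, the two install cmdlets are run once per machine, with no mandatory parameters:

```powershell
# Download SqlPackage.exe (used for bacpac import/export)
Invoke-D365InstallSqlPackage

# Download AzCopy.exe (used for blob transfers)
Invoke-D365InstallAzCopy
```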
For the Registered Application and general high quality information about all of these things, I would recommend that you read the entire blog post from Áriste: https://ariste.info/en/msdyn365-azure-devops-alm but specifically the section about the registered application: https://ariste.info/en/msdyn365-azure-devops-alm/#AAD_app_creation
Note: You will have to set the registered application to "Allow public client flows", as shown in the picture below.
With all that in place, we can start talking about what cmdlets can help you do some of the heavy lifting.
As you learned, Get-D365LcsApiToken requires you to fill in a ClientId (the ApplicationId from the Registered Application). So for you to be able to get a valid token for the LCS API, you need 3 things: Username, Password and ClientId.
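A hedged sketch of the token call — all of the values below are placeholders, and the EU endpoint is an assumption; use the endpoint matching your LCS region:

```powershell
# Request an OAuth token for the LCS API (placeholder values throughout)
$token = Get-D365LcsApiToken -ClientId "00000000-0000-0000-0000-000000000000" `
    -Username "serviceaccount@contoso.com" `
    -Password "SuperSecret" `
    -LcsApiUri "https://lcsapi.eu.lcs.dynamics.com"
```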
When that works, you will be using Set-D365LcsApiConfig to save the details, so you don't need to fill them out every single time you want to invoke the cmdlets. You can call it with a single parameter or multiple parameters at a time; it will only override the parameters that you specify. So Set-D365LcsApiConfig -Username "safsdf" will not override any detail other than the username.
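One possible way to persist everything in a single pipeline — the ProjectId below is a placeholder for your LCS project number:

```powershell
# Fetch a token and store all details for later cmdlet calls (placeholder values)
Get-D365LcsApiToken -ClientId "00000000-0000-0000-0000-000000000000" `
    -Username "serviceaccount@contoso.com" -Password "SuperSecret" `
    -LcsApiUri "https://lcsapi.eu.lcs.dynamics.com" |
    Set-D365LcsApiConfig -ProjectId 123456789
```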
Back to the flow of operations you want to do. On a high level, I believe this is what you want:

1. Invoke-D365LcsDatabaseRefresh
2. Invoke-D365LcsDatabaseExport
3. Get-D365LcsDatabaseBackups and Invoke-D365AzCopyTransfer. Save the output from Get-D365LcsDatabaseBackups in a variable. Then you can filter the output and get that single bacpac you really need.
4. Import-D365Bacpac
5. Invoke-D365DbSync. It should be possible to run the dbsync against any database, as long as you specify the database name for the Invoke-D365DbSync cmdlet.
6. Stop-D365Environment
7. Switch-D365ActiveDatabase. It will name the "old" database "AXDB_original" if you don't provide another name.
8. Start-D365Environment
One could argue that you just want to switch out the old AXDB as soon as possible, then you would reorder the steps 5,6,7,8 into 6 (Stop) ,7 (Switch),5 (DBSync),8 (Start) instead.
This is the end-2-end flow that needs to happen, if you want to get the latest bacpac file from PROD to a Tier1.
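A hedged sketch of those eight steps in one script — the environment ids, database names and paths are all assumptions, and parameter names may vary slightly between module versions:

```powershell
# 1-2: refresh a sandbox from PROD, then export it as a bacpac (placeholder ids)
Invoke-D365LcsDatabaseRefresh -SourceEnvironmentId $prodEnvId -TargetEnvironmentId $sandboxEnvId
Invoke-D365LcsDatabaseExport -SourceEnvironmentId $sandboxEnvId -BackupName "ReadyForTier1"

# 3: find the newest backup in the LCS asset library and download it
$backup = Get-D365LcsDatabaseBackups -Latest
Invoke-D365AzCopyTransfer -SourceUri $backup.FileLocation -DestinationUri "C:\Temp\AxDB.bacpac"

# 4: import the bacpac side by side with the running AXDB
Import-D365Bacpac -ImportModeTier1 -BacpacFile "C:\Temp\AxDB.bacpac" -NewDatabaseName "AxDB_New"

# 5-8: sync, stop, switch, start
Invoke-D365DbSync -DatabaseName "AxDB_New"
Stop-D365Environment
Switch-D365ActiveDatabase -SourceDatabaseName "AxDB_New"
Start-D365Environment
```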
Let me know if you have further questions or if I missed something from your requirements.
Next step would be to do a classic SQL backup, which gives you a bak file, which you can then share with all the other Tier1 environments and get a faster restore / import process. That would be solved with the dbatools powershell module, in combination with our Invoke-D365AzCopyTransfer cmdlet.
Thx for the detailed reply @Splaxi 👍
User account without MFA
Looks like this is the culprit as it results in:
[11:36:57][Invoke-Authorization] Something went wrong while working against Azure Active Directory (AAD) | The remote server returned an error: (400) Bad Request.
WARNING: [11:36:57][Invoke-Authorization] Stopping because of errors
I'll have to look into creating a separate service account without 2FA and add said account to our LCS project.
Regarding the steps which you've mentioned above, I'm trying to do something similar, but on our cloud-hosted environments (not T1) in case they have just been redeployed and still have the default AxDB. So I'm guessing I'll need:
Get-D365LcsDatabaseBackups | Invoke-D365AzCopyTransfer
<dbatools magic>
Stop-D365Environment
Switch-D365ActiveDatabase
Invoke-D365DbSync
Start-D365Environment
When you say:
Regarding the steps which you've mentioned above, I'm trying to do something similar, but on our cloud-hosted environments (not T1) in case they have just been redeployed and still have the default AxDB. So I'm guessing I'll need:
Are you talking about Tier2?
The only thing that the LCS API will allow you to download is bacpac files. If you really have classic SQL Server bak files, from a SQL Server backup, you have other options.
Run your own Azure Storage Account, where you put all your bak files. Use Get-D365AzureStorageFile and Invoke-D365AzCopyTransfer to download the file. From there you need the magic from dbatools (I can provide that).
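A possible sketch, using Invoke-D365AzureStorageDownload as a shortcut for the Get/Transfer pair — the account name, SAS token and container are placeholders you must replace:

```powershell
# Download the newest bak file from your own storage account (placeholder values)
Invoke-D365AzureStorageDownload -AccountId "mystorageaccount" `
    -SAS "sv=..." -Container "backups" `
    -Path "C:\Temp" -Latest
```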
I'm not talking about T1/T2 environments but cloud-hosted dev boxes:
We have regular backups of our test environment (.bak
) which we upload to the LCS library. Uploading them to a dedicated storage account might indeed make things easier when it comes to automating this process.
Could you share a sample of the relevant dbatools commands to restore a .bak
which then replaces the current AxDB
? This could be really helpful.
What threw me off was your statement Tier1.
A Tier1 can be 3 different things.
Stop-D365Environment -All
Restore-DbaDatabase -SqlInstance localhost -DatabaseName "AxDB" -ReplaceDbNameInFile -WithReplace -Path "C:\Temp\AXDB.bak"
Rename-DbaDatabase -SqlInstance localhost -Database "AxDB" -LogicalName "<DBN>_<FT>"
Maybe something like this, feel free to change it:
```powershell
# Load assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null

$Script:ScriptDir = Split-Path -Parent $PSCommandPath
$SqlFileName = Join-Path $Script:ScriptDir '_TSqlDbScript.sql'
Set-Location $Script:ScriptDir

# SQL Server instance to run the generated script against
$Script:SqlServerInstance = 'localhost'

$Script:FilePathRestore = '\YOURPATHTOTHEBAKFILE'
$Script:BakFile = Get-ChildItem -Path $Script:FilePathRestore -Filter '*.bak' |
    Sort-Object LastAccessTime -Descending |
    Select-Object -First 1
$Script:SqlDatabaseName = 'YOURSQLDBNAME'

Write-Output "Creating SQL Script..."
try {
    Remove-Item -Path $SqlFileName -Force -Verbose -ErrorAction SilentlyContinue -WarningAction SilentlyContinue
    New-Item $SqlFileName -ItemType File -Force -Verbose

    # Restore the backup taken on the source server (assumed intent; add MOVE clauses if file paths differ)
    Add-Content $SqlFileName ("RESTORE DATABASE [{0}] FROM DISK = N'{1}' WITH REPLACE, RECOVERY" -f $Script:SqlDatabaseName, $Script:BakFile.FullName)
    Add-Content $SqlFileName ("GO")
    Add-Content $SqlFileName ("USE [{0}]" -f $Script:SqlDatabaseName)
    Add-Content $SqlFileName ("ALTER DATABASE [{0}] SET RECOVERY SIMPLE" -f $Script:SqlDatabaseName)
    Add-Content $SqlFileName ("GO")
    Add-Content $SqlFileName ("IF NOT EXISTS (SELECT name FROM [sys].[database_principals] WHERE [type] = 'U' AND name = N'YOURSERVICEACCOUNT')")
    Add-Content $SqlFileName ("BEGIN")
    Add-Content $SqlFileName ("    CREATE USER [YOURSERVICEACCOUNT] FOR LOGIN [YOURSERVICEACCOUNT] WITH DEFAULT_SCHEMA=[dbo]")
    Add-Content $SqlFileName ("    ALTER AUTHORIZATION ON SCHEMA::[db_owner] TO [YOURSERVICEACCOUNT]")
    Add-Content $SqlFileName ("    ALTER ROLE [db_owner] ADD MEMBER [YOURSERVICEACCOUNT]")
    Add-Content $SqlFileName ("END")

    try {
        Invoke-Sqlcmd -ServerInstance $Script:SqlServerInstance -InputFile $SqlFileName -QueryTimeout 0 -SuppressProviderContextWarning -Verbose -ErrorAction Stop
    }
    catch {
        Write-Output "Sql error ! ->"
        $ErrorMessage = $_.Exception.Message
        $FailedItem = $_.Exception.ItemName
        Write-Warning ("{0} - {1}" -f $FailedItem, $ErrorMessage)
        exit
    }
}
catch {
    Write-Warning $_.Exception.Message
    exit
}
```
Thx Kenneth.
I believe for the untrained, it will feel simpler to use the dbatools.io for the restore, if you don't need to go crazy with roll back and stuff like that.
But good to know 👍
Not quite there yet...
Get-D365LcsDatabaseBackups -Latest does find the .bak file I need, but doesn't let me use the blob URL via (Get-D365LcsDatabaseBackups -Latest).FileLocation. I manually had to grab the URL via LCS (like in one of your Wiki examples), which is kinda silly but obviously a restriction of LCS.
Once I finally had the .bak I ran:
Invoke-D365AzCopyTransfer -SourceUri $backup -DestinationUri "c:\temp\downloads\AxDB.bak"
However, restoring the DB failed because Stop-D365Environment -All didn't seem to have stopped all services 🤔
Server DisplayName Status StartType Name
------ ----------- ------ --------- ----
xxx Microsoft Dynamics 365 Unified Operations: Batch Manageme... Running Automatic DynamicsAxBatch
xxx Microsoft Dynamics 365 Unified Operations: Data Import Ex... Running Automatic Microsoft.Dynamics....
xxx Management Reporter 2012 Process Service Stopped Manual MR2012ProcessService
xxx World Wide Web Publishing Service Running Automatic w3svc
WARNING: [17:34:08][Restore-DbaDatabase] Failure | System.Data.SqlClient.SqlError: Exclusive access could not be obtained because the database is in use.
Did you remember to run the powershell session with RunAsAdministrator?
Normally it just works, but there are times where the batch service stalls. There are two ways to fix that: hard-kill the service via PowerShell / classic kill, or stop/start the SQL service along with the batch service.
The batch might be configured to retry / restart on failure, so you might want to disable it prior running your scripts.
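One way to sketch that, using the DynamicsAxBatch service name from the listing above (standard Windows service cmdlets, run elevated):

```powershell
# Stop the batch service and keep it from being restarted automatically
Set-Service -Name "DynamicsAxBatch" -StartupType Disabled
Stop-Service -Name "DynamicsAxBatch" -Force

# ... restore the database here ...

# Re-enable and start it afterwards
Set-Service -Name "DynamicsAxBatch" -StartupType Automatic
Start-Service -Name "DynamicsAxBatch"
```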
Did you remember to run the powershell session with RunAsAdministrator?
Ok now I feel silly, thanks 😄.
I'll report back once I've managed to get through my entire "flow". I feel this might be useful for other people, and this scenario seems related to #385
When it works, you should totally post it on your blog.
Personally I would recommend that you go for the solution where you run the storage account yourself.
We would love to have a PR for the wiki, so we can share it with the community. But if you feel like it, just put it on your blog and we'll point to it from our wiki.
Thx Kenneth.
I believe for the untrained, it will feel simpler to use the dbatools.io for the restore, if you don't need to go crazy with roll back and stuff like that.
But good to know 👍
Roll back..??? I only go forward.. ;)
(Just kidding, simple is good)
@smholvoet Any news to share?
@Splaxi Still testing stuff on my dev machine, but getting close to what I want to achieve. Thinking of throwing in a couple of d365fo.integrations cmdlets as well 👀.
Aiming to publish a blog post in one of the coming weeks ✍️
Let me know if you need a hand with anything, I'm more than happy to help you out with the blog post.
Maybe something like this, feel free to change it:
Please find just another example of my scripts to restore BAK files here: https://github.com/valerymoskalenko/D365FFO-PowerShell-scripts/blob/master/Invoke-D365FFOAxDBRestoreFromBAK.ps1
Hi @valerymoskalenko
I'm impressed, that seems like a very solid set of scripts. One suggestion, if you want to be sure the batch service does not automatically restart, you will need to disable it while restoring the database.
If you have issues with stopping the batch service, just restart the SQL Server database engine. Or execute a SET SINGLE_USER WITH ROLLBACK IMMEDIATE query.
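A hedged sketch of that query via dbatools' Invoke-DbaQuery — the instance and database name (AxDB) are assumptions:

```powershell
# Kick out all other connections with an immediate rollback, then restore multi-user access
Invoke-DbaQuery -SqlInstance localhost -Query "ALTER DATABASE [AxDB] SET SINGLE_USER WITH ROLLBACK IMMEDIATE"
Invoke-DbaQuery -SqlInstance localhost -Query "ALTER DATABASE [AxDB] SET MULTI_USER"
```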
What would be the correct sequence of commands in order to achieve the following?
I'm trying to automate the steps above on a freshly deployed development machine, meaning I didn't run Set-D365LcsApiConfig yet. However, I ran into a couple of issues.
Configure LCS API
Running the command below:
⚠️ Error:
Cannot bind argument to parameter 'ClientId' because it is an empty string.
Which means I probably need an Azure app registration with a clientId and a secret? Does the app registration require special API permissions, or do I need to run additional commands?
Switch to EU LCS API, this might clear out the clientId and Bearer I guess?

DB restore
List all database backup files:
Get a specific DB backup, is there an option to overrule the Latest parameter and point to a specific backup?

Install SqlPackage:
Restore the .bak... Is there a d365fo.tools command which covers this?
Sync & start
Thanks.