d365collaborative / d365fo.tools

Tools used for Dynamics 365 Finance and Operations
MIT License
246 stars 101 forks

[Invoke-ClearAzureSpecificObjects] Something went wrong while clearing the Azure specific objects in the database. #358

Closed kai74 closed 4 years ago

kai74 commented 4 years ago

Hi, I am trying to create a bacpac using "New-D365Bacpac":

New-D365Bacpac -ExportModeTier2-xxxx -SqlUser xxxxx -BacpacFile D:\DB_backup\update2devdb.bacpac -DatabaseName xxxxxx_axdb_2b18fc3xxxxxx -DatabaseServer xxxxxx.database.windows.net -EnableException -NewDatabaseName update2devdb -ShowOriginalProgress -Verbose

I am ending up with this message all the time:

[13:36:39][Invoke-ClearAzureSpecificObjects] Something went wrong while clearing the Azure specific objects in the database. | Exception calling "ExecuteNonQuery" with "0" argument(s): "The database principal owns a schema in the database, and cannot be dropped."
WARNING: [13:36:39][Invoke-ClearAzureSpecificObjects] Stopping because of errors. | Something went wrong while clearing the Azure specific objects in the database.

We are testing with d365fo.tools version 0.5.64. D365FO is application version 10.0.8, Platform Update 32 (7.0.5493.16714).

Best regards Kai

Splaxi commented 4 years ago

Could you try the -ExportOnly parameter?

https://github.com/d365collaborative/d365fo.tools/blob/development/docs/New-D365Bacpac.md#-exportonly

Do you know if your environment is a Self-service or the normal kind where you can RDP into your Tier2 AOS?

And may I ask why you are using the tools and not LCS to create a bacpac file from your Tier2? 😁

kai74 commented 4 years ago

We can't use the LCS backup; the database is too big (600 GB). This is the normal kind, where you can RDP into it. I have now tested on an older version (10.0.3 (10.0.107.20019), Platform Update 27 (7.0.5286.41363)) and it works. I am not done importing the database, but the export worked perfectly (it took 12+ hours). I have tested with -ExportOnly, but it didn't work; it must be something with the version.

Hopefully the LCS function will work in the future. But when I need to update multiple dev and test environments, this functionality will really be a great help.

Splaxi commented 4 years ago

If you want - I'm happy to help you over a team session, so we can share the screen.

But I'm guessing that there is a new schema in the database, which we need to implement support for.

We are using this script to clear up the database: https://github.com/d365collaborative/d365fo.tools/blob/development/d365fo.tools/internal/sql/clear-azurebacpacdatabase.sql

And at lines 56-71 we handle schemas:

https://github.com/d365collaborative/d365fo.tools/blob/29f341e64b1b39c8077934f2ea74bf7e0d79a1d0/d365fo.tools/internal/sql/clear-azurebacpacdatabase.sql#L56-L71

Could you query the database and see what schemas are in it?

SELECT *
FROM SYS.SCHEMAS
kai74 commented 4 years ago

Hi Motz

Thanks for your reply

This is the result from the query

name                  schema_id  principal_id
dbo                   1          1
guest                 2          2
INFORMATION_SCHEMA    3          3
sys                   4          4
ReportingIntegration  5          1
ax                    6          1
crt                   7          1
ext                   8          1
BatchScheduling       9          8
db_owner              16384      16384
db_accessadmin        16385      16385
db_securityadmin      16386      16386
db_ddladmin           16387      16387
db_backupoperator     16389      16389
db_datareader         16390      16390
db_datawriter         16391      16391
db_denydatareader     16392      16392
db_denydatawriter     16393      16393

You are more than welcome to contact me on Teams.

Best regards Kai


Splaxi commented 4 years ago

Hi again,

Could you please try and run this:

SELECT  s.Name, u.*
FROM    sys.schemas s
        INNER JOIN sys.sysusers u
            ON u.uid = s.principal_id

This should show us who owns the schema and make it easier for us to find which one we need to add to the clear script.
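For background, that error is standard SQL Server behavior: DROP USER fails while the principal still owns a schema. A generic sketch of the remedy (the schema and user names here are placeholders, not the actual fix for the module):

```sql
-- Find schemas owned by the principal that refuses to be dropped
-- ('someuser' is a placeholder name):
SELECT name FROM sys.schemas WHERE principal_id = USER_ID('someuser');

-- Transfer ownership, after which the principal can be dropped:
ALTER AUTHORIZATION ON SCHEMA::someschema TO dbo;
DROP USER [someuser];
```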

kai74 commented 4 years ago

Hi :)

Name                  uid    owner (key columns from sys.sysusers)
dbo                   1      dbo
guest                 2      guest
INFORMATION_SCHEMA    3      INFORMATION_SCHEMA
sys                   4      sys
ReportingIntegration  1      dbo
ax                    1      dbo
crt                   1      dbo
ext                   1      dbo
BatchScheduling       8      axdbadmin
db_owner              16384  db_owner
db_accessadmin        16385  db_accessadmin
db_securityadmin      16386  db_securityadmin
db_ddladmin           16387  db_ddladmin
db_backupoperator     16389  db_backupoperator
db_datareader         16390  db_datareader
db_datawriter         16391  db_datawriter
db_denydatareader     16392  db_denydatareader
db_denydatawriter     16393  db_denydatawriter


Splaxi commented 4 years ago

Okay - to get you on with your work, please try this:

You need to locate the path where the d365fo.tools module is installed on the machine.

explorer.exe (Split-Path $(Get-Module d365fo.tools -ListAvailable | Select-Object -First 1).Path -Parent)


From here you need to locate the clear-azurebacpacdatabase.sql file.

It is located in the sub folder: internal\sql


Important: You need to start notepad with "Run As Administrator" and then open the clear-azurebacpacdatabase.sql file from the location you just found.

On line 62, you need to add the "BatchScheduling" to the list of schemas. So from this:

WHERE SYS.SCHEMAS.NAME IN ('BACKUP','SHADOW')

https://github.com/d365collaborative/d365fo.tools/blob/29f341e64b1b39c8077934f2ea74bf7e0d79a1d0/d365fo.tools/internal/sql/clear-azurebacpacdatabase.sql#L62

to this:

WHERE SYS.SCHEMAS.NAME IN ('BACKUP','SHADOW','BatchScheduling')

Save clear-azurebacpacdatabase.sql and exit all your PowerShell consoles / sessions.

And now retry the export once more.

Splaxi commented 4 years ago

For reference, the default path to the mentioned file is: C:\Program Files\WindowsPowerShell\Modules\d365fo.tools\0.5.64\internal\sql\clear-azurebacpacdatabase.sql

That should work if you installed the tools directly from PowerShell using Install-Module d365fo.tools.

kai74 commented 4 years ago

Great. I will test once more. And get back to you ASAP.


Splaxi commented 4 years ago

Just wanted to check on you and see if you faced new issues?

kai74 commented 4 years ago

Hi

It is running the export now, but it is not done. It really takes a lot of time. But it is past the old error. I will get back to you when it is done.

Thanks a lot for all your help

Kai


kai74 commented 4 years ago

It took one day and 59 minutes to export. But the export is done. Will try to import it now.

Splaxi commented 4 years ago

Fingers crossed 🤞

Splaxi commented 4 years ago

Just wanted to check on you and see if you were able to import the bacpac file?

But I'm going to merge the change into the module this week, based on the fact that you were not able to export the bacpac before, and now you are.

kai74 commented 4 years ago

Hi Mötz

It is all done. The environment is up and "spinning". It took 38 hours to import the database.

So this issue is solved.

I really appreciate all the help.

Best regards Kai


Splaxi commented 4 years ago

I'm happy that you were able to get it into your environment.

I don't know if you already know, but we actually have a cmdlet / function in place that can clear table data in a bacpac file. Let's say that you don't want all the batch history and stuff like that. Then the tools can remove that from the bacpac file.

It will extract the bacpac file, which is just a zip file, remove the data of the desired table(s) and create a new bacpac file. So you will need around 3 times the size of the bacpac file in free storage to be able to handle this. The upside is that you don't need a SQL Server to import it into, clean it up and then export it again. So it should be faster than that.

For each GB of data you delete, you will speed up the import process of the bacpac file.
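As a rough sketch of that rewrite step (assuming table data sits under a Data/&lt;schema&gt;.&lt;table&gt;/ folder inside the archive; this is an illustration, not the module's actual implementation):

```python
import zipfile

def strip_table(src: str, dst: str, table: str) -> None:
    """Copy a bacpac archive, skipping the data files of one table.

    Assumes entries like 'Data/<schema>.<table>/*.bcp'; the real cmdlet
    also has to keep the model metadata consistent.
    """
    prefix = f"Data/{table}/"
    with zipfile.ZipFile(src) as zin, zipfile.ZipFile(dst, "w") as zout:
        for item in zin.infolist():
            if not item.filename.startswith(prefix):
                zout.writestr(item, zin.read(item.filename))

# Demo with a dummy archive standing in for a real bacpac
with zipfile.ZipFile("demo.bacpac", "w") as z:
    z.writestr("model.xml", "<model/>")
    z.writestr("Data/dbo.BATCHHISTORY/data-0.bcp", "rows")
    z.writestr("Data/dbo.CUSTTABLE/data-0.bcp", "rows")

strip_table("demo.bacpac", "demo-clean.bacpac", "dbo.BATCHHISTORY")
with zipfile.ZipFile("demo-clean.bacpac") as z:
    print(sorted(z.namelist()))  # ['Data/dbo.CUSTTABLE/data-0.bcp', 'model.xml']
```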

Splaxi commented 4 years ago

https://github.com/d365collaborative/d365fo.tools/blob/development/docs/Clear-D365TableDataFromBacpac.md

kai74 commented 4 years ago

Yes, i have seen it.

It would be nice if you could make this a parameter, so it can do the cleanup before it starts the export from SQL. Then it will be faster :). It really takes some time to unzip the bacpac when it is this big :)

We only have 30 days of data in the batch history table, but time saved is time earned.

The most significant tables in our database are retail transactions.

table                            used_mb   allocated_mb
dbo.RETAILTRANSACTIONSALESTRANS  69728.22  70182.29
dbo.SALESLINE                    42045.52  42047.38
dbo.LEDGERJOURNALTRANS           37386.16  37389.61
dbo.TAXTRANS                     28638.43  28639.38
dbo.CUSTTRANS                    25698.20  25700.50
dbo.GENERALJOURNALACCOUNTENTRY   25579.59  25581.19
dbo.RETAILTRANSACTIONTABLE       18579.96  18696.79
dbo.INVENTTRANS                  18393.88  18394.91
dbo.INVENTSUMLOGTTS              16348.52  16349.59
dbo.SOURCEDOCUMENTLINE           12172.36  12173.17

And when we look at the number of rows:

TableName                           IndexName                   Rows      TotalPages
INVENTSUMLOGTTS                     I_7153TTSIDIDX              99892173  598922
GENERALJOURNALACCOUNTENTRY          I_15242RECID                63109328  525778
INVENTTRANSPOSTING                  I_15502DATEVOUCHERTRANSIDX  57393634  330153
DMFSTAGINGEXECUTIONERRORS           I_9142RECID                 47104982  161185
SOURCEDOCUMENTLINE                  I_7752RECID                 43219078  187242
TAXTRANSGENERALJOURNALACCOUNTENTRY  I_1401RECID                 41736268  206194
TAXTRANS                            I_2292RECID                 40391317  385994
DIMENSIONFOCUSBALANCE               I_7445BALANCEQUERYIDX       30229919  172953
INVENTTRANS                         I_9569TRANSORIGINIDX        23504322  243049
INVENTTRANSORIGIN                   I_8551RECID                 23500786  116154
INVENTREPORTDIMHISTORY              I_5902TYPETRANSIDREFIDX     22013237  99666
SALESPARMLINE                       I_15005RECID                21815890  235137
RETAILTRANSACTIONSALESTRANS         I_13550TRANSACTIONLINEIDX   21094472  455082

Best regards Kai


Splaxi commented 4 years ago

Without knowing your environment - you should look into the -CustomSqlFile parameter of New-D365Bacpac

https://github.com/d365collaborative/d365fo.tools/blob/development/docs/New-D365Bacpac.md#-customsqlfile

This allows you to execute a single SQL file, containing whatever fits your needs. This will be executed BEFORE your export.
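A purely illustrative example of what such a file could contain (the table and column names below are assumptions; verify them against your own schema before use):

```sql
-- Hypothetical CustomSqlFile: trim old batch job history before the export.
-- BATCHJOBHISTORY and CREATEDDATETIME are assumed names; check your schema.
DELETE FROM dbo.BATCHJOBHISTORY
WHERE CREATEDDATETIME < DATEADD(DAY, -30, GETUTCDATE());
```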

I'm guessing that you are running on the first versions of a Tier2 environment, where you can log on to the AOS instances using RDP. So this should work for your scenario.

When you are served a Self-service environment, you shouldn't expect this module to be able to help you out. Then you would be forced to use the tools available from LCS, but hopefully by that time they are able to export the bacpac file for you. You will end up with what you have now, a bacpac file containing all the data. Then you will have to adopt the mentioned "Clear-D365TableDataFromBacpac" cmdlet / function if you want to remove data. That could be because of size or sensitivity.

kai74 commented 4 years ago

Yes, i have seen it.

I will look into it next time :)

Thanks once more.


Splaxi commented 4 years ago

This is solved in 0.5.64