dataplat / dbatools

🚀 SQL Server automation and instance migrations have never been safer, faster or freer
https://dbatools.io
MIT License

Backup-DbaDatabase with BackupFileName creates ps-errors #1890

Closed waffel closed 6 years ago

waffel commented 7 years ago

Bug Report

General Troubleshooting steps

Version Information

Steps to Reproduce

Running the following command in PowerShell ($sqlcred was set beforehand):

Backup-DbaDatabase -SqlInstance 'localhost,6201' -SqlCredential $sqlcred -Databases MYDB -BackupDirectory D:\applications\prg\Backup\sql -BackupFileName MYDB.bak -Type Full -CompressBackup
$error[0]

This produces the following output:

WARNING: [Backup-DbaDatabase][11:46:45] SQL Server cannot write to the location D:\applications\prg\Backup\sql\
New-Object : A constructor was not found. Cannot find an appropriate constructor for type System.IO.FileInfo.
At line:364 char:13
+ ...            $file = New-Object System.IO.FileInfo($FinalBackupPath[0])
+                        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (:) [New-Object], PSArgumentException
    + FullyQualifiedErrorId : CannotFindAppropriateCtor,Microsoft.PowerShell.Commands.NewObjectCommand

Split-Path : Cannot bind argument to parameter 'Path' because it is an empty array.
At line:474 char:30
+                 BackupFile = (split-path $FinalBackupPath -leaf)
+                                          ~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidData: (:) [Split-Path], ParameterBindingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationErrorEmptyArrayNotAllowed,Microsoft.PowerShell.Commands.SplitPathCommand

It works when running without the -BackupFileName option (and creates the filename with the date-time stamp fine). It seems the -BackupFileName option is broken somehow.

Stuart-Moore commented 7 years ago

Thanks for the full error report. I'll have a look at it.

Stuart-Moore commented 7 years ago

Hi, did the backup actually complete? The only way I could reproduce this was with a missing drive spec. Removing the filename stopped the nasty warning, but the output wasn't for a completion: (screenshot attached)

I'm putting some better path checks in, but want to make sure it was that causing the issue rather than something else.

waffel commented 7 years ago

Yep ... without the parameter (only directory) the job completes fine:

Backup-DbaDatabase -SqlInstance 'localhost,6201' -SqlCredential $sqlcred -Databases MYDB -BackupDirectory D:\applications\prg\Backup\sql -Type Full -CompressBackup
SqlInstance      : localhost,6201
DatabaseName     : MYDB
BackupComplete   : True
BackupFilesCount : 1
BackupFile       : MYDB_201707251331.bak
BackupFolder     : D:\applications\prg\Backup\sql
BackupPath       : D:\applications\prg\Backup\sql\MYDB_201707251331.bak
Script           : BACKUP DATABASE [MYDB] TO  DISK = N'D:\applications\prg\Backup\sql\MYDB_201707251331.bak' WITH NOFORMAT, NOINIT, NOSKIP, REWIND, 
                   NOUNLOAD, COMPRESSION,  STATS = 1
Verified         : False
Type             : Full

Stuart-Moore commented 7 years ago

Afraid I've just not been able to reproduce this, and we've not had any other reports, so I'm closing this as cannot repro.

It was potentially an issue with the client machine, as I can't imagine any machine not having System.IO.FileInfo available; it's a fairly common .NET type.

amaechler commented 6 years ago

@Stuart-Moore I was running into the same issue on a server where the credentials used do not have permission to read from the filesystem (but do have permission to write).

Stuart-Moore commented 6 years ago

Hi @waffel, you're using a really old version of the module. Could you update to the latest and try again please?

Andreas, that's a special case. I've added an ignorefilechecks switch in the backupnames branch if you want to take it for a test drive.
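A minimal sketch of how that might be invoked, assuming the switch surfaces as -IgnoreFileChecks (the exact name and casing in the branch may differ; the rest of the command is borrowed from the repro above):

Backup-DbaDatabase -SqlInstance 'localhost,6201' -SqlCredential $sqlcred -Database MYDB -BackupDirectory D:\applications\prg\Backup\sql -BackupFileName MYDB.bak -Type Full -CompressBackup -IgnoreFileChecks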

amaechler commented 6 years ago

@Stuart-Moore I gave your branch a quick test, but I'm still running into an error. If I'm not mistaken, setting IgnoreFileCheck to $true now causes $FinalBackupPath to never be set.

waffel commented 6 years ago

OK, it works with dbatools 0.9.325.

There are other issues with the command itself, but I will create a separate issue for those.

This can be closed. The BackupFileName parameter works and is honored.

Thanks for pinging me back!

Stuart-Moore commented 6 years ago

Reopening this as @amaechler's issue is different and has exposed some flaws in the path parsing, so I'll hang the PR off this.

Good to meet you at PSConfEU @waffel

Stuart-Moore commented 6 years ago

After some more digging, I believe this is expected behaviour from SQL Server, and hence how SMO works:

I created a WriteOnly folder for my SQL Service account (c:\sqlwo3):

backup database master to disk='c:\sqlwo3\test2.bak' - works as expected
restore headeronly from disk='c:\sqlwo3\test2.bak' - fails as expected
backup database master to disk='c:\sqlwo3\test2.bak' - fails

As we're appending another backup to the file, SQL Server needs to read the headers so it knows the offset to start pushing data in for the new backup.

When a filename is provided we're going to explicitly write into the file, so if it already exists it's going to throw a big error no matter what we do, as the underlying SMO method is going to die horribly.

When you don't specify a filename (just a directory) we generate the filename with a timestamp on the end, and that timestamp is only down to the minute. So if you did two quick backups in succession it would also fail; with a bit more time between them it's all good.
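A tiny PowerShell sketch of why that bites (the yyyyMMddHHmm format is inferred from the generated name shown earlier, e.g. MYDB_201707251331.bak; this is not the module's actual code):

$dbName = 'MYDB'
# Default-style name is only precise to the minute
$name1 = '{0}_{1}.bak' -f $dbName, (Get-Date -Format 'yyyyMMddHHmm')
Start-Sleep -Seconds 5
$name2 = '{0}_{1}.bak' -f $dbName, (Get-Date -Format 'yyyyMMddHHmm')
$name1 -eq $name2   # usually True, so the second backup targets (and appends to) the same file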

amaechler commented 6 years ago

@Stuart-Moore Good points and analysis.

I think the ability to provide your own filename (assuming you know that the file does not exist, as based on your analysis it will fail otherwise) is still beneficial, especially in the context of scripting something where Backup-DbaDatabase is just one step.

I initially discovered this issue while writing a script that takes a backup using Backup-DbaDatabase, copies the file somewhere else and then does some more modifications to it. Ideally, I could pass in the filename (maybe even creating the same timestamp filename format) and then use that to manipulate the file further.

As a workaround, I currently call

$tempBackupFileName = "${SourceDatabase}_$(Get-Date -Format yyyyMMddHHmm).bak"

immediately before calling Backup-DbaDatabase to get a handle on the filename (which could potentially fail if the call happens just before the minute rolls over).
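For context, a fuller sketch of that workaround: predict the auto-generated name just before the call, back up with only -BackupDirectory, then use the predicted name to find the file. Server, credentials, and paths are borrowed from the repro earlier in this thread, and the copy destination is a made-up placeholder:

$SourceDatabase = 'MYDB'
# Predict the name the module will generate (racy if the minute rolls over in between)
$tempBackupFileName = "${SourceDatabase}_$(Get-Date -Format yyyyMMddHHmm).bak"
Backup-DbaDatabase -SqlInstance 'localhost,6201' -SqlCredential $sqlcred -Database $SourceDatabase -BackupDirectory D:\applications\prg\Backup\sql -Type Full
# Pick up the file for further processing using the predicted name
Copy-Item -Path "D:\applications\prg\Backup\sql\$tempBackupFileName" -Destination '\\fileshare\archive\'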

Stuart-Moore commented 6 years ago

PR is in and will probably get merged later. In your case, you can specify a file name, which will be honoured. Either of:

Backup-DbaDatabase -SqlInstance server -database db1 -BackupFileName c:\backups\backup.bak

or

Backup-DbaDatabase -SqlInstance server -database db1 -BackupDirectory c:\backups\ -BackupFileName backup.bak

So you can pre-generate the filename beforehand and pass it in:

$tempBackupFileName = "${SourceDatabase}_$(Get-Date -Format yyyyMMddHHmm).bak"
Backup-DbaDatabase -SqlInstance server -database db1 -BackupDirectory c:\backups\ -BackupFileName $tempBackupFileName
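Alternatively, since the output object (as shown earlier in this thread) includes a BackupPath property, here is a sketch that skips pre-generating the name entirely; the copy destination is a placeholder:

$backup = Backup-DbaDatabase -SqlInstance server -Database db1 -BackupDirectory c:\backups\
# BackupPath on the returned object points at the file that was just written
Copy-Item -Path $backup.BackupPath -Destination '\\fileshare\archive\'
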
Stuart-Moore commented 6 years ago

Fixed with #3558