packer-community / winrmcp

Copy files to a remote host using WinRM
MIT License

Externalize the powershell code? #4

Open · maxlinc opened this issue 9 years ago

maxlinc commented 9 years ago

I think it might be good to externalize the PowerShell code rather than embedding it within Go. I've been thinking the same thing for a while about https://github.com/WinRb/winrm-fs, so I thought I'd start a discussion here.

/cc @sneal because I could have just as easily written this issue on winrb/winrm-fs.

Here's what I'm talking about. The current code has stuff like this:

    script := fmt.Sprintf(`
        $tmp_file_path = [System.IO.Path]::GetFullPath("%s")
        $dest_file_path = [System.IO.Path]::GetFullPath("%s")
        if (Test-Path $dest_file_path) {
            rm $dest_file_path
        }
        else {
            $dest_dir = ([System.IO.Path]::GetDirectoryName($dest_file_path))
            New-Item -ItemType directory -Force -ErrorAction SilentlyContinue -Path $dest_dir | Out-Null
        }
        if (Test-Path $tmp_file_path) {
            $base64_lines = Get-Content $tmp_file_path
            $base64_string = [string]::join("",$base64_lines)
            $bytes = [System.Convert]::FromBase64String($base64_string) 
            [System.IO.File]::WriteAllBytes($dest_file_path, $bytes)
        } else {
            echo $null > $dest_file_path
        }
    `, fromPath, toPath)

I think there would be some real benefits to doing that, so the final Go code would become:

    // Something to load the function.
    // This could be as simple as reading the .ps1 file and sending the content via WinRM,
    // or something like `Install-Module -Name WinRMFileUtils` (see advantages below).
    script := fmt.Sprintf(`RestoreFile "%s" "%s"`, fromPath, toPath)

The advantages are:

I'm not saying you should do all of this right away. The first step would just be moving the PowerShell code from a .go file into a .ps1 file. That clears the path for things like Pester testing or sharing code with winrm-fs, but doesn't require it.

Thoughts?

sneal commented 9 years ago

@maxlinc I agree, take a look at this.

dylanmei commented 9 years ago

Absolutely.

The "Elevated Shell" / Scheduled Task problem, which isn't what you are addressing specifically, is 10x as bleh and tedious as the file restore.

I think you were saying this, but I want to be sure: for winrmcp (and its main use-case, Packer) I can't assume an internet connection, so fetching dependencies (PowerShellGet or otherwise) at runtime is a no-go. But having blessed files/modules that we collaborate on, and that could be bundled into this project as a build dependency, would be amazing.

maxlinc commented 9 years ago

Yeah, I figured there'd be a chicken/egg issue w/ the module that ships files. That one might need to directly execute the code to define the RestoreFile function, but at least once you do that you can use it to ship .nupkg or .psm1 files to load additional modules w/ higher-level features. You could also try Find-Module first, so that this only has to happen once, and only if there isn't an internet connection or an internal PowerShellGet repository. If the most recent version of the module has already been transferred, you can just import it.

In fact, it's probably worth playing around with nupkg files anyway. They're basically just zip files w/ extra metadata, but since it's how nuget/chocolatey/oneget and octopusdeploy all ship files to Windows, it's probably worth taking a closer look to see if there's any benefit over a normal zip. Either nupkg or zip might be a good way to do a "dumb" transfer (like scp'ing and extracting a tarball), which is fine for sending PowerShell modules, but there's still a need for "smart" (more rsync-like) behavior to sync larger files and folders.

FYI: OneGet/PowerShellGet is still incubating and so won't be available on older versions of Windows, but zip support is available in PowerShell... as long as they aren't password-protected zips...