ticketmaster / poshspec

Infrastructure Testing DSL running in Pester
MIT License

[Feature Request] Ability to run remote tests and grouping of tests #6

Open rchaganti opened 8 years ago

rchaganti commented 8 years ago

I was working on a similar project as poshspec internally but did not make much progress. Now that I see this project, I don't think I need to invest in creating a new framework altogether. However, here is something I was trying to achieve in my code.

node host1, host2, host3 {
    test1 param1 { should be value1 }
    test2 param2 { should be value2 }
    test3 param3 { should be value3 }
}

The Node block lets me specify a set of remote systems where the tests must be executed. I was targeting PS Remoting to execute the tests remotely and assert the results locally. For this, I started modifying Remotely, adding support for specifying node names as part of the DSL and supplying a credential hash when needed. Here is how the tests are written today.

$CredHash = @{
    'VM1' = (Get-Credential)
}

Describe "Add-Numbers" {
    It "adds positive numbers on two remote systems" {
        Remotely 'VM1','VM2' { 2 + 3 } | Should Be 5
    }

    It "gets verbose message" {
        $sum = Remotely 'VM1','VM2' { Write-Verbose -Verbose "Test Message" }
        $sum.GetVerbose() | Should Be "Test Message"
    }

    It "can pass parameters to remote block with different credentials" {
        $num = 10
        $process = Remotely 'VM1' { param($number) $number + 1 } -ArgumentList $num -CredentialHash $CredHash
        $process | Should Be 11
    }
}

However, this is still not sufficient, as I don't have the remote system context in the assertion result from Should. This is where I was thinking I would put a higher-level wrapper on top of this and create something like what I proposed in my first example.

At the end of all this, there are two goals:

cdhunt commented 8 years ago

I made an early decision to not include any remote execution as part of Poshspec to keep the module focused as a collection of simple tests. Adding remote execution into the tests could cause a lot of complication that impacts your test results. If one machine times out, do you fail the whole test? Do you create a test for every machine? I wasn't looking to build a module for auditing all of your infrastructure with a single script. Let DSC do that for you. Use Poshspec in your Infrastructure Pipeline to test that your DSC configuration (or other automation) is producing a functional system and then use DSC to apply and monitor that configuration to all your systems.

That being said, I have a suggested workflow for running Poshspec scripts across a number of systems. The Operation Validation Framework is designed to let you package Pester-based functional tests into a module for easy distribution and execution. With OVF and an internal PowerShell Gallery, you will have a versioned repository of tests that you can install on any number of systems with just a couple of commands.

Maybe what you are trying to accomplish belongs in OVF, which is focused on the execution of these types of tests.
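That workflow might look something like the sketch below. The repository URL and module name are placeholders, but `Register-PSRepository`, `Install-Module`, and `Invoke-OperationValidation` are real commands from PowerShellGet and the OperationValidation module:

```powershell
# Publish your Poshspec test module to an internal gallery, then on each target system:

# Register the internal gallery (the source URL here is a placeholder)
Register-PSRepository -Name 'InternalGallery' -SourceLocation 'https://gallery.contoso.com/api/v2'

# Install a specific, versioned package of tests from that gallery
Install-Module -Name 'MyPoshspecTests' -Repository 'InternalGallery' -RequiredVersion '1.0.0'

# Discover and run the packaged Operation Validation tests
Invoke-OperationValidation -ModuleName 'MyPoshspecTests'
```

Because the tests are versioned modules, you can roll the same validated test package across any number of systems and know exactly which revision ran where.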

I would definitely like to keep the conversation going.

rchaganti commented 8 years ago

I am not necessarily looking at only the CI/CD pipeline. Many of the operational tests that I write work on similarly configured systems (think of a cluster) and ensure that they are functional. I am not just checking that a resource is in the desired state; I want to know whether that desired state is also the desired functional state. Also, several of my tests double as diagnostic tests. So the ability to run tests and collect results from remote systems, using either a single script or a collection of scripts, is very much desired. For me, operations validation is not about tests running on each individual system. I was (and still am) trying to build the capability to treat my infrastructure as a single entity, irrespective of how many components it has.

OVF isn't the answer either, at the moment. It is nothing but a wrapper around Pester tests that formats the output differently and packages tests as modules.

Well, our goals are a little different. I will continue to evolve Remotely and see where we can meet again! :)

cdhunt commented 8 years ago

I totally get using Pester/Poshspec for continuous functional validation and even monitoring. However, in that case, I still don't see remote execution and reporting as feature requirements for this module at the moment. I think that should be handled by something else in the pipeline. Something like Remotely.

I'd like to hear what others think as well so I'm leaving the Issue open.

juanfperezperez commented 8 years ago

Because of the way a lot of the tests in this module work, it makes sense that they all be run locally. To use Pester/Poshspec to test remote systems, it would make more sense to deploy the entire test suite to the remote system and run it there locally, as opposed to running every individual test remotely. That way, if the connection to the remote system fails, you can address that issue specifically without waiting for all the tests to fail. If what is being tested is the ability to get at the information remotely, then that would be a totally different set of tests, and both the locally run and the remotely run sets of tests can be part of a single overarching testing solution that utilizes Pester, Poshspec, OVF, and any other tool necessary to facilitate the aggregation of the results.

jhoneill commented 8 years ago

My first thought was "the framework does not have support for a computer name." There are some tests where you want to know (for example) that a service is running on each of 3 servers, but you don't want to, or can't, deploy modules to those servers. So some remoting is required.
Some of the tests only make sense to run locally, while others (Service, CimObject, Hotfix, LocalGroup) are wrappers for commands which take -ComputerName. Wrapping them in a way which prevents using that parameter is something one should only do if it is completely unavoidable.

Adding a -ComputerName parameter to Get-PoshspecParam would mean that the individual tests would have the option to run remotely. The TestExpression just needs to include $ComputerName in the same way that $Target is already used (and the name string would need a tweak to include it).
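As a rough sketch of that idea (hypothetical: Poshspec's Service test does not take -ComputerName today, and the shape of Get-PoshspecParam's output is simplified here), the test expression would just carry the node name through to the underlying cmdlet:

```powershell
# Hypothetical usage if Service gained a -ComputerName parameter:
#   Service w32time Status -ComputerName FS-01 { Should Be Running }

$ComputerName = 'FS-01'   # placeholder node name

# Get-PoshspecParam might then build an expression like this,
# interpolating $ComputerName alongside the existing $Target:
$TestExpression = "Get-Service -Name w32time -ComputerName $ComputerName | Select-Object -ExpandProperty Status"

# ...and a test name that surfaces the node in the Pester output:
$TestName = "Service property 'Status' for 'w32time' on '$ComputerName' Should Be Running"
```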

I can understand not wanting to broaden the scope, but if you end up with a separate version to support remoting... not good.

cdhunt commented 8 years ago

I've started a branch to work on remote execution support. My plan is to use Invoke-Command to handle the remote execution. That way it is handled consistently module wide regardless of the Cmdlet called and I can also use the new options for Invoke-Command like -ContainerName and -VMName.

For example:

Describe 'Services' {    
    OnNode somehost {
        Service w32time Status { Should Be Running }
    }

    OnContainer happy_gates {
        Service w32time Status { Should Be Running }
    }

    OnVM wintest2008 {
        Service w32time Status { Should Be Running }
    }
}

In theory, the On* functions will accept an array and just loop through them executing tests within that scope for all nodes. This way you'll get 10 tests for 10 servers instead of one test that might fail if a single machine isn't reachable.
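A minimal sketch of that looping behavior (hypothetical; the actual branch may differ, and OnContainer/OnVM would swap -ComputerName for Invoke-Command's -ContainerName/-VMName parameters):

```powershell
function OnNode {
    param(
        [string[]]$Name,          # one or more target nodes
        [scriptblock]$Fixture     # the Poshspec tests to run in that scope
    )
    foreach ($node in $Name) {
        # Each node gets its own pass over the fixture, so 10 servers
        # yield 10 tests rather than one test that fails as a group
        # when a single machine is unreachable.
        Invoke-Command -ComputerName $node -ScriptBlock $Fixture
    }
}
```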

I'm thinking of changing the Test name formatting to be machine readable instead of human readable so someone could easily parse it and do more with the results like group/filter based on ComputerName or Test type.
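For instance (purely illustrative; the delimiter and field order here are assumptions, not a committed format), a machine-readable name could be split back into its parts for grouping and filtering:

```powershell
# Hypothetical machine-readable test name: "ComputerName|TestType|Target|Property"
$name = 'somehost|Service|w32time|Status'

# Split the name into its fields for grouping/filtering in reports
$ComputerName, $TestType, $Target, $Property = $name -split '\|'
```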

jhoneill commented 8 years ago

I was imagining something which looked more like this, where the code in Service handles the remoting:

Describe "My File Services" {
    service lanmanserver status { should be running } -onNode FS-01
    service lanmanserver status { should be running } -onNode FS-02
}

For the same test being run on multiple servers, what you're proposing is neater:

Describe "My File Services" {
    onNode FS-01, FS-02 {
        service lanmanserver status { should be running }
    }
}

When you want to test 10 different things, each on a different server, it's not so neat. I was convinced that was the most common case, but the more I think about it, the more I think it's the minority. So let's try what you're proposing and see :-)

DexterPOSH commented 7 years ago

@cdhunt I have worked on a similar project for in-house use, with a lot of guidance from @rchaganti. We started with a fork of the Remotely project on GitHub and then grew out of it by creating a DSL for remote Ops validation. I am in the process of cleaning it up and making it public here on GitHub.

It can use the tests written in Poshspec with slight changes to accept a parameter hash. Something like the below:

# DSC-style ConfigurationData for environment configurations
$ConfigData = @{
    AllNodes = @(
        @{
            NodeName='*';
            DomainFQDN='dexter.lab';
            DomainDC = 'AD.dexter.lab'
        },
        @{
            NodeName="Hyper-VHost1";
            Type='Compute';
            DeploymentStatus = 'Deployed';
            ManagementIPv4Address = '192.168.1.101';
            Storage1IPv4Address = '192.168.10.101';
            Storage2IPv4Address = '192.168.20.101'
        },
        @{
            NodeName='NAS01';
            Type='Storage';
        }
    )
}    

PSRemotely -ConfigurationData $ConfigData {
    #region compute nodes block
    Node $AllNodes.Where({$PSitem.Type -eq 'Compute'}).NodeName {

        # these tests validate the connectivity using a specific NIC
        Describe 'TestADConnectivity' -Tags AD {

            Context 'AD reachable over Mgmt network' {
                #TCPPortWithSourceAddress @ParamHash { Should Be $true }
                TCPPortWithSourceAddress @{ComputerName=$node.DomainDC;SourceIP=$node.ManagementIPv4Address;Port=389}  { Should Be $true }  
            }   
        }

    }
}

Let me know if this interests you.