devblackops opened 8 years ago
I just closed the pull request about integration tests I had opened quite a while ago.
Do you think unit tests will be cost-effective? Most of the functions in the module are very shallow (they do not contain much logic), to the point that I started generating some of them. Also, even if we could mock a NetScaler with a tool like http://wiremock.org/, writing the mock would require very fine-grained knowledge of the Nitro API.
That is why I thought at the time that "integration tests" that actually talk to a NetScaler would be more useful. What do you think?
Ya, testing is hard 😄.
I like the idea of integration tests but I'm not sure on the best way to implement that on a module that is 100% dependent on an external service. I'm sure there are some patterns out there but I have a feeling they will just come back to extensive use of mocked objects or hitting a real API.
It looks like it is possible to spin up a throwaway NetScaler instance in AWS/Azure. That may be a viable solution if it only needs to be running for a few minutes.
Yes, I agree. I would add that in our case "extensive" is synonymous with "expensive" when it comes to mock objects, since we have no way of knowing what the API actually does without checking against a live system. I would even argue that creating the mocks would be more expensive than creating the actual code and tests. The way I see it, you have already pushed the unit tests close to their maximum ROI.
The path I was following was to set up a vanilla NetScaler instance on my laptop and then use it to test in situ that the module functions work as expected. I tinkered with several building blocks that could be combined. A very simple test would be:
```powershell
Describe "Netscaler" {
    $Session = Connect-Netscaler -Hostname $Nsip -Credential $Credential -PassThru

    BeforeEach { $OldConf = Get-NSConfig }
    AfterEach  { Clear-NSConfig -Force }

    It "should add a LB server" {
        New-NSLBServer -Name 'srv-test' -IPAddress 1.2.3.4
        Compare-NSConfig $OldConf | Should Match "=> add server srv-test 1.2.3.4"
    }
}
```
What I have done so far is available in this branch: https://github.com/dbroeglin/NetScaler/blob/feature/it-tests/ along with my notes on how to initially set up the NS instance in the TESTING.md file.
I like the idea of spinning up an instance on AWS/Azure: my setup works nicely for development on a laptop, but it lacks automation when building with AppVeyor. FYI, it is not right at the top of my TODO list, but automatically building a NS instance in Azure is something I need to look into anyway.
Does the above make sense to you, or do you think I'm spinning my wheels?
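For the Azure route, a throwaway instance could be driven from the Az PowerShell module. The sketch below is purely illustrative: the helper name, resource group, VM size, and especially the marketplace image URN are my own placeholders, not values from this repository; the actual Citrix VPX offer would need to be looked up in the Azure Marketplace.

```powershell
# Hypothetical sketch: build the parameter splat for creating a throwaway
# NetScaler VPX VM with New-AzVM, so the whole environment can later be
# destroyed with a single Remove-AzResourceGroup. All values are placeholders.
function New-ThrowawayNsVmParams {
    param(
        [string]$ResourceGroup = 'ns-it-tests',
        [string]$Location      = 'westeurope',
        [string]$Name          = 'ns-vpx-test'
    )
    @{
        ResourceGroupName = $ResourceGroup
        Location          = $Location
        Name              = $Name
        Image             = 'citrix:netscalervpx:...:latest' # placeholder URN, look up the real one
        Size              = 'Standard_DS2_v2'                # assumed size
    }
}
```

Usage would then be something like `$p = New-ThrowawayNsVmParams; New-AzVM @p`, followed by `Remove-AzResourceGroup -Name $p.ResourceGroupName -Force` in an `AfterAll` block once the integration tests finish.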
Hi, sorry for the long interlude. I was finally able to work on the subject again.
This branch contains a first example of what I had in mind: https://github.com/dbroeglin/NetScaler/tree/feature/integration-tests
The README documents how to set up your environment. A very trivial example of tests can be seen here: https://github.com/dbroeglin/NetScaler/blob/feature/integration-tests/ITTests/Simple.ITTests.ps1
Configuration items like the NSIP, username, and password are not yet properly externalized:
https://github.com/dbroeglin/NetScaler/blob/feature/integration-tests/ITTests/TestSupport.ps1
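One way to externalize that configuration would be environment variables with hard-coded fallbacks. This is a minimal sketch of the idea; the variable names and the fallback values (aside from `nsroot`, which is the well-known NetScaler factory credential) are my own assumptions, not what TestSupport.ps1 actually does.

```powershell
# Hypothetical sketch: read integration-test settings from environment
# variables, falling back to lab defaults when they are unset.
function Get-ITTestConfig {
    $nsip = $env:NS_IP
    if (-not $nsip)     { $nsip = '192.168.56.10' }   # placeholder lab address
    $username = $env:NS_USERNAME
    if (-not $username) { $username = 'nsroot' }       # NetScaler factory default
    $password = $env:NS_PASSWORD
    if (-not $password) { $password = 'nsroot' }

    [pscustomobject]@{
        Nsip     = $nsip
        Username = $username
        Password = $password
    }
}
```

This would let AppVeyor (or any CI) inject the real values as secure environment variables while local runs keep working unchanged.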
If you think this testing avenue is worth pursuing, I will work on it more and provide ITTests that are more relevant (I already have a few cases in mind from what I do at work).
Also, you might be interested in https://github.com/dbroeglin/windows-lab2, which is a slightly different approach. In that project I want to actually test functionality (like ADFS authentication, reverse proxy, load balancing, Kerberos protocol transition, etc.), so I'm creating a full lab with the Lability module (https://github.com/VirtualEngine/Lability).
@devblackops any thoughts on my proposal?
In order to test and maintain the quality of this module, unit tests for all functions should be created. We already have tests for meta information such as comment-based help, the module manifest, and text encoding. In the dev branch, I've reorganized the `Tests` folder a bit to hopefully help manage all the various tests.

There is a commented-out test script at `Tests/meta/Function.Coverage.tests.ps1` that is intended to verify that all files under `NetScaler/Public` and `NetScaler/Private` have a matching test script in `Tests/unit/public/` or `Tests/unit/private/` respectively. I'm leaving this commented out for now to keep the build from failing until the various tests are created. It will probably be a while until all the tests are written, and that is OK.

Since this module relies so heavily on an external endpoint (NetScaler), I imagine the unit tests will make extensive use of mocking. I created a folder at `Tests/unit/mocks` to hold mock objects that can be reused among the tests.

I'm not entirely sure this is the best approach to managing the tests and I'm open to alternative ways of structuring them. I just thought I'd throw this out there.
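As an illustration of what a reusable mock under `Tests/unit/mocks` might look like, here is a sketch of a canned Nitro-style response builder. The function name is hypothetical, and the `{ errorcode, message, <resource> }` shape should be verified against a live system before relying on it.

```powershell
# Hypothetical reusable mock: a canned Nitro-style response for an LB server
# query, which unit tests can return from a Pester Mock of Invoke-RestMethod.
function New-MockNSLBServerResponse {
    param(
        [string]$Name      = 'srv-test',
        [string]$IPAddress = '1.2.3.4'
    )
    [pscustomobject]@{
        errorcode = 0
        message   = 'Done'
        server    = @(
            [pscustomobject]@{ name = $Name; ipaddress = $IPAddress }
        )
    }
}
```

Inside a test this could be wired up with something like `Mock Invoke-RestMethod { New-MockNSLBServerResponse }` so each function's unit test shares the same canned payloads instead of duplicating them.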
Expected Behavior

Create a Pester unit test file under `Tests/unit/public/<function-name>.tests.ps1` for all publicly exposed functions and a unit test file under `Tests/unit/private/<function-name>.tests.ps1` for private ones.

Current Behavior
Missing unit tests for all functions.
Possible Solution
Create a Pester test file for each public/private function.
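The coverage check described above could be reduced to a small helper that lists source functions without a matching test file, which the commented-out `Function.Coverage.tests.ps1` could then assert is empty. This is only a sketch of the idea; the helper name is mine, not code from the repository.

```powershell
# Hypothetical sketch: list every function script in a source folder (e.g.
# NetScaler/Public) that has no matching <name>.tests.ps1 in the test folder.
function Get-MissingFunctionTests {
    param(
        [string]$SourceDir,
        [string]$TestDir
    )
    Get-ChildItem -Path $SourceDir -Filter '*.ps1' |
        Where-Object { -not (Test-Path (Join-Path $TestDir "$($_.BaseName).tests.ps1")) } |
        ForEach-Object { $_.BaseName }
}
```

A meta-test would then run this once for `NetScaler/Public` against `Tests/unit/public/` and once for `NetScaler/Private` against `Tests/unit/private/`, failing the build whenever the result is non-empty.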
Context
Improve the quality of the module.