Closed: ktbyers closed this issue 5 years ago
Using unittest.mock or anything else in mind (like the MockSSH server) or both?
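For context, a minimal sketch of what the unittest.mock route could look like. The `get_version` helper and the output string are invented for illustration; only `send_command` mirrors Netmiko's real connection API:

```python
from unittest import mock

# Hypothetical helper standing in for real Netmiko-using code.
def get_version(conn):
    return conn.send_command("show version")

def test_get_version():
    # A Mock replaces the live connection, so no device is needed.
    fake_conn = mock.Mock()
    fake_conn.send_command.return_value = "Cisco IOS XE, Version 16.9"
    assert get_version(fake_conn) == "Cisco IOS XE, Version 16.9"
    fake_conn.send_command.assert_called_once_with("show version")

test_get_version()
```

This only verifies the Python-side plumbing; it says nothing about how a real device would actually respond.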
I am probably not going to do this.
Right now I run tests against real devices using pytest. I do some sanity checks using Travis-CI (linting and verifying that import netmiko works).
I might add some automated execution of the tests (i.e. run the test suite nightly and have it email me the results). I run the test suite before any release (though I only have a set of platforms available to test against i.e. the platforms listed in the readme as being regularly tested).
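A sketch of how such a device-gated pytest test might be laid out. The NETMIKO_TEST_* environment variables and the test body are invented for illustration, not Netmiko's actual test harness; `ConnectHandler` and `send_command` are Netmiko's real API:

```python
import os

import pytest

# Hypothetical device parameters, filled from invented environment variables.
DEVICE = {
    "device_type": "cisco_ios",
    "host": os.environ.get("NETMIKO_TEST_HOST", ""),
    "username": os.environ.get("NETMIKO_TEST_USER", ""),
    "password": os.environ.get("NETMIKO_TEST_PASS", ""),
}

# Skip cleanly when no real device is configured, so CI without lab
# access still gets a green (but honest) run.
requires_device = pytest.mark.skipif(
    not DEVICE["host"], reason="no test device configured"
)

@requires_device
def test_show_version():
    from netmiko import ConnectHandler  # imported lazily; only needed with a device
    with ConnectHandler(**DEVICE) as conn:
        assert "Version" in conn.send_command("show version")
```

The skip marker is what lets the same suite run both in a lab (full coverage) and in CI (graceful degradation).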
I think more valuable than doing this is to expand the test suite that we run against real devices.
I agree that it is important and necessary to have integration tests, but the lack of unit and functional/mock tests means that contributors cannot test the code themselves, unless they have access to all those devices (they might have a few of them at best, but not all). Contributors should not have to create a pull request to test their code, that's inefficient.
Let me look at what you submitted in the PR. Unit tests are probably okay.
Let me think about Mock tests some more.
Note, they can test their code. The current situation just requires them to test it against their own device(s).
My point exactly, and that's why I can only test on Linux at home, which means my pull request might break the Juniper-related code and I would have to wait for Travis to tell me what is wrong.
About mock tests, I took a look at the tests in paramiko. They went for the mock server solution rather than mocking every single function for each test; that does indeed seem more reasonable.
https://github.com/paramiko/paramiko/blob/master/tests/test_client.py
My point is that mocking is going to tell you very little about whether you broke Juniper or not (what matters is the timing and the device's CLI behavior, i.e. devices behaving differently on the CLI than we expected).
Mock would not have helped in a meaningful way on your proposed timing changes (if anything it would probably deceive you i.e. give you a false sense you weren't breaking things when you were).
Paramiko is much easier to mock than Netmiko. Linux is much more uniform in its behavior and timing is much less of an issue. Sure we can mock it, but if we spend 100 hours on it and it doesn't provide us significant, useful information, then that is not an improvement (and maintaining mocked behavior is a lot of work also, i.e. it is not just N hours of one-time work).
Mock or no mock, you will have a very hard time making meaningful Netmiko changes without access to network devices (once again because device timing and device behavior are so important). Virtual network devices are pretty easy to obtain; you can also use labs.networktocode.com to spin up network devices on demand for a pretty nominal cost.
Let me think about mocking some more as there are some things it would provide useful information on.
From my packager point of view I would like to have at least some tests, so that I can check that my package is basically functional (no missing dependencies, the right files are on the right place, etc).
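For the packaging case, even a tiny smoke test goes a long way. A minimal sketch (the `smoke_test` helper is invented; the idea is just "import the package and check an entry point exists"):

```python
import importlib

def smoke_test(package, attr):
    """Verify the package imports and exposes the given public name.

    Raises ImportError or AssertionError if the install is broken
    (missing dependency, files in the wrong place, etc.).
    """
    mod = importlib.import_module(package)
    assert hasattr(mod, attr), f"{package} is missing {attr}"
    return mod

# For Netmiko, a packager might run:
#   smoke_test("netmiko", "ConnectHandler")
```

This needs no devices and no mocks, so it is a natural fit for downstream package checks.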
@dtantsur Yes, I agree adding more unit tests makes sense...we should definitely do that.
Mock tests I still want to think about some more. Right now instead of mock tests there is a functional test suite that gets run against a set of platforms (though only that set of test devices that I have in my lab environment).
Closing as there is not any active work going with this right now. It would be good to have, but neither I nor anyone else are working on it.
Hi! Are there any plans to add mock testing to Netmiko, like "scrapli replay" (https://scrapli.github.io/scrapli_replay/user_guide/basic_usage/) for "scrapli"?
This would improve testing and allow for better integration with CI tooling (when real devices aren't present/available).
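For anyone unfamiliar with the record/replay pattern: a recorded session maps commands to their captured outputs, and a replay connection serves them back without any device. The `Replayer` class and file format below are invented for illustration; they are not scrapli_replay's actual API:

```python
import json

class Replayer:
    """Serve recorded command -> output pairs as a fake connection."""

    def __init__(self, session):
        self._session = session  # dict: command string -> recorded output

    @classmethod
    def from_file(cls, path):
        # Recordings could be stored as plain JSON alongside the tests.
        with open(path) as fh:
            return cls(json.load(fh))

    def send_command(self, command):
        try:
            return self._session[command]
        except KeyError:
            # Fail loudly: an unrecorded command means the test drifted
            # from the captured session.
            raise RuntimeError(f"no recording for {command!r}")

session = {"show version": "FakeOS 1.0"}
conn = Replayer(session)
print(conn.send_command("show version"))  # FakeOS 1.0
```

The first (recorded) run still needs a real device, which addresses the earlier concern about device timing/behavior: the canned outputs come from actual hardware, not from a developer's guess.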