AdamMajer opened this issue 9 years ago (status: Open)
Great idea, this would be very cool to have (ideally in Travis CI or, if that's not feasible, in a Dockerized workload triggered by SUSE's internal Jenkins host).
Pull requests are very much welcome in this direction!
This can be done in Travis CI, to an extent, but that approach may be overkill, since the aim here is to test functionality across Salt versions. For regular testing, pre-generated test cases should suffice.
I propose automatically generating the tests (and their recorded responses) against a real Salt instance during a separate "generate_tests" phase. Then we get the best of both worlds - reasonably fast unit tests, and tests against real Salt during the "generate_tests" phase. I think it would be sufficient to test, e.g.:

* status calls
* packages
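To make the recording half of that idea concrete, here is a minimal Python sketch. The fixture directory, file-naming scheme, and sample payload are all hypothetical; in a real "generate_tests" phase the response dict would come from a live salt-api call rather than a canned value.

```python
import json
from pathlib import Path

def record_fixture(fun, response, out_dir="fixtures"):
    """Persist a live Salt response as a JSON fixture named after the call.

    `fun` is the Salt function name (e.g. "status.diskusage");
    `response` is the decoded JSON body returned by salt-api.
    """
    path = Path(out_dir) / (fun.replace(".", "_") + ".json")
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(response, indent=2, sort_keys=True))
    return path

# Canned response standing in for a real salt-api call (hypothetical data).
p = record_fixture("status.diskusage",
                   {"return": [{"minion1": {"/": {"available": 1234}}}]},
                   out_dir="/tmp/fixtures")
print(p.name)  # status_diskusage.json
```

The recorded files can then be served back by the existing mock layer in the fast unit-test phase.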
Automatically generated tests would also save a lot of boilerplate code in the unit tests, in addition to capturing up-to-date Salt responses.
I will see if I can get to this on Sunday or sometime next week. I've had very little free time in the last few weeks :cry:
Unit tests are a great resource, but there should also be a way of verifying that the API works correctly against a real Salt server instead of just a WireMock stub.
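For context, a WireMock stub mapping for a canned Salt API response looks roughly like this (the URL and fixture file name are illustrative, not taken from this repository):

```json
{
  "request": {
    "method": "POST",
    "url": "/run"
  },
  "response": {
    "status": 200,
    "headers": { "Content-Type": "application/json" },
    "bodyFileName": "status_diskusage.json"
  }
}
```

Such a stub only ever replays what was recorded, which is exactly why it cannot catch upstream changes on its own.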
Recently, dozens of Salt calls have been added to the library. Most of these are simple convenience mappings, and they probably work just fine today. But what happens if Salt changes - e.g. an upstream decision to change something? Then maybe "status.diskusage" no longer exists for some reason, or it is replaced with a different return format. As it stands, we cannot detect that: there are no automated tests to verify that the API remains valid (please correct me if I'm wrong here!).
There should be an automatic way to test this entire API vs. a real server.
Comments welcome.