Closed · zenhack closed this 6 years ago
I think we should just move the client library tests to tests/integration/. That will also help us run all our integration tests in a separate Travis build, in parallel with the rest of the tests.
Makes sense to me. Should we do the directory move as a separate PR?
Yeah, that should be a separate PR.
Sounds good.
Ugh, we can't actually do what I described above, because the client tests do a bunch of work registering switches using the real hardware drivers -- so disabling dry-run would cause a bunch of problems there.
A small part of me is tempted to just make obmd a hard dependency for running the tests, and just remove the dry_run decorator from obm_enable/obm_disable. I have an instinctive aversion to this, but it's also not like there's a lot of setup involved -- you just need the obmd executable in your $PATH.
because the client tests do a bunch of work registering switches using the real hardware drivers -- so disabling dry-run would cause a bunch of problems there.
Sorry, I don't understand. We don't use dry_run for switches. I think only the headnode code and ipmi code uses that. And, the client library tests work just fine with dry_run disabled.
On a slightly related note, I'll try to refactor some of the client library code once my other 2 PRs are in (things I have in mind: load a minimal config file, register mostly mock stuff, mock authentication, and populate the db by directly inserting into the db).
Quoting Naved Ansari (2018-05-03 16:48:08)
Sorry, I don't understand. We don't use dry_run for switches. I think only the headnode code and ipmi code uses that. And, the client library tests work just fine with dry_run disabled.
Ah, dug into it a bit further; I saw both the presence of deferred.apply_networking() and creation of nexus/powerconnect/brocade switches and made an assumption, but we never actually do any real networking operations in those tests, except for with the mock switch -- so you're right, it's fine without dry run.
Opinions on just requiring an installed obmd for those tests to work?
On a slightly related note, I'll try to refactor some of the client library code once my other 2 PRs are in (things I have in mind:
[...]
and populate the db by directly inserting into the db).
So, as much as this feels like it would be a bit cleaner, right now the client tests are the only thing that thoroughly exercises actually going through the HTTP API. I'd love to factor stuff out and clean stuff up, but I want to be careful not to lose the value we're getting from what they exercise. Everything else you suggest sounds great.
Opinions on just requiring an installed obmd for those tests to work?
I am fine with that. Since we agree on calling the client lib tests integration tests, it makes sense to make sure we can test things like following redirects.
Btw, I was looking to see where we currently test the rest of the obm calls (even if with a mock driver) and I couldn't find it in tests/unit/api/main.py (it was only testing power_cycle, to test dry_run). Right now, only the client library is actually testing those APIs. Do you think we should have been testing those in tests/unit/api/main.py with a mock driver? And if you were to add that now with the new OBMd stuff, I think we would have to use the dry run decorator in more places.
The relevant coverage in the client tests for the obm calls is pretty good; it would have been good to have that stuff tested in api.py, but I don't think it makes sense to add stuff there, at least while everything is in flux. In terms of actual coverage, I'm pretty comfortable with what's in the client integration tests.
@naved001, I think this is ready for review; I've removed the WIP tag. The history is rather messy, and I would be amenable to spending some time cleaning it up, but the diff isn't too bad I think. Note that this includes the patch from #1009, since that is required for the tests; #1009 should be merged first (as well as #1002).
I ended up backing out the change to show_console, since due to the change in the API (streaming vs. just returning a current buffer), it's a somewhat bigger change than the others -- I thought it made sense to do it as a separate PR, so my plan is to do that next. After that, all that will be left will be to delete the old obm support.
if you rebase this, it would make it easier to review since #1002 is now merged.
@naved001, done.
The travis failures are due to the fact that https://github.com/CCI-MOC/obmd/pull/18 hasn't been merged yet; once that lands they should clear themselves up.
@naved001, your comments have been addressed.
The pattern of failures looks like it's because we're running two tests at a time, and they're fighting over the port number. It passes locally if I don't run the tests in parallel, but it would be nice to be able to do that, so I'd prefer not to disable parallelism. I wish there were a nice way of figuring out which of pytest's worker processes we're running in.
The sqlite errors could be because we are running the client tests in parallel and that is taken care of in #1008
Looking around a bit it seems there are a couple relevant environment variables. I'll tweak it to do the right thing if I can.
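For anyone hitting the same port-collision problem: pytest-xdist exposes the worker identity via the PYTEST_XDIST_WORKER environment variable (gw0, gw1, ...), which can be used to derive a distinct port per worker. A minimal sketch (BASE_PORT and unique_port are hypothetical names, not anything from HIL's source):

```python
import os

BASE_PORT = 8080  # hypothetical base port for the test web server


def unique_port():
    """Derive a per-worker port so parallel test runs don't collide.

    pytest-xdist sets PYTEST_XDIST_WORKER to 'gw0', 'gw1', ... in each
    worker process; the variable is unset when tests run serially, so
    we fall back to 'gw0'.
    """
    worker = os.environ.get('PYTEST_XDIST_WORKER', 'gw0')
    index = int(worker.lstrip('gw') or 0)
    return BASE_PORT + index
```

Each worker then binds its test server to unique_port() instead of a hard-coded port, so two concurrent workers never fight over the same one.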
By "taken care of" I mean: we only run unit tests with sqlite, and integration tests are run serially. So you may not have to worry about fixing anything here.
Edit: besides the cli integration tests.
Ok -- I think I have a solution anyway, but maybe we'll let #1008 land and then I can submit that as a further improvement.
@zenhack I just merged #1008, so we're all set there
Fixed the typos and rebased.
Some real test failures; working on fixing.
I am at a little bit of a loss re: the test failures. I've got some other work I need to focus on today, but will dig back in later in the week.
@naved, @Izhmash, travis is passing now.
Fixed the typo and merged master back in.
I am not sure why we can't start obmd. I thought I did it the right way in #1019.
@naved, see my most recent commit message; hopefully that will fix it.
@naved001 I'll let you merge this one since your +1 is a bit outdated now.
@zenhack thanks for the fix. Merging
This is built on top of the work in #1002. See the commit message for d1e7ec1.
One nagging thing that makes testing a bit annoying is that when dry_run is enabled, enable_obm is a no-op, which means calls like power_off etc. fail even if you've correctly enabled the obm. In the most recent commit I deal with this by just shoving a bogus token in the field, which I think is a reasonable strategy for the self-contained tests.

For the client tests though, because they actually spawn a whole web server, it's a bit more of a pain. Probably the easiest thing to do there is just bite the bullet and say you need obmd available to run those tests as well (the same as with tests/integration/obmd.py). For better or worse, they really are integration tests, and they have caught a few things not directly related to the client library. So while the current design isn't super clean, it has value, and maybe the thing to do is just embrace the fact that they're integration tests and go with it? I'd like to have a clear stance on the issue.