I believe I am seeing the same errors as well. As far as the plugin tests go -- they are up to date. I have landed a couple of commits to rewrite the tests and the basic plugins. However, I find that running multiple plugin tests together causes failures, whereas running an individual test does not.
Can you confirm that if you run ``nosetests test_alive.py``
and then ``nosetests test_hsts.py``
the tests pass? You don't need to have the minion services running when you run the plugin tests, JFYI.
For the rest of the tests (mostly tests of the backend's views), I am updating them. See https://github.com/mozilla/minion-backend/issues/184 (partly because I can no longer comprehend half of the tests I wrote back then, and when I update the tests I do it iteratively... so hopefully I won't accidentally remove "failures").
I think I can land the patch for #184 this weekend. The first issue will be fixed as well. Thanks.
I have edited the original post to move the error output into a pastebin, because its length made navigation a bit hard.
(env) ~/minion-backend/tests/functional/plugins [master|✚ 1]
14:13 $ nosetests test_alive.py
..
----------------------------------------------------------------------
Ran 2 tests in 3.406s
OK
(env) ~/minion-backend/tests/functional/plugins [master|✚ 1]
14:13 $ nosetests test_hsts.py
..
----------------------------------------------------------------------
Ran 2 tests in 6.892s
OK
This is one of the latest builds. In my CI script I commented out the plugin tests, so only functional/views/*.py ran.
https://drone.io/github.com/mozilla/minion-backend/132
One failure, but it looks innocent. It probably has to do with how the test is handled on the CI service (since I can run and pass all tests on my instance).
I will take a look at the plugin tests tomorrow.
Still getting a lot of failures, though I didn't modify my tests to run only the functional tests - will take a look at it later. I would like to re-write some of the logic in the header checks, which will mean fixing the tests too - that is why I am interested in getting these running. :)
Ha, I can see that my results still aren't the same as yours, though:
Mine:
Ran 113 tests in 25.510s
FAILED (errors=77, failures=36)
drone.io:
Ran 77 tests in 8.241s
FAILED (failures=1)
Nice work yeukhon!
Okay. I got time tonight to figure this out.
The regression happens because in the old revision each plugin test had a different endpoint. Now that they (almost all of the tests) share the same Flask object and use the same endpoint ``/tests``, when nosetests scans for all the test_app objects, only one of the endpoints is chosen. Sometimes, if we are lucky, some get updated on tearDown, but in general only one test file's actual endpoint is in memory.
The solution is to pull the Flask-related code out into a separate Python file per test. They are now organized under the servers/ directory.
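To make the fix concrete, here is a minimal sketch of a per-test server module under servers/ (hypothetical module and function names, not the actual minion-backend code): each plugin test gets its own Flask object and registers its own endpoint, so nothing collides when nosetests imports every test module into the same process.

```python
# servers/alive.py -- hypothetical per-test server module (a sketch of
# the approach, not the actual minion-backend layout). Each plugin test
# file gets a module like this with its OWN Flask object, so importing
# several test modules in one nosetests run no longer fights over a
# single shared endpoint.
from flask import Flask

test_app = Flask(__name__)

@test_app.route('/test')
def endpoint():
    # Response body tailored to whatever the corresponding plugin
    # test (e.g. test_alive.py) expects to scan.
    return 'hello'

if __name__ == '__main__':
    # Run standalone so a test can launch it as a subprocess.
    test_app.run(host='localhost', port=1234)
```

A test file would then start its own server (for example as a subprocess in setUpClass) and shut it down in tearDownClass, so no two test files ever share a Flask object or an endpoint.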
Here is a build testing only the plugins: https://drone.io/github.com/mozilla/minion-backend/135
Green.
I will look into the failures in the views tests (the most annoying part), lol.
Thanks.
Hey guys,
Getting a lot of failures on nosetests here. Am I doing something wrong? Are some of the tests outdated?
Test output: http://paste.ubuntu.com/6842758/ (edit by yeukhon)