spacetelescope / jwst

Python library for science observations from the James Webb Space Telescope
https://jwst-pipeline.readthedocs.io/en/latest/

pytest.mark.filterwarnings decorator is ignored in regtests #8318

Open stscijgbot-jp opened 4 months ago

stscijgbot-jp commented 4 months ago

Issue JP-3557 was created on JIRA by Ned Molter:

Expected behavior: Adding @pytest.mark.filterwarnings("error") to any regression test that is known to raise a warning should cause that test to fail.

Actual behavior: Adding @pytest.mark.filterwarnings("error") to any regression test that is known to raise a warning has no effect.

This behavior is seen only in regression tests under the regtest/ subdirectory, which suggests that something in that directory's conftest.py is responsible; unit tests in other directories do not show it.

Warnings can still be escalated to errors in the regtests by using the pytest -W error flag or by modifying the [tool.pytest.ini_options] section of pyproject.toml.
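
As a hypothetical illustration (not code from the jwst suite) of how an "error" filter behaves once it is actually applied, whether via the mark, via pytest -W error, or via a filterwarnings entry under [tool.pytest.ini_options]:

```python
# Hypothetical standalone test, not from the jwst suite: with the mark below
# (or `pytest -W error`), warnings.warn re-raises the warning as an exception.
import warnings

import pytest


@pytest.mark.filterwarnings("error")
def test_warning_becomes_error():
    with pytest.raises(UserWarning):
        warnings.warn("escalated to an error", UserWarning)
```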

stscijgbot-jp commented 4 months ago

Comment by Brett Graham on JIRA:

Ned Molter do you recall which test(s) didn't take the filterwarnings mark?

I just tried adding:

```python
import warnings
import pytest

@pytest.mark.filterwarnings("error")
def test_warning():
    warnings.warn("Warning!", UserWarning)
```

to jwst/regtest/test_fgs_guider.py and got the expected result:

```bash
FAILED jwst/regtest/test_fgs_guider.py::test_warning - UserWarning: Warning!
```

stscijgbot-jp commented 4 months ago

Comment by Ned Molter on JIRA:

I believe I was trying this inside test_nirspec_mos_spec2.py, putting the warning I hoped to catch in one of the steps of the spec2 pipeline. Did you try putting the warnings.warn() in a separate script? The other difference I see is that the tests I used were marked with @pytest.mark.bigdata, and the warning was raised within a fixture that was passed to the test.

stscijgbot-jp commented 4 months ago

Comment by Ned Molter on JIRA:

When I have a chance, I can try to make a minimum working example, but it sounds like you might already be doing that.

stscijgbot-jp commented 4 months ago

Comment by Brett Graham on JIRA:

Thanks! A minimum example would be great. I tried adding the same test to test_nirspec_mos_spec2.py with the same result. One combination I found that did not convert the warning to an error was adding the warning to the fixture ("run_pipeline") and the filterwarnings mark to the fixture itself. That did not raise an error, but I'm not sure whether that's expected (I don't know if the mark works the same way on a fixture as on a test). However, if I add the warning to the fixture and the mark to the test, it correctly produces an error.
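
For illustration, a minimal hypothetical sketch of the working combination described above (the fixture and test names are invented, not the actual run_pipeline fixture). Marks placed directly on a fixture function are not applied by pytest, which would explain the first observation; a filterwarnings mark on the test is active while that test's fixtures run, which matches the second:

```python
# Hypothetical sketch: the warning is raised inside the fixture, and the
# filterwarnings mark sits on the test. The mark's filter is active during
# fixture setup for this test, so the UserWarning is escalated and the test
# errors out during setup.
import warnings

import pytest


@pytest.fixture
def warns_during_setup():
    warnings.warn("raised inside the fixture", UserWarning)
    return "data"


@pytest.mark.filterwarnings("error")
def test_mark_on_test(warns_during_setup):
    assert warns_during_setup == "data"
```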

stscijgbot-jp commented 4 months ago

Comment by Ned Molter on JIRA:

That is strange. Maybe it's somehow only happening to me locally. I'll look into it more.

stscijgbot-jp commented 4 months ago

Comment by Ned Molter on JIRA:

I am not able to reproduce this anymore. I made a new script inside lib that raises a RuntimeWarning from a divide-by-zero. I then wrote a regtest that calls that script inside a fixture that also uses the jail and rtdata_module fixtures. That fixture is then used by a test function marked with pytest.mark.bigdata and pytest.mark.parametrize. I think this setup is identical to most tests in our suite. Playing around with setting warning filters in various places, I've convinced myself that everything works as intended. Here are some notes:

In any case, going to close this until and unless someone else can reproduce the behavior I thought I saw.
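
For reference, a minimal hypothetical sketch of the reproduction setup described in the last comment. All names are invented, the divide-by-zero helper stands in for the new script under lib, and the jwst-specific jail and rtdata_module fixtures are omitted so the sketch stays self-contained (bigdata is assumed to be a registered marker, as it is in this suite):

```python
# Hypothetical reproduction sketch, not the actual files from the suite.
import numpy as np
import pytest


def ratio(numerator, denominator):
    # numpy emits a RuntimeWarning (not an exception) on float divide-by-zero.
    return np.asarray(numerator, dtype=float) / np.asarray(denominator, dtype=float)


@pytest.fixture(scope="module")
def ratios():
    # In the real regtest this fixture would also use jail and rtdata_module.
    return ratio([1.0, 2.0], [2.0, 0.0])


@pytest.mark.bigdata
@pytest.mark.parametrize("index", [0, 1])
@pytest.mark.filterwarnings("error")
def test_ratio(ratios, index):
    # With the filterwarnings mark present, the RuntimeWarning raised during
    # fixture setup is escalated and the test errors out; removing the mark
    # lets the assertion below run normally.
    assert np.isfinite(ratios[index]) or np.isinf(ratios[index])
```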