Closed: dgard1981 closed this issue 1 year ago.
It looks to me like pytest_assertion_pass() gets called with this code:

def test_two(check):
    with check:
        assert 1 == 1
    with check:
        assert 2 == 2

But NOT with this code:

def test_two(check):
    check.equal(1, 1)
    check.equal(2, 2)

That makes sense, since the helper methods such as check.equal(), check.not_equal(), etc. don't call assert. This is for speed reasons.
I recommend using the with check: usage of pytest-check if you really need pytest_assertion_pass() to work.
I'd say this "works as designed", but I know that's frustrating. If you have any ideas to make this work but not slow things down, let me know.
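For reference, a minimal sketch of what that can look like; keep in mind that pytest only calls pytest_assertion_pass if the enable_assertion_pass_hook ini option is turned on:

# pytest.ini
[pytest]
enable_assertion_pass_hook = true

# conftest.py -- minimal sketch: report every assert that passes inside "with check:"
def pytest_assertion_pass(item, lineno, orig, expl):
    print(f"PASSED: {item.nodeid}:{lineno} -> {orig}")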
> I recommend using the with check: usage of pytest-check if you really need pytest_assertion_pass() to work.
Thanks for the tip, that does seem to do the trick. May I suggest adding a note about the pytest_assertion_pass hook to your README to make that use case more obvious, just in case others try to do the same thing?
> If you have any ideas to make this work but not slow things down, let me know.
I'm definitely not a Python expert, so I don't really know if this would make the plugin too slow, but what I'd love to see is a custom hook that is called for every check (pass or fail) and gives the same information as is available through the pytest_assertion_pass hook.
If this hook were called regardless of result, it would give the plugin a USP over just using assert with pytest, because there is no out-of-the-box hook for an assertion that does not pass.
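Something along these lines is what I have in mind. To be clear, the hook name pytest_check_result and its arguments below are placeholders made up for illustration, not anything that exists in pytest-check today:

# conftest.py (or a small plugin) -- hypothetical hookspec, for illustration only
import pytest

class CheckResultSpec:
    @pytest.hookspec
    def pytest_check_result(self, item, passed, msg):
        """Would be called once per check, whether it passed or failed."""

def pytest_addhooks(pluginmanager):
    # register the hypothetical spec so conftest files / plugins could implement it
    pluginmanager.add_hookspecs(CheckResultSpec)

With the spec registered, an implementation would just be a pytest_check_result function in a conftest.py, the same way other pytest_* hooks are implemented.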
I've been having a play with this over the last couple of days, and I've managed to create my own plugin that adds a fixture called check_expanded, which returns check but with the __exit__ method overridden to call a new hook. It's hacky, and the hook would be better implemented as part of pytest-check, but if it's still not something you're interested in then I'm happy to continue using my own plugin.
pytest_check_expanded_project.zip
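The core of it looks roughly like the sketch below. This is an untested simplification rather than the exact contents of the zip, and it assumes a hookspec like the pytest_check_result placeholder above has been registered:

# conftest.py -- rough sketch of the check_expanded fixture
import pytest

@pytest.fixture
def check_expanded(check, request):
    cls = type(check)  # pytest_check's CheckContextManager
    original_exit = cls.__exit__

    def exit_and_report(self, exc_type, exc_val, exc_tb):
        outcome = original_exit(self, exc_type, exc_val, exc_tb)
        # fire the custom hook for every check, pass or fail
        request.config.hook.pytest_check_result(
            item=request.node,
            passed=exc_type is None,
            msg=str(exc_val) if exc_val is not None else "",
        )
        return outcome

    cls.__exit__ = exit_and_report  # patch for the duration of the test
    yield check                     # same object as check, so raises() etc. still work
    cls.__exit__ = original_exit    # restore afterwards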
A simple test shows that all of the functionality of pytest-check that I require still exists, but quite obviously it won't be suitable for everyone.
def test_something(check, check_expanded):
    with check_expanded:
        assert 1 == 1  # pass
    with check_expanded:
        assert 1 == 2  # fail
    with check_expanded:
        assert 2 == 2  # pass after previous fail
    with check_expanded.raises(ZeroDivisionError):
        1/0  # ignored because raises
    with check_expanded:
        assert 1 == 1  # pass after raises
    1/0  # Error
    assert 1 == 1  # assertion not reached due to previous error, as expected
The current code:
def test_example(check):
    a = 1
    b = 2
    c = [2, 4, 6]
    with check:
        assert a > b, "a < b"
    with check:
        assert d == a, "a == d"
    with check:
        assert b < a, "b < a"
    with check:
        assert b not in c, "make sure 2 isn't in the list"
has the output:
======================================================================================================= FAILURES ========================================================================================================
_____________________________________________________________________________________________________ test_example ______________________________________________________________________________________________________
FAILURE: a < b
assert 1 > 2
test_example.py:7 in test_example() -> with check:
test_example.py:8 in test_example -> assert a > b, "a < b"
AssertionError: a < b
assert 1 > 2
------------------------------------------------------------
Failed Checks: 1
------------------------------------------------------------
check = <pytest_check.context_manager.CheckContextManager object at 0x106fe2e30>

    def test_example(check):
        a = 1
        b = 2
        c = [2, 4, 6]
        with check:
            assert a > b, "a < b"
        with check:
>           assert d == a, "a == d"
E           NameError: name 'd' is not defined

tests/test_example.py:10: NameError
================================================================================================ short test summary info ================================================================================================
FAILED tests/test_example.py::test_example
=================================================================================================== 1 failed in 0.02s ===================================================================================================
It does not look like the code is working properly. I am using pytest 7.4.3 and pytest_check 2.2.2.
Any insights would be appreciated. Thanks!
After a test run there is a requirement to print a table showing the number of passed assertions (checks). At the moment I am using the pytest_assertion_pass hook to count assertions that pass, and then the pytest_terminal_summary hook to display tabular data about the passed assertions.

However, when using pytest-check, the pytest_assertion_pass hook is not fired and the tabular data displayed by the pytest_terminal_summary hook is incorrect. I'm guessing this is because pytest-check doesn't actually use assertions. Is there a hook that is fired for each check that would allow me to count passed checks?
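For context, the counting side is roughly this (a simplified sketch, not the exact code):

# conftest.py -- simplified sketch of the counting/summary setup described above
from collections import Counter

passed_assertions = Counter()

def pytest_assertion_pass(item, lineno, orig, expl):
    # only fires for real assert statements, and only when
    # enable_assertion_pass_hook = true is set in the ini file
    passed_assertions[item.nodeid] += 1

def pytest_terminal_summary(terminalreporter, exitstatus, config):
    terminalreporter.section("passed assertions")
    for nodeid, count in sorted(passed_assertions.items()):
        terminalreporter.write_line(f"{nodeid}: {count}")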