Closed: Peque closed this 6 years ago
@The-Compiler @nicoddemus Should I ping somebody for review, or just wait? :innocent:
I'd expect @davehunt to review this once he has the time.
You might want to add a `config.addinivalue_line('markers', 'repeat: Repeat this test a number of times')` or so to `pytest_configure`, so the marker is recognized when running pytest with `--strict`.
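For context, a minimal sketch of what that hook could look like in the plugin's module (the marker description line is the one suggested above; the hook placement is the standard pytest plugin convention):

```python
def pytest_configure(config):
    # Register the marker so `pytest --strict` does not reject it.
    config.addinivalue_line(
        'markers', 'repeat: Repeat this test a number of times')
```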
@davehunt @The-Compiler Pushed a fixup with that change (and test): 07844220841a379e0f3ccebea196594bf5bc21bb
Will squash before merging if you think it is okay now.
@davehunt I think it makes more sense if the command line option does not take precedence over the marker (i.e.: I would consider the marker to be more specific and constrained to a single test, while the command line option could be more general and used for the whole test suite). However, perhaps it would make sense to multiply them (i.e.: if you mark a test with `.repeat(3)` and then pass `--count 2`, it could run that test function 6 times).
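To make the two behaviours concrete, here is a hypothetical sketch of a `pytest_generate_tests` hook where the marker takes precedence over `--count`, with the "multiply" variant shown as a comment. The `--count` option destination, the fixture name, and the pre-3.6 attribute-style marker lookup are illustrative assumptions, not the plugin's actual code:

```python
def pytest_generate_tests(metafunc):
    # Assumes --count is stored as config.option.count (illustrative).
    count = metafunc.config.option.count
    # Pre-3.6 style lookup: @pytest.mark.repeat(n) stores a MarkInfo on the function.
    marker = getattr(metafunc.function, "repeat", None)
    if marker is not None:
        count = int(marker.args[0])  # marker takes precedence over --count
        # count = int(marker.args[0]) * metafunc.config.option.count  # "multiply" variant
    if count > 1:
        # Hypothetical fixture name used only to parametrize the repetitions.
        metafunc.fixturenames.append("__pytest_repeat_step_number")
        metafunc.parametrize("__pytest_repeat_step_number", range(count))
```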
@Peque on modern pytest you can do `marker = metafunc.definition.get_closest_marker(name="parametrize")`; on pytest < 3.6 you are unfortunately out of luck and need to use the current code.
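Adapting that hint to this plugin's own marker, a version-tolerant lookup could look like the sketch below; the `hasattr` check as a stand-in for "pytest >= 3.6" and the legacy attribute-style fallback are assumptions for illustration, not code from the pull request:

```python
def get_repeat_marker(metafunc):
    # On pytest >= 3.6 the collected definition exposes get_closest_marker().
    if hasattr(metafunc, "definition"):
        return metafunc.definition.get_closest_marker(name="repeat")
    # On older pytest, fall back to the attribute set by @pytest.mark.repeat(...).
    return getattr(metafunc.function, "repeat", None)
```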
> I think it makes more sense if the command line option does not take precedence over the marker (i.e.: I would consider the marker to be more specific and constrained to a single test, while the command line option could be more general and used for the whole test suite). However, perhaps it would make sense to multiply them (i.e.: if you mark `.repeat(3)` and then `--count 2`, it could run that test function 6 times).
An example of a similar plugin is pytest-rerunfailures, where there is a marker indicating that a test can fail up to a maximum number of times. In that case, you can set a blanket rerun max from the command line that will affect all tests. As it's on the command line and explicit, it takes precedence over any markers. I don't mind if you decide to make the marker the priority; I just think it may surprise some users. I would definitely vote against multiplying the two though, as this will almost certainly surprise users.
@davehunt I do not have a strong opinion on that, so tell me if you want me to change it and I will. If it is okay as-is, I will squash and push again. :wink:
@RonnyPfannschmidt Thanks for the hint. The current test suite still tests against pytest 2, though. As this is not my project I would rather not change that. :innocent:
> I do not have a strong opinion on that, so tell me if you want me to change it and I will. If it is okay as-is, I will squash and push again.
No strong opinion here either, so let's leave it as it is. Thanks!
@davehunt Squashed the fixup and pushed the changes. Ready for merge. :blush:
Thanks to you too for reviewing!
Thanks @Peque I'll merge this and prepare a release. If you're interested in contributing more, I think we can drop pytest 2.x from the `tox.ini` and `.travis.yml` and just run against the latest pytest release.
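For illustration only, a trimmed-down `tox.ini` after dropping the pytest 2.x environments might look something like the following; the environment list and commands are assumptions and not taken from the actual project files:

```ini
# Hypothetical tox.ini sketch: only environments running the latest pytest release.
[tox]
envlist = py27, py34, py35, py36

[testenv]
deps = pytest
commands = pytest {posargs}
```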
Don't forget to update the versions in the README:
Python 2.6, 2.7, 3.2, 3.3, 3.4, 3.5 or PyPy
Thanks @hugovk would you be interested in submitting a pull request for that?
Done, please see PR https://github.com/pytest-dev/pytest-repeat/pull/26.
Fixes https://github.com/pytest-dev/pytest-repeat/issues/16.
Further changes (to make CI happy):
- Removed the `pytest-xdist` dependency from `tox.ini`, since it was not being used and it requires `pytest>=3` (which conflicts with the current `pytest<3` tests).
- Python 3.7 has not been added because it is currently not available in Travis.