smarie opened this issue 5 years ago
I am definitely on board with making this a constraints solver; I believe that for a sound implementation we need to destroy the test protocol hooks as currently designed (yay).
While we are at it, there should be a layer woven in to support communicating setup/teardown dependencies to xdist, as its current scheduling mechanisms leave much to be desired.
Introduction
To quote https://github.com/pytest-dev/pytest/issues/5054#issuecomment-488304538
Currently, as you all know, pytest post-processes the order of test items in the pytest_collection_modifyitems hook. The purpose is to reach some kind of "optimality". There are currently many open tickets in pytest-dev about these ordering issues. My personal feeling is that we will not solve each of these problems separately, and that there is a need for a single place where we can discuss what "optimal" means, and what is the direction pytest will take on that topic.
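For readers less familiar with this hook, here is a minimal sketch of how a conftest.py or plugin can post-process the collected items. This is not pytest's built-in reordering logic; grouping by test file is just an illustrative choice of "optimality" criterion.

```python
# Minimal sketch of the hook mentioned above (NOT pytest's actual algorithm).
def pytest_collection_modifyitems(config, items):
    # `items` is the list of collected test items; reordering it in place
    # changes the execution order. Python's sort is stable, so the relative
    # order within each file is preserved.
    items.sort(key=lambda item: item.nodeid.split("::", 1)[0])
```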
Current Status

1- What is "optimal" ?
a) Current implementation
Even if PR #3551 makes its way to solving the above issues (thanks @ceridwen!), a few other issues go beyond the current definition of "optimal":
b) Additional need 1: "priorities"
The first issue with the current approach is that inside a given scope, the ordering may be counterintuitive, especially when there are multiple "best" orders. Some comments in related tickets (https://github.com/pytest-dev/pytest/issues/2846#issuecomment-339603786, https://github.com/pytest-dev/pytest/issues/2846#issuecomment-380653238) disagree with the resulting order.
A new ticket #3393 has been opened and led to a request for an updated definition of "optimal": adding a "priority" argument. @Sup3rGeo proposed a plugin to handle this: pytest-param-priority.

My personal feeling: "priority" is a very technical term that most users will not understand properly, whereas a notion of "setup/teardown cost", which users can express in seconds or in any other unit of their choice, could be easier to document and understand.
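To make the "cost" idea concrete, here is a hypothetical sketch. None of this is an existing pytest or pytest-param-priority API; the FIXTURE_COST table and fixture names are made up.

```python
# Hypothetical sketch of the "setup/teardown cost" idea discussed above.
# FIXTURE_COST is an assumed, user-provided table (in seconds); nothing like
# it exists in pytest today.
FIXTURE_COST = {
    "database": 30.0,   # expensive to set up / tear down
    "tmp_path": 0.1,    # cheap
}

def _cost_key(item):
    """Key sorting tests by the most expensive fixture they request."""
    names = getattr(item, "fixturenames", [])
    costs = [(FIXTURE_COST.get(name, 0.0), name) for name in names]
    if not costs:
        return (0.0, "")
    cost, name = max(costs)
    # Negative cost so the most expensive groups come first; the fixture name
    # keeps tests sharing that fixture adjacent to each other.
    return (-cost, name)

def pytest_collection_modifyitems(config, items):
    # Stable sort: tests that share an expensive fixture become adjacent,
    # so its setup/teardown is ideally paid only once.
    items.sort(key=_cost_key)
```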
c) Additional need 2: "constraints"
#4892 raises the question of "shared resources" between fixtures. Part of the OP's need can be solved by giving the fixtures with the highest cost a "high priority", but the notion of a "shared resource" is still an additional need: two fixtures may have an "interlock" between their setup/teardown (one cannot be set up while the other is set up).
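As a sketch only (the MUTUALLY_EXCLUSIVE table and fixture names below are invented, not an existing API), such an "interlock" could at least be declared and checked at collection time; actually enforcing it would require the kind of solver discussed here.

```python
# Hypothetical sketch of declaring the "interlock" from #4892. This only
# detects the impossible case of a single test requesting both fixtures;
# a real constraint solver would also have to order tests and force the
# teardown of one fixture before the setup of the other.
import pytest

MUTUALLY_EXCLUSIVE = [("resource_a_lock", "resource_b_lock")]  # assumed names

def pytest_collection_modifyitems(config, items):
    for item in items:
        requested = set(getattr(item, "fixturenames", []))
        for a, b in MUTUALLY_EXCLUSIVE:
            if a in requested and b in requested:
                item.add_marker(
                    pytest.mark.skip(
                        reason=f"fixtures {a!r} and {b!r} cannot be set up together"
                    )
                )
```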
2- Other desirable features
a) Explicit ordering
pytest_reorder proposes an additional command-line option to reorder tests based on their node ids, or based on a custom regex matching order. This allows users to customize the order pretty much the way they wish. pytest-ordering proposes to reorder tests based on marks; I am not sure whether this applies to fixtures as well.
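For illustration, a regex-based explicit order along those lines could look like this in a conftest.py. This is a sketch in the spirit of pytest_reorder, not its actual implementation, and the patterns are made up.

```python
# Sketch of explicit, pattern-based ordering (not pytest_reorder's real code).
# Tests whose node id matches an earlier pattern run first; unmatched tests
# keep their relative order at the end.
import re

ORDER_PATTERNS = [r"unit", r"integration"]   # assumed user-chosen order

def _rank(item):
    for rank, pattern in enumerate(ORDER_PATTERNS):
        if re.search(pattern, item.nodeid):
            return rank
    return len(ORDER_PATTERNS)

def pytest_collection_modifyitems(config, items):
    items.sort(key=_rank)   # stable sort keeps collection order within groups
```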
b) Disabling order optimization

As this topic grows, it seems more and more appropriate to be able to disable any kind of order optimization, just to be able to understand where an order comes from. I suggested in #5054, and implemented in pytest-cases, a command-line switch to skip all reordering done by pytest and plugins.
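A rough sketch of such a switch (the option name below is invented; pytest-cases' actual flag may differ) could wrap the hook and restore the original collection order afterwards:

```python
# Sketch of a "skip all reordering" switch; the option name is made up.
import pytest

def pytest_addoption(parser):
    parser.addoption(
        "--keep-collection-order",
        action="store_true",
        help="run tests in plain collection order, discarding any reordering",
    )

@pytest.hookimpl(hookwrapper=True)
def pytest_collection_modifyitems(config, items):
    original = list(items)   # snapshot before pytest/plugins reorder the list
    yield                    # let all other hook implementations run
    if config.getoption("--keep-collection-order"):
        items[:] = original  # discard whatever reordering they did
```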
c) Readability / maintainability

To quote https://github.com/pytest-dev/pytest/issues/3161#issuecomment-372797418
This raises the point about readability/maintainability of the chosen algorithm, whatever it is.
d) Support for parallelism
pytest-xdist allows users to parallelize tests. I expect that the "optimal" scheduling will therefore have to be completely modified in the presence of parallelism.
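As a concrete (if partial) data point on the parallel case: with a reasonably recent pytest-xdist, tests can already be pinned to the same worker with the xdist_group mark and --dist loadgroup, which is one manual way to keep tests sharing an expensive fixture together. The fixture and group names below are just examples.

```python
# Example of today's manual workaround with pytest-xdist (requires a recent
# pytest-xdist); run with: pytest -n 4 --dist loadgroup
import pytest

@pytest.fixture(scope="module")
def expensive_server():
    ...   # costly setup/teardown we want to pay at most once per worker

@pytest.mark.xdist_group(name="expensive-server")
def test_ping(expensive_server):
    ...

@pytest.mark.xdist_group(name="expensive-server")
def test_query(expensive_server):
    ...
```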
Now what ?

From here, the debate is open: should all of this be part of pytest, or just some of it? If not, where is the best place to work on this topic?

Your ideas?