joodicator closed this issue 7 years ago
Alternatively we can drop the testing environments that aren't Py27, Py36 (possibly move over to Py37?) and PyPy to reduce the amount of requests we make.
That sounds reasonable. However, given that Python 3.7 is still in development, I am not sure if it's a good idea to use it in lieu of 3.6. There are currently 5 build jobs on Travis that involve running code (`2.7`, `3.3`, `3.4`, `pypy`, `cover`). Replacing `3.3` and `3.4` with 3.6 and 3.7 would leave the number unchanged, while replacing them with just 3.6 may improve things.
However, I notice that `cover` appears to run exactly the same tests as `2.7`, but with the addition of coverage checking. Would it be possible to combine these into a single job to save resources?
Alternatively: `test_authentication` could be disabled for all but 2.7 and/or `cover`, 3.6 and `pypy`, while as many other versions as desired could be tested for the remaining code.
I don't have any strong opinions on exactly what set of versions should be tested or supported.
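The selective disabling described above could be sketched with `unittest`'s skip decorators (the class name, version set, and `RUN_COVERAGE` environment flag below are hypothetical illustrations, not pyCraft's actual code):

```python
import os
import sys
import unittest

# Assumed policy: run the live authentication test only on 2.7 and 3.6,
# or when the (hypothetical) coverage job sets RUN_COVERAGE=1.
RUN_AUTH_TEST = (
    sys.version_info[:2] in ((2, 7), (3, 6))
    or os.environ.get("RUN_COVERAGE") == "1"
)

@unittest.skipUnless(RUN_AUTH_TEST, "authentication test disabled on this version")
class AuthenticationTest(unittest.TestCase):
    def test_authentication(self):
        # The real test would contact the authentication servers here;
        # a placeholder assertion stands in for it.
        self.assertTrue(True)
```

All other test cases would remain undecorated, so they run on every version in the matrix.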
Whoops, sorry, I meant 3.6.1, but yeah, I think disabling it for everything but cover would be a good solution too since changes in versions won't really affect the authentication test too much.
I have (hopefully) fixed this by implementing a mixture of the above ideas, in commit 3f4571d.
I omitted Python 3.6 from the Travis build for now, because Python 3.6 is not preinstalled on Travis images at the moment, so it has to be downloaded, which slows down the build (it resulted in a network timeout in one instance). It can be added once Travis gives it first-class support, which I think will be soon.
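For illustration, the reduced matrix might look something like the following `.travis.yml` fragment (the exact contents of commit 3f4571d may differ; this is an assumed sketch):

```yaml
language: python
python:
  - "2.7"
  - "pypy"
  # "3.6" can be re-added here once Travis images preinstall it,
  # avoiding the slow download that caused a network timeout.
script:
  - python setup.py test
```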
See for example this job: https://travis-ci.org/ammaraskar/pyCraft/jobs/217173485, with relevant output below:
These timeouts seem to be due to high network load when multiple tests are run in parallel, and happen infrequently and nondeterministically.
One solution might be to increase the timeout, but I'm not sure if this is satisfactory, as this is only a problem in the testing environment. An alternative solution could be to reduce the number of Travis tests that are run in parallel.
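To make the timeout idea concrete, a test could wrap the flaky network call in a retry helper that grows the timeout on each attempt. The helper below is purely hypothetical and not part of pyCraft; it only assumes the wrapped call accepts a `timeout` argument:

```python
def retry_with_timeout(func, attempts=3, base_timeout=5.0, backoff=2.0):
    """Call func(timeout=...) up to `attempts` times, multiplying the
    timeout by `backoff` after each failure. Re-raises the last error
    if every attempt fails."""
    timeout = base_timeout
    last_error = None
    for _ in range(attempts):
        try:
            return func(timeout=timeout)
        except (IOError, OSError) as error:  # e.g. socket.timeout
            last_error = error
            timeout *= backoff
    raise last_error

# Simulated flaky network call: fails twice, then succeeds.
calls = []
def flaky(timeout):
    calls.append(timeout)
    if len(calls) < 3:
        raise IOError("network timeout")
    return "ok"
```

This only papers over the load problem, though; capping the number of parallel Travis jobs addresses the cause more directly.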