@kain88-de any idea why the tests died? – It appears that we fail to properly start distributed.Client()
https://travis-ci.org/MDAnalysis/pmda/jobs/359726635#L2039
Minor thing: how to tell pytest to only gather the test_*.py files and not also the normal code? https://travis-ci.org/MDAnalysis/pmda/jobs/359726635#L1994 – or is it doing something else?
Minor thing: How to tell pytest to only gather the test_*.py files
What you are seeing is the PEP8 check. It only checks the formatting of the files. It's best to enforce good code style from the beginning.
I don't know where the timeout came from. Do the tests pass locally?
The tests pass locally.
Will check PEP8, thanks.
The line https://travis-ci.org/MDAnalysis/pmda/jobs/359726637#L2079
self = LocalCluster('tcp://127.0.0.1:44955', workers=0, ncores=0)
looks suspicious to me: does 0 workers and 0 cores mean "take all of them"? Is there a way to just get two workers with 1 process each with the simple setup, or do we need to explicitly start the scheduler and workers outside of the tests?
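For reference, a minimal sketch of how two single-threaded workers could be requested explicitly with dask.distributed; this is illustrative only and not the fixture actually used in the pmda tests:

```python
from distributed import LocalCluster, Client

# Sketch: ask for exactly two workers with one thread each instead of
# relying on LocalCluster's defaults.
cluster = LocalCluster(n_workers=2, threads_per_worker=1, processes=True)
client = Client(cluster)
print(client)      # repr should report 2 workers
client.close()
cluster.close()
```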
Also, the PEP8 (?) tests say
pmda/rms.py FAILED
but I don't see any indication of what is wrong. The pylint job looks fine.
Also, the PEP8 (?) tests say
pmda/rms.py FAILED
It's in the last line of the log. Pylint has nothing to do with the PEP8 check.
The failure happens because of a dask update. If I rerun a master branch build that previously succeeded, I get the same error you have. Unfortunately I don't have time to look for a proper fix.
Thanks for all the comments!
Need to fix #30 for the tests to make sense.
Apparently it does not work to decorate fixtures with xfail, so I added the decorator to the tests and amended the history.
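For illustration, a minimal sketch of what marking a test (rather than a fixture) as an expected failure looks like; the test name and reason string are placeholders, not the exact pmda code:

```python
import pytest

# Sketch: pytest.mark.xfail goes on the test function, not on a fixture it uses.
@pytest.mark.xfail(reason="single-frame trajectory slicing broken upstream, "
                          "see MDAnalysis/mdanalysis#1819")
def test_rmsd_single_frame():
    assert False   # stands in for the analysis call that currently fails
```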
I don't understand why travis still fails, even with the xfails in place.
When I try the test locally in a conda env with MDA 0.17.0 I get:
(pmda) yngvi:pmda oliver$ pytest -v pmda/test/test_rms.py
============================= test session starts ==============================
platform darwin -- Python 3.6.5, pytest-3.5.0, py-1.5.3, pluggy-0.6.0 -- /Users/oliver/anaconda3/envs/pmda/bin/python
cachedir: .pytest_cache
rootdir: /Users/oliver/MDAnalysis/pmda, inifile:
plugins: xdist-1.22.2, forked-0.2
collected 3 items
pmda/test/test_rms.py::TestRMSD::test_rmsd PASSED [ 33%]
pmda/test/test_rms.py::TestRMSD::test_rmsd_step xfail [ 66%]
pmda/test/test_rms.py::TestRMSD::test_rmsd_single_frame xfail [100%]
===================== 1 passed, 2 xfailed in 1.66 seconds ======================
so the xfail works locally.
I am restarting the build ... just because I have no idea what to do.
@kain88-de any idea why the xfail decorator is not working on travis? https://travis-ci.org/MDAnalysis/pmda/jobs/364000696#L2064
Did you compare pytest versions?
Same versions:
We install a development build of MDAnalysis that identifies itself correctly but doesn't have the DCD fix applied yet. So Travis has version 0.18.0, which doesn't fail.
I think I understand the problem: Travis uses the MDAnalysis 0.17.1-dev build from @kain88-de's channel, and I don't think that those builds already have the fix for MDAnalysis/mdanalysis#1819 (they were last updated 2 months ago). BUT: these builds are versioned 0.17.1-dev, so the xfail condition considers them new enough and treats the tests as ones that should pass.
My local tests passed with MDAnalysis 0.17.0 and the xfail behaved properly. Should I switch the pmda tests to using the standard MDA distribution, or is there another reason that we are using the develop builds?
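A sketch of that pitfall; the threshold and the comparison below are illustrative assumptions, not necessarily what the pmda xfail condition actually uses:

```python
from distutils.version import LooseVersion

installed = "0.17.1-dev"   # what the dev conda build reports (per this thread)
required = "0.17.1"        # hypothetical "first version that contains the DCD fix"

# The dev build compares as new enough, so a version-gated xfail would not be
# applied even though the fix is missing from that build.
print(LooseVersion(installed) >= LooseVersion(required))   # True
```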
I came to the same conclusion but took 3 minutes longer to post...
Merging #28 into master will increase coverage by 4.9%. The diff coverage is 100%.

@@            Coverage Diff            @@
##           master      #28     +/-   ##
=========================================
+ Coverage   93.37%   98.27%     +4.9%
=========================================
  Files           7        8        +1
  Lines         317      348       +31
=========================================
+ Hits          296      342       +46
+ Misses         21        6       -15

Impacted Files | Coverage Δ |
---|---|
pmda/test/test_rms.py | 100% <100%> (ø) |
pmda/rms.py | 100% <100%> (+100%) :arrow_up: |
pmda/test/test_contacts.py | 97.67% <100%> (-1.17%) :arrow_down: |

Continue to review the full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 69e84d2...78aa84a.
Tests pass now. I also merged, now that we test against the stable version of MDAnalysis.
Fixes #25