keflavich closed this pull request 2 years ago.
Fixes #785
I can't reproduce these failures locally with a plain pytest run, but I can with tox...
(python3.9) alien ~/repos/spectral-cube issue785$ pytest --pdb -x spectral_cube/tests/test_projection.py
============================================================================================ test session starts =============================================================================================
platform linux -- Python 3.9.6, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
Running tests with Astropy version 5.0.dev504+ge4405974c.
Running tests in spectral_cube/tests/test_projection.py.
Date: 2022-01-08T21:06:26
Platform: Linux-4.4.0-18362-Microsoft-x86_64-with-glibc2.27
Executable: /home/adam/anaconda3/envs/python3.9/bin/python
Full Python Version:
3.9.6 | packaged by conda-forge | (default, Jul 11 2021, 03:39:48)
[GCC 9.3.0]
encodings: sys: utf-8, locale: UTF-8, filesystem: utf-8
byteorder: little
float info: dig: 15, mant_dig: 15
Package versions:
Numpy: 1.21.1
Scipy: 1.7.0
Matplotlib: 3.4.3
h5py: 3.3.0
Pandas: 1.3.1
Astropy: 5.0.dev504+ge4405974c
regions: 0.6.dev60+g3d469f7
APLpy: 2.0.3
Using Astropy options: remote_data: none.
rootdir: /home/adam/repos/spectral-cube, configfile: setup.cfg
plugins: anyio-3.3.0, asdf-2.8.1, hypothesis-6.14.5, arraydiff-0.3, astropy-header-0.1.2, cov-2.12.1, doctestplus-0.10.1, filter-subpackage-0.1.1, forked-1.3.0, mock-3.6.1, openfiles-0.5.0, remotedata-0.3.2, xdist-2.4.0
collected 127 items
spectral_cube/tests/test_projection.py .......................x..............xxx.......x.............................................................................. [100%]
====================================================================================== 122 passed, 5 xfailed in 10.87s =======================================================================================
vs tox:
(python3.9) alien ~/repos/spectral-cube issue785$ tox -e test -- -x
test inst-nodeps: /home/adam/repos/spectral-cube/.tox/.tmp/package/1/spectral-cube-0.6.1.dev134+g5477283.tar.gz
test installed: astropy==5.0,attrs==21.4.0,casa-formats-io==0.1,cloudpickle==2.0.0,coverage==6.2,dask==2021.12.0,fsspec==2021.11.1,hypothesis==6.35.0,iniconfig==1.1.1,joblib==1.1.0,locket==0.2.1,mock==4.0.3,numpy==1.22.0,packaging==21.3,partd==1.2.0,pluggy==1.0.0,psutil==5.9.0,py==1.11.0,pyerfa==2.0.0.1,pyparsing==3.0.6,pytest==6.2.5,pytest-arraydiff==0.4.0,pytest-astropy==0.9.0,pytest-astropy-header==0.2.0,pytest-cov==3.0.0,pytest-doctestplus==0.11.2,pytest-filter-subpackage==0.1.1,pytest-mock==3.6.1,pytest-openfiles==0.5.0,pytest-remotedata==0.3.3,PyYAML==6.0,radio-beam==0.3.3,scipy==1.7.3,six==1.16.0,sortedcontainers==2.4.0,spectral-cube @ file:///home/adam/repos/spectral-cube/.tox/.tmp/package/1/spectral-cube-0.6.1.dev134%2Bg5477283.tar.gz,toml==0.10.2,tomli==2.0.0,toolz==0.11.2
test run-test-pre: PYTHONHASHSEED='2629565865'
test run-test: commands[0] | pip freeze
astropy==5.0
attrs==21.4.0
casa-formats-io==0.1
cloudpickle==2.0.0
coverage==6.2
dask==2021.12.0
fsspec==2021.11.1
hypothesis==6.35.0
iniconfig==1.1.1
joblib==1.1.0
locket==0.2.1
mock==4.0.3
numpy==1.22.0
packaging==21.3
partd==1.2.0
pluggy==1.0.0
psutil==5.9.0
py==1.11.0
pyerfa==2.0.0.1
pyparsing==3.0.6
pytest==6.2.5
pytest-arraydiff==0.4.0
pytest-astropy==0.9.0
pytest-astropy-header==0.2.0
pytest-cov==3.0.0
pytest-doctestplus==0.11.2
pytest-filter-subpackage==0.1.1
pytest-mock==3.6.1
pytest-openfiles==0.5.0
pytest-remotedata==0.3.3
PyYAML==6.0
radio-beam==0.3.3
scipy==1.7.3
six==1.16.0
sortedcontainers==2.4.0
spectral-cube @ file:///home/adam/repos/spectral-cube/.tox/.tmp/package/1/spectral-cube-0.6.1.dev134%2Bg5477283.tar.gz
toml==0.10.2
tomli==2.0.0
toolz==0.11.2
test run-test: commands[1] | pytest --open-files --pyargs spectral_cube /home/adam/repos/spectral-cube/docs -x
============================================================================================ test session starts =============================================================================================
platform linux -- Python 3.9.6, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
cachedir: .tox/test/.pytest_cache
rootdir: /home/adam/repos/spectral-cube, configfile: setup.cfg
plugins: hypothesis-6.35.0, arraydiff-0.4.0, astropy-header-0.2.0, cov-3.0.0, doctestplus-0.11.2, filter-subpackage-0.1.1, mock-3.6.1, openfiles-0.5.0, remotedata-0.3.3
collected 1687 items
../../spectral_cube/spectral_axis.py . [ 0%]
../../spectral_cube/tests/test_analysis_functions.py ................... [ 1%]
../../spectral_cube/tests/test_casafuncs.py ......ssssssssssssss [ 2%]
../../spectral_cube/tests/test_cube_utils.py ....... [ 2%]
../../spectral_cube/tests/test_dask.py .s..s.......s [ 3%]
../../spectral_cube/tests/test_io.py ...................... [ 4%]
../../spectral_cube/tests/test_masks.py .....................................................................................XX.... [ 10%]
../../spectral_cube/tests/test_moments.py ............................................................................................................................................................ [ 19%]
................................ [ 21%]
../../spectral_cube/tests/test_performance.py ...s.. [ 21%]
../../spectral_cube/tests/test_projection.py .......................x..............xxx.......x...................F
================================================================================================== FAILURES ==================================================================================================
___________________________________________________________________________________ test_1d_slice_reductions[False-cumsum] ___________________________________________________________________________________
method = 'cumsum', data_255_delta = PosixPath('/tmp/pytest-of-adam/pytest-120/test_1d_slice_reductions_False5/255_delta.fits'), use_dask = False
@pytest.mark.parametrize('method',
('min', 'max', 'std', 'mean', 'sum', 'cumsum',
'nansum', 'ptp', 'var'),
)
def test_1d_slice_reductions(method, data_255_delta, use_dask):
cube, data = cube_and_raw(data_255_delta, use_dask=use_dask)
sp = cube[:,0,0]
if hasattr(cube, method):
assert getattr(sp, method)() == getattr(cube, method)(axis=0)[0,0]
else:
> getattr(sp, method)()
../../spectral_cube/tests/test_projection.py:713:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <Quantity [0., 0.] K>, function = <ufunc 'add'>, method = 'accumulate', inputs = (<Quantity [0., 0.] K>,), kwargs = {'axis': 0}, converters = [None], unit = Unit("K"), out = None
arrays = [array([0., 0.])], input_ = array([0., 0.]), converter = None
def __array_ufunc__(self, function, method, *inputs, **kwargs):
"""Wrap numpy ufuncs, taking care of units.
Parameters
----------
function : callable
ufunc to wrap.
method : str
Ufunc method: ``__call__``, ``at``, ``reduce``, etc.
inputs : tuple
Input arrays.
kwargs : keyword arguments
As passed on, with ``out`` containing possible quantity output.
Returns
-------
result : `~astropy.units.Quantity`
Results of the ufunc, with the unit set properly.
"""
# Determine required conversion functions -- to bring the unit of the
# input to that expected (e.g., radian for np.sin), or to get
# consistent units between two inputs (e.g., in np.add) --
# and the unit of the result (or tuple of units for nout > 1).
converters, unit = converters_and_unit(function, method, *inputs)
out = kwargs.get('out', None)
# Avoid loop back by turning any Quantity output into array views.
if out is not None:
# If pre-allocated output is used, check it is suitable.
# This also returns array view, to ensure we don't loop back.
if function.nout == 1:
out = out[0]
out_array = check_output(out, unit, inputs, function=function)
# Ensure output argument remains a tuple.
kwargs['out'] = (out_array,) if function.nout == 1 else out_array
# Same for inputs, but here also convert if necessary.
arrays = []
for input_, converter in zip(inputs, converters):
input_ = getattr(input_, 'value', input_)
arrays.append(converter(input_) if converter else input_)
# Call our superclass's __array_ufunc__
> result = super().__array_ufunc__(function, method, *arrays, **kwargs)
E TypeError: the resolved dtypes are not compatible with add.accumulate. Resolved (dtype('float64'), dtype('float64'), dtype('float64'))
../../.tox/test/lib/python3.9/site-packages/astropy/units/quantity.py:614: TypeError
========================================================================================== short test summary info ===========================================================================================
FAILED ../../spectral_cube/tests/test_projection.py::test_1d_slice_reductions[False-cumsum] - TypeError: the resolved dtypes are not compatible with add.accumulate. Resolved (dtype('float64'), dtype('flo...
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! stopping after 1 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
====================================================================== 1 failed, 410 passed, 18 skipped, 5 xfailed, 2 xpassed in 41.55s ======================================================================
ERROR: InvocationError for command /home/adam/repos/spectral-cube/.tox/test/bin/pytest --open-files --pyargs spectral_cube /home/adam/repos/spectral-cube/docs -x (exited with code 1)
__________________________________________________________________________________________________ summary ___________________________________________________________________________________________________
ERROR: test: commands failed
I'm very annoyed at our CI now. The errors it reports are incomprehensible: an InvocationError by itself doesn't tell you anything about what actually failed.
https://github.com/radio-astro-tools/spectral-cube/runs/4750787277?check_suite_focus=true#step:5:187
OK this is a problem introduced by numpy 1.22 that was not present in numpy 1.21.1.
E TypeError: the resolved dtypes are not compatible with add.accumulate. Resolved (dtype('float64'), dtype('float64'), dtype('float64'))
I think this is a genuine upstream bug, then.
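For reference, here is a minimal sketch of what I think the incompatibility boils down to, assuming the tox environment above (numpy 1.22.0 + astropy 5.0); the local environment that passes has numpy 1.21.1. The 1D slice is just a `Quantity`, and per the traceback its `.cumsum()` ends up in `Quantity.__array_ufunc__` as `add.accumulate`:

```python
# Hypothetical minimal reproducer of the suspected numpy 1.22.0 regression;
# with numpy 1.21.1 the same call returns a cumulative-sum Quantity as expected.
import numpy as np
from astropy import units as u

sp = np.array([0., 0.]) * u.K  # stand-in for the 1D slice cube[:, 0, 0]
print(sp.cumsum())
# numpy 1.22.0: TypeError: the resolved dtypes are not compatible with add.accumulate
```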
Merging #790 (33624ff) into master (7452dce) will increase coverage by 0.09%. The diff coverage is 100.00%.
@@ Coverage Diff @@
## master #790 +/- ##
==========================================
+ Coverage 77.90% 78.00% +0.09%
==========================================
Files 24 24
Lines 5826 5847 +21
==========================================
+ Hits 4539 4561 +22
+ Misses 1287 1286 -1
Impacted Files | Coverage Δ |
---|---|
spectral_cube/conftest.py | 92.77% <100.00%> (+0.28%) :arrow_up: |
spectral_cube/io/fits.py | 86.45% <100.00%> (+1.11%) :arrow_up: |
The approach in https://github.com/astropy/astropy/blob/d2e216ee96edde9a674e99cb61f45ee42a483ebb/astropy/io/fits/tests/test_table.py#L2913-L2915 was suggested instead of my brute-force approach.
We assume the default unit is arcseconds.