keflavich closed this issue 1 year ago.
Immediate test failures:
py39-test: exit 2 (0.00 seconds) .tmp/py39-test> '!cov-!noopenfiles:' pytest --open-files --pyargs spectral_cube /home/runner/work/spectral-cube/spectral-cube/docs
.pkg: _exit> python /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: exit None (0.00 seconds) /home/runner/work/spectral-cube/spectral-cube> python /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/site-packages/pyproject_api/_backend.py True setuptools.build_meta pid=1745
py39-test: FAIL code 2 (25.98=setup[25.58]+cmd[0.39,0.00] seconds)
evaluation failed :( (26.08 seconds)
Error: Process completed with exit code 2.
No idea what this means
Actually, the !cov-!bla syntax now fails for astropy as well (astropy/astropy#14139), e.g. https://github.com/astropy/astropy/actions/runs/3733576067/jobs/6334552564, so this was in fact broken by a more recent tox update pulled in with 4.0.13 (tox-dev/tox#2747).
At https://github.com/radio-astro-tools/spectral-cube/actions/runs/4374807279/jobs/7654700456#step:5:1516 we have real failures in some tests because of unimplemented floor_divide? I hope this just means we need to bump version numbers.
Patch coverage: 80.00% and project coverage change: +2.51% :tada:
Comparison is base (d004d79) 77.36% compared to head (dad37a2) 79.88%.
There's at least one genuine bug here introduced by astropy upstream. This test:
https://github.com/radio-astro-tools/spectral-cube/blob/master/spectral_cube/tests/test_projection.py#L137
fails on dev but not on older versions. The sum p2 = p + p does not retain the beam from p.
I don't see any sign of this breaking change in astropy's changelog. The only way I can think to track it down is with bisection, which I'm really bad at and can't do on my personal computer.
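For reference, the failing behavior boils down to something like the sketch below (a minimal sketch, assuming the Projection/Beam constructor arguments shown here; the shape and unit are illustrative, not copied from the test):

```python
# Sketch of the regression: arithmetic on a Projection should propagate the beam.
# On astropy 5.2.x this passes; on astropy dev the beam is not retained.
import numpy as np
from astropy import units as u
from radio_beam import Beam
from spectral_cube.lower_dimensional_structures import Projection

beam = Beam(1.0 * u.arcsec)
p = Projection(np.ones((3, 4)) * u.Jy, beam=beam)

p2 = p + p
assert p2.beam == beam  # fails on astropy dev: the beam is dropped
```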
Maybe this is our breaking change? https://github.com/astropy/astropy/commit/c6ff27ddcc8e0c899c5a5999ffc4c9d2be9bf4b8
No, that seems like it's not the problem; the default is to pass on info except for a few cases that don't apply to these tests.
I had to do some test refactoring to get tests to run locally at all.
Right, so it looks like noviz works, but 'all' doesn't.
OK, dev passes on its own, so it's not dev. It's some interaction between the viz stuff and the dev stuff, but I'm not convinced the viz stuff works at all. Split up the tests further.
So getting rid of the 'all' mode has solved my local woes - I can now test everything in a grid locally. This also seems to have "fixed" tests remotely. But I think this makes no sense, so I'm worried I've done something I don't understand.
Dev and viz independently cause failures. The dev failures originate somewhere in Quantity, but there's no difference in astropy's Quantity between 5.2.1 and 5.3.dev, so I'm stuck.
I can't test dev locally because:
Traceback (most recent call last):
File "/private/var/folders/k_/7qh4l0nn72b7qgq15pkd4hw40000gt/T/pip-build-env-1rq0a8qx/overlay/lib/python3.9/site-packages/setuptools/_distutils/unixccompiler.py", line 185, in _compile
self.spawn(compiler_so + cc_args + [src, '-o', obj] + extra_postargs)
File "/private/var/folders/k_/7qh4l0nn72b7qgq15pkd4hw40000gt/T/pip-build-env-1rq0a8qx/overlay/lib/python3.9/site-packages/setuptools/_distutils/ccompiler.py", line 1041, in spawn
spawn(cmd, dry_run=self.dry_run, **kwargs)
File "/private/var/folders/k_/7qh4l0nn72b7qgq15pkd4hw40000gt/T/pip-build-env-1rq0a8qx/overlay/lib/python3.9/site-packages/setuptools/_distutils/spawn.py", line 70, in spawn
raise DistutilsExecError(
distutils.errors.DistutilsExecError: command '/usr/bin/clang' failed with exit code 1
ERROR: Failed building wheel for numcodecs
ERROR: Could not build wheels for numcodecs, which is required to install pyproject.toml-based projects
I'm at a loss with this; I've googled and found similar issues, but no fixes - the threads just end with 'oh OK, we solved it'.
The viz tests are amusing: https://github.com/radio-astro-tools/spectral-cube/actions/runs/4384619998/jobs/7676312443#step:5:195
../../spectral_cube/tests/test_subcubes.py ..ssssssssssssssssssss [ 98%]
Fatal Python error: Aborted
I think this is an indication that the tests crash completely as soon as test_visualization.py, the next test file alphabetically, is initiated. I'm guessing the failure happens on the matplotlib import step.
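If the import really is the culprit, the usual guard is to force a headless backend before anything touches pyplot; a sketch using standard matplotlib calls, though whether this is the actual cause here is only a guess:

```python
# Guard sketch: select a non-interactive backend before pyplot is imported
# anywhere, so a missing display/GUI toolkit cannot abort the test process.
# (Assumption: the Fatal Python error comes from the matplotlib import step.)
import matplotlib
matplotlib.use("Agg")  # must happen before `import matplotlib.pyplot`

import matplotlib.pyplot as plt  # now renders off-screen only
```

The same thing can be done from the CI side by exporting MPLBACKEND=Agg in the test environment.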
I finally tracked down one error: https://github.com/astropy/astropy/issues/14514.
ok last (?) open issue: https://github.com/radio-astro-tools/spectral-cube/actions/runs/4387984028/jobs/7685493794#step:5:221
=================================== FAILURES ===================================
___ TestSpectralCube.test_apply_everywhere_floordivide[True-floordiv-value0] ___
self = <spectral_cube.tests.test_spectral_cube.TestSpectralCube object at 0x7fcd29aec5e0>
operation = <built-in function floordiv>, value = <Quantity 0.5 K>
data_advs = PosixPath('/tmp/pytest-of-runner/pytest-0/test_apply_everywhere_floordiv1/advs.fits')
use_dask = True
@pytest.mark.parametrize(('operation', 'value'),
((operator.div if hasattr(operator,'div') else operator.floordiv, 0.5*u.K),))
def test_apply_everywhere_floordivide(self, operation, value, data_advs, use_dask):
c1, d1 = cube_and_raw(data_advs, use_dask=use_dask)
# floordiv doesn't work, which is why it's NotImplemented
with pytest.raises(u.UnitConversionError):
> c1o = c1._apply_everywhere(operation, value)
../../spectral_cube/tests/test_spectral_cube.py:367:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../spectral_cube/utils.py:49: in wrapper
return function(self, *args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = DaskSpectralCube with shape=(2, 3, 4) and unit=K and chunk size (2, 3, 4):
n_x: 4 type_x: RA---SIN unit_x: deg...094 deg: 29.935218 deg
n_s: 2 type_s: VOPT unit_s: km / s range: -321.215 km / s: -319.926 km / s
function = <built-in function floordiv>, check_units = True
args = (<Quantity 0.5 K>,), test_result = <Quantity [[[2.]]]>
new_unit = Unit(dimensionless)
@warn_slow
def _apply_everywhere(self, function, *args, check_units=True):
"""
Return a new cube with ``function`` applied to all pixels
Private because this doesn't have an obvious and easy-to-use API
Parameters
----------
function : function
An operator that takes the data (self) and any number of additional
arguments
check_units : bool
When doing the initial test before running the full operation,
should units be included on the 'fake' test quantity? This is
specifically added as an option to enable using the subtraction and
addition operators without checking unit compatibility here because
they _already_ enforce unit compatibility.
Examples
--------
>>> newcube = cube.apply_everywhere(np.add, 0.5*u.Jy)
"""
try:
if check_units:
test_result = function(np.ones([1,1,1])*self.unit, *args)
new_unit = test_result.unit
else:
test_result = function(np.ones([1,1,1]), *args)
new_unit = self.unit
# First, check that function returns same # of dims?
assert test_result.ndim == 3,"Output is not 3-dimensional"
except Exception as ex:
raise AssertionError("Function could not be applied to a simple "
"cube. The error was: {0}".format(ex))
# We don't need to convert to a quantity here because the shape check
data_in = self._get_filled_data(fill=self._fill_value)
> data = function(data_in, *args)
E TypeError: operand type(s) all returned NotImplemented from __array_ufunc__(<ufunc 'floor_divide'>, '__call__', dask.array<FilledArrayHandler 017ac2fc-5bf3-4d25-bf15, shape=(2, 3, 4), dtype=>f8, chunksize=(2, 3, 4), chunktype=numpy.ndarray>, <Quantity 0.5 K>): 'Array', 'Quantity'
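For the record, this can be reproduced outside spectral-cube with just dask and astropy; a minimal sketch (the array shape is illustrative, and the exact exception depends on the dask/astropy versions in this run):

```python
# Repro sketch: neither dask.array nor Quantity handles the mixed
# floor_divide, so ufunc dispatch falls through to TypeError instead of
# the UnitConversionError the test expects.
import dask.array as da
from astropy import units as u

data = da.ones((2, 3, 4))
try:
    data // (0.5 * u.K)  # plain dask array floor-divided by a Quantity
except TypeError as exc:
    print(exc)  # "operand type(s) all returned NotImplemented ..."
```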
All of the relevant tests have been stuck yellow for hours? Did I overuse our minutes or something?
No... I think I introduced an infinite loop 🤦
Got dev to go green, finally. Have a yt error:
self = <spectral_cube.tests.test_spectral_cube.TestYt object at 0x2ae7055d5d60>, request = <SubRequest 'setup_method_fixture' for <Function test_yt[False]>>
data_adv = PosixPath('/tmp/pytest-of-adamginsburg/pytest-12/test_yt_False_0/adv.fits'), use_dask = False
@pytest.fixture(autouse=True)
def setup_method_fixture(self, request, data_adv, use_dask):
print("HERE")
self.cube = SpectralCube.read(data_adv, use_dask=use_dask)
# Without any special arguments
print(self.cube)
print(self.cube.to_yt)
> self.ytc1 = self.cube.to_yt()
spectral_cube/tests/test_spectral_cube.py:899:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
spectral_cube/spectral_cube.py:2371: in to_yt
ds = FITSDataset(hdu, nprocs=nprocs,
/orange/adamginsburg/miniconda3/envs/python39/lib/python3.9/site-packages/yt/data_objects/static_output.py:192: in __new__
obj.__init__(filename, *args, **kwargs)
/orange/adamginsburg/miniconda3/envs/python39/lib/python3.9/site-packages/yt/frontends/fits/data_structures.py:766: in __init__
super().__init__(
/orange/adamginsburg/miniconda3/envs/python39/lib/python3.9/site-packages/yt/frontends/fits/data_structures.py:384: in __init__
Dataset.__init__(
/orange/adamginsburg/miniconda3/envs/python39/lib/python3.9/site-packages/yt/data_objects/static_output.py:273: in __init__
_ds_store.check_ds(self)
/orange/adamginsburg/miniconda3/envs/python39/lib/python3.9/site-packages/yt/utilities/parameter_file_storage.py:135: in check_ds
hash = ds._hash()
/orange/adamginsburg/miniconda3/envs/python39/lib/python3.9/site-packages/yt/data_objects/static_output.py:391: in _hash
s = f"{self.basename};{self.current_time};{self.unique_identifier}"
/orange/adamginsburg/miniconda3/envs/python39/lib/python3.9/functools.py:993: in __get__
val = self.func(instance)
/orange/adamginsburg/miniconda3/envs/python39/lib/python3.9/site-packages/yt/frontends/fits/data_structures.py:439: in unique_identifier
return super().unique_identifier
/orange/adamginsburg/miniconda3/envs/python39/lib/python3.9/functools.py:993: in __get__
val = self.func(instance)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = SpectralCubeFITSDataset: /blue/adamginsburg/adamginsburg/repos/spectral-cube/InMemoryFITSFile_d26b1c73bcb149e9a9fc50a3bacda733
@cached_property
def unique_identifier(self) -> str:
> retv = int(os.stat(self.parameter_filename)[ST_CTIME])
E FileNotFoundError: [Errno 2] No such file or directory: '/blue/adamginsburg/adamginsburg/repos/spectral-cube/InMemoryFITSFile_d26b1c73bcb149e9a9fc50a3bacda733'
/orange/adamginsburg/miniconda3/envs/python39/lib/python3.9/site-packages/yt/data_objects/static_output.py:315: FileNotFoundError
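The proximate failure is that yt stats ds.parameter_filename, which doesn't exist for the in-memory FITS object that to_yt() builds. A possible workaround sketch, writing the cube to a real file and loading that with yt directly (hypothetical usage; it sidesteps to_yt() rather than fixing it):

```python
# Workaround sketch: give yt an on-disk file so its unique_identifier
# os.stat() call has something to stat. Paths are illustrative.
import os
import tempfile

import yt
from spectral_cube import SpectralCube

cube = SpectralCube.read("adv.fits")

tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "cube.fits")
cube.write(path)

ds = yt.load(path)  # the FITS file exists on disk, so the stat succeeds
```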
Note that this depends on https://github.com/astropy/reproject/pull/349
Ah crud. Everything is not fine.
OK so something changed in astropy-dev in the last 4 days that broke this. I'll bisect this time.
Broken by https://github.com/astropy/astropy/pull/14508. Can't stay ahead of astropy!
In the last commit, I was trying to replicate this line: https://github.com/larrybradley/jdaviz/blob/2ad11c120a7317c01c0ee6957c19c494a54e7428/tox.ini#L62
What did I get wrong?
And make sure tests are still passing in prep for a new release (0.6.1).