Closed: e-koch closed this pull request 2 years ago.
Merging #816 (9abaad6) into master (f92fe7d) will increase coverage by 0.01%. The diff coverage is 100.00%.
```diff
@@            Coverage Diff             @@
##           master     #816      +/-   ##
==========================================
+ Coverage   77.94%   77.95%   +0.01%
==========================================
  Files          24       24
  Lines        5853     5856       +3
==========================================
+ Hits         4562     4565       +3
  Misses       1291     1291
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| spectral_cube/dask_spectral_cube.py | 85.01% <100.00%> (+0.07%) | :arrow_up: |
Legend: Δ = absolute <relative> (impact); ø = not affected; ? = missing data
This fixes #815.

Spectral interpolation changes the output cube dimensions, so `DaskSpectralCube.spectral_interpolation` has a direct call to the `map_blocks` function, where we specify the output chunk size. If the array shape is not an exact multiple of the chunk size, a tuple of chunk sizes needs to be given to `map_blocks` to reflect that some chunks will be smaller or otherwise vary, and that information is needed to inform file-writing routines (zarr, to FITS) before they create the output files. This causes the mismatched-shape errors in #815.

In the original cube, this is stored in the dask array as `cube._data.chunks`, and we're only changing the 0th (spectral) entry for spectral interpolation.

This may come up in other places for routines that change the original shape of the cube on output. I think `spectral_interpolate` is the only place where this currently needs to be handled here (downsampling uses a dask built-in).
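A minimal sketch of the idea described above, using a mock array rather than the actual spectral-cube code (the interpolation function, sizes, and names here are hypothetical): when the spectral axis is not an exact multiple of the chunk size, `map_blocks` must be told the output chunk sizes as a tuple of tuples so the array metadata matches what the blocks actually produce.

```python
import numpy as np
import dask.array as da

# Mock "cube": 11 spectral channels over a 5x5 spatial grid, chunked in
# blocks of 4 along the spectral axis, so the spectral chunk sizes are
# (4, 4, 3) -- not an exact multiple of the chunk size.
cube = da.ones((11, 5, 5), chunks=(4, 5, 5))

n_out = 7  # hypothetical new spectral-axis length after interpolation

def interp_block(block):
    # Stand-in for the real per-spectrum interpolation: resample the
    # spectral axis of this block to n_out channels with np.interp.
    old = np.linspace(0, 1, block.shape[0])
    new = np.linspace(0, 1, n_out)
    return np.apply_along_axis(lambda s: np.interp(new, old, s), 0, block)

# Put the whole spectral axis in one block, then tell map_blocks the
# exact output chunk sizes as a tuple of per-axis chunk tuples.  Only
# the 0th (spectral) entry changes; the spatial chunks are reused as-is.
rechunked = cube.rechunk((-1, 5, 5))
result = rechunked.map_blocks(
    interp_block,
    chunks=((n_out,),) + rechunked.chunks[1:],
    dtype=cube.dtype,
)

print(result.shape)   # (7, 5, 5)
print(result.chunks)  # ((7,), (5,), (5,))
```

With the explicit `chunks=` tuple, `result.shape` and `result.chunks` agree with the data the blocks actually return, which is what downstream writers need before creating output files.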