jmangum closed this issue 2 years ago.
No, you could never subtract two arrays with different shapes. I'm not sure what's causing this.
The problem is that the max_map is 800x800 and the noise_map is 334x400. Why are they different?
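As a minimal illustration (plain numpy, not the actual script; the array names are stand-ins for the maps discussed above): subtracting two arrays whose shapes cannot be broadcast together raises a ValueError, so the shape mismatch alone is enough to explain the failure.

```python
import numpy as np

max_map = np.zeros((800, 800))    # stand-in for the 800x800 max_map
noise_map = np.zeros((334, 400))  # stand-in for the 334x400 noise_map

try:
    diff = max_map - noise_map
except ValueError as err:
    # numpy refuses: (800, 800) and (334, 400) are not broadcast-compatible
    print("subtraction failed:", err)
```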
Don't know. Will look closer.
Zeroing in on why max_map has dimension 800x800 when it should be the same as noise_map (334x400). By switching back and forth between my python 3.7 and python 3.9 environments, I have found that the issue appears to be how spectral_cube treats NaN slices.

For python 3.7, which gets the dimensioning right, I see just before the max_map calculation:
```
/Users/jmangum/anaconda3/lib/python3.7/site-packages/spectral_cube/spectral_cube.py:443: RuntimeWarning: All-NaN slice encountered
  **kwargs)
brightest_cube.shape = (81, 334, 400)
max_map.shape = (334, 400)
```
...while for python 3.9 I see the following at the same place in the script:
```
/Users/jmangum/anaconda3/envs/python39/lib/python3.9/site-packages/spectral_cube/spectral_cube.py:441: RuntimeWarning: All-NaN slice encountered
  out = function(self._get_filled_data(fill=fill,
brightest_cube.shape = (81, 800, 800)
max_map.shape = (800, 800)
```
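For what it's worth, the warning itself is benign: in plain numpy an all-NaN slice makes np.nanmax emit exactly this RuntimeWarning, but it does not change the shape of the collapsed map, which suggests the 800x800 result comes from somewhere upstream of the reduction. A sketch with an illustrative toy cube:

```python
import warnings

import numpy as np

# Toy cube: 81 channels x 334 x 400 pixels, with one spatial pixel all-NaN
cube = np.random.rand(81, 334, 400)
cube[:, 10, 20] = np.nan

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    max_map = np.nanmax(cube, axis=0)  # warns: "All-NaN slice encountered"

print(max_map.shape)  # collapsed map keeps the spatial shape: (334, 400)
print(any("All-NaN" in str(w.message) for w in caught))  # True
```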
The max_map calculation is the following for both:
```python
# Create a copy of the cutoutcube with velocity units
cutoutVcube = cutoutcube.with_spectral_unit(u.km/u.s,
                                            rest_value=brightest_line_frequency,
                                            velocity_convention='optical')

# Use the brightest line to identify the appropriate peak velocities, but ONLY
# from a slab including +/- width:
brightest_cube = cutoutVcube.spectral_slab(vz-velocity_half_range,
                                           vz+velocity_half_range)

# compute various moments & statistics along the spectral dimension
peak_velocity = brightest_cube.spectral_axis[brightest_cube.argmax(axis=0)]
max_map = peak_amplitude = brightest_cube.max(axis=0)
```
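For reference, the argmax/max pattern above reduces the spectral (first) axis, so both maps should always match the cube's spatial shape. A plain-numpy sketch of the same logic (the axis values and cube are illustrative, not from spectral-cube):

```python
import numpy as np

spectral_axis = np.linspace(-200.0, 200.0, 81)  # km/s, illustrative values
cube = np.random.rand(81, 334, 400)             # (channels, y, x)

# velocity of the brightest channel at each pixel, and the peak amplitude map
peak_velocity = spectral_axis[cube.argmax(axis=0)]
max_map = cube.max(axis=0)

print(peak_velocity.shape, max_map.shape)  # both (334, 400)
```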
Guessing that it is the cutoutVcube.spectral_slab calculation that is not handling NaN well. This does not appear to be quite right, though, as I am using the same version of spectral-cube in both cases (version 0.6.0).
Found the problem: a typo in the recently-added try-except change that addresses the deprecation of regions.read_ds9. The script runs now, but with another error. Closing this issue.
Upgraded from python 3.7.13 to 3.9.13 with a commensurate update of all conda packages. It now appears that Quantity is less tolerant of calculating the ratio of two images when they are not the same dimension:

If this is due to a change in how astropy.units.quantity works, perhaps a simple modification of the would fix this bug?
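Whatever changed on the astropy.units side, the underlying arithmetic still follows numpy broadcasting: the ratio of two maps only works when their shapes are broadcast-compatible. A minimal sketch in plain numpy (astropy Quantity arithmetic delegates to these same rules; the array names are illustrative):

```python
import numpy as np

signal = np.ones((334, 400))
noise = np.full((334, 400), 0.5)

print((signal / noise)[0, 0])  # matching shapes: ratio works, prints 2.0

mismatched = np.ones((800, 800))
try:
    signal / mismatched
except ValueError as err:
    # (334, 400) and (800, 800) cannot be broadcast together
    print("ratio failed:", err)
```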