Open keflavich opened 10 years ago
Sounds like a good utility -- though I wonder if @low-sky / @akleroy object to having this in spectral-cube, on the basis of their objections to putting noise estimation in spectral-cube.
Fair point; this would more readily fit into a separate package, but the underlying architecture allowing fits across arbitrary dimensions needs to be in SpectralCube, even if its functionality is kept more general.
In the future, we could of course envisage bringing these various tools together into a single repository once they are mature, but for now it probably makes sense to develop this separately, though some core mathematical functionality may make sense as methods of SpectralCube or in spectral_cube.utils.
:+1: On having something like polyfit/lstsq implemented efficiently.
I think holding this in spectral_cube.utils makes a lot of sense, but it does seem like something that is driven by user choice more than by data-object methods. For example, weighting by a non-uniform noise cube would be one application, particularly in cases where you have some sense of the local noise in the gridded data.
A common problem in single-dish data reduction is "striping" from drifts in the receiver system (unstable baselines). As long as the cubes are gridded in the scan direction, it could be very useful to have a destriping routine that operates along axes. Basically, this means fitting an n'th-order polynomial along said axes.
The in-memory approach may look something like (pseudo-code; not validated):
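The pseudo-code itself did not survive in this copy of the thread, so here is a minimal sketch of what such an in-memory approach could look like: a vectorized polynomial fit along the spectral axis using `np.polyfit`, which accepts a 2D array of spectra and fits each column independently. The function name and signature are illustrative, not part of the spectral-cube API.

```python
import numpy as np

def fit_polynomial_along_axis(cube, order=1):
    """Fit an order-n polynomial along axis 0 of a 3D array, vectorized.

    Illustrative sketch only. Returns the fitted baseline cube with the
    same shape as `cube`.
    """
    nchan, ny, nx = cube.shape
    x = np.arange(nchan)
    # Flatten the spatial axes: np.polyfit fits every column of a 2D
    # y-array in one call, avoiding a Python-level loop over sightlines.
    flat = cube.reshape(nchan, ny * nx)
    coeffs = np.polyfit(x, flat, order)           # shape (order+1, ny*nx)
    # np.vander (decreasing powers by default) matches polyfit's
    # coefficient ordering, so a single matrix product evaluates
    # every fitted baseline at once.
    baseline = np.vander(x, order + 1) @ coeffs   # shape (nchan, ny*nx)
    return baseline.reshape(cube.shape)
```

Subtracting the returned baseline from the cube would then remove the per-sightline drift, which is exactly the destriping operation described above when applied along the scan axis.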
and there is an equivalently efficient method for masked data described on Stack Overflow.
The iterator method along individual "lines of sight" is simpler, but almost certainly slower. Using the linear algebra approach with slabs will probably be the most efficient.
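For contrast, the per-sightline iterator might look like the sketch below (again, hypothetical names, not spectral-cube API). It is easy to read and memory-light, but the Python-level double loop is why it loses to the vectorized linear-algebra route on large cubes.

```python
import numpy as np

def fit_poly_per_sightline(cube, order=1):
    """Fit each (y, x) spectrum independently with a simple loop.

    Illustrative sketch: simple, but much slower than a vectorized
    fit because np.polyfit is called once per sightline.
    """
    nchan, ny, nx = cube.shape
    x = np.arange(nchan)
    baseline = np.empty(cube.shape, dtype=float)
    for j in range(ny):
        for i in range(nx):
            coeffs = np.polyfit(x, cube[:, j, i], order)
            baseline[:, j, i] = np.polyval(coeffs, x)
    return baseline
```

Processing the cube in spectral or spatial slabs, and applying the vectorized fit to each slab, would keep the memory footprint bounded while retaining most of the linear-algebra speedup.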