rfriesen opened this issue 7 years ago
Have you looked at the uncertainty in other quantities? Hopefully those are higher, and you can flag these points on that basis.
It would be helpful to look at some of the individual spectra too. It might be a pyspeckit issue, or it might be a problem with the underlying LMfit approach.
The uncertainties in the other quantities aren't different enough to identify poor fits. Looking at the spectra and where the pixels are in the maps, these values are showing up where the S/N in the (2,2) line is very low:
Tex should be more strongly determined by just the (1,1) line, but maybe the fitting routine is finding a local minimum in some of these pixels where the (2,2) line has very low S/N.
I believe this is a case where the degeneracy between T_ex, N, and T_K gets quite bad and is left unresolved by the (2,2) nondetection.
I can reproduce the original fit:
chi^2 = 1373.1923003654931
But if we change the column density by a large amount, forcing it to be 3x lower:
chi^2 = 1398.4753438487257
So the chi^2 is barely different despite the 3x change in column density.
These actually all look like poor fits to me, unfortunately. The lines look much narrower than the fitted parameters imply. If I force the linewidth to 0.1 km/s:
and chi^2 = 1386.5301240246829
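The near-identical chi^2 values above can be illustrated with a toy stand-in (a single Gaussian, not the ammonia model): when the noise dominates, quite different parameter sets produce statistically indistinguishable chi^2. All the array names and values here are made up for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spectrum: a weak, narrow line plus noise that dominates it.
x = np.linspace(-5, 5, 200)
noise_rms = 0.5
data = 0.3 * np.exp(-x**2 / (2 * 0.2**2)) + rng.normal(0, noise_rms, x.size)

def chi2(model):
    """Chi^2 of a candidate model against the noisy data."""
    return np.sum((data - model) ** 2 / noise_rms**2)

# "True" parameters vs. an amplitude-3x-lower, much broader alternative.
model_a = 0.3 * np.exp(-x**2 / (2 * 0.2**2))
model_b = 0.1 * np.exp(-x**2 / (2 * 0.6**2))

c_a, c_b = chi2(model_a), chi2(model_b)
print(c_a, c_b)  # the two chi^2 values differ by only a few percent
```

This mirrors the behavior reported above: the optimizer has no statistical basis to prefer one model over the other, so the returned uncertainties on any single parameter can look deceptively small.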
The problem appears to be that many models are degenerate. For these low-S/N cases, where the lack of a (2,2) detection produces this degeneracy, the only workaround may be to apply some sort of prior: either restrict the allowed range of column densities or excitation temperatures, or try a Bayesian approach. The latter is very expensive in both human and computational time, so ideally we'd just ignore these pixels; if they turn out to be scientifically interesting, we'll have to come up with something better.
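The "limited range" prior could be prototyped as hard parameter bounds on the fit. A minimal sketch with a generic `scipy.optimize.curve_fit` Gaussian (not the pyspeckit ammonia fitter; the parameter names and bound values are placeholders):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Synthetic line with known parameters.
x = np.linspace(-5, 5, 200)
true_amp, true_sigma = 0.5, 0.3
y = true_amp * np.exp(-x**2 / (2 * true_sigma**2)) + rng.normal(0, 0.1, x.size)

def gauss(x, amp, sigma):
    return amp * np.exp(-x**2 / (2 * sigma**2))

# Bounds act as a hard prior: the optimizer cannot wander into the
# degenerate corner of parameter space (e.g. absurdly broad lines).
popt, pcov = curve_fit(
    gauss, x, y,
    p0=[0.4, 0.5],
    bounds=([0.0, 0.05], [2.0, 1.0]),  # amp in [0, 2], sigma in [0.05, 1]
)
```

The same idea carries over to the real fit by restricting the allowed column density or excitation temperature range; the hard part is choosing bounds that are physically defensible for every region.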
One interesting test we can and should do is take some typical parameters we think are correct, make synthetic spectra, and see how well the parameters are recovered as a function of noise level. It will probably reveal what sort of bias we have when we go to low S/N. I think Young Min has done this a little.
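The suggested injection/recovery test might look like the following sketch, again using a Gaussian as a stand-in for the ammonia model (the noise levels and trial counts are arbitrary choices for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

x = np.linspace(-5, 5, 200)
true_amp, true_sigma = 1.0, 0.3

def gauss(x, amp, sigma):
    return amp * np.exp(-x**2 / (2 * sigma**2))

signal = gauss(x, true_amp, true_sigma)

# For each noise level, inject noise, refit, and record the mean bias
# in the recovered linewidth relative to the injected value.
biases = {}
for rms in [0.05, 0.2, 0.5]:
    widths = []
    for _ in range(50):
        y = signal + rng.normal(0, rms, x.size)
        try:
            popt, _ = curve_fit(gauss, x, y, p0=[0.8, 0.4],
                                bounds=([0.0, 0.01], [5.0, 5.0]))
            widths.append(popt[1])
        except RuntimeError:
            pass  # fit failed to converge at this noise level
    biases[rms] = np.mean(widths) - true_sigma

print(biases)  # bias in recovered sigma as a function of noise rms
```

Running the same loop with the actual ammonia model over a grid of realistic Tex, N, and T_K values would show directly where the low-S/N bias kicks in.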
@keflavich @jpinedaf @low-sky This might be more related to pyspeckit than to the GAS code, but I wanted to flag it and start a discussion here. I'm finding a lot of points that have very low Tex values, but also very small uncertainties in Tex. These are usually at the edges of real structures, so I assume the fits are actually not that great, but you wouldn't guess that from the uncertainty in the returned parameters.
Here's a plot that shows what I'm seeing for the DR1 regions. DR1_Tex_eTex.pdf
And a glue screenshot showing where the low Tex, low eTex points are in the moment map for L1688: ![glue_tex_etex](https://cloud.githubusercontent.com/assets/10502901/26214692/4ee6b4d0-3bcb-11e7-8424-c25a781833af.png)
The line in the first figure is an arbitrary cutoff in Tex that removes the weird points for L1688, but a cut like this would need to be tuned separately for each region. I haven't yet checked whether a S/N cut would work better.
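A per-pixel S/N cut could be prototyped roughly as below. The array names `peak22` and `rms22` are hypothetical placeholders for the (2,2) peak-intensity and noise maps (not actual GAS data products), and the threshold of 3 is an arbitrary starting point:

```python
import numpy as np

rng = np.random.default_rng(3)

# Placeholder maps standing in for the (2,2) peak intensity and rms noise.
peak22 = rng.uniform(0, 1, (50, 50))
rms22 = np.full((50, 50), 0.1)

# Mask on (2,2) S/N rather than an arbitrary, per-region Tex cutoff.
snr22 = peak22 / rms22
good = snr22 >= 3.0

# The Tex map would then be blanked where the fit is degenerate:
# tex[~good] = np.nan
```

Unlike a Tex threshold, the same S/N threshold should transfer between regions without per-region tuning, since it keys directly on the cause of the degeneracy (the missing (2,2) detection).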