Keck-DataReductionPipelines / MosfireDRP

http://keck-datareductionpipelines.github.io/MosfireDRP

DRP sigma spectrum not handled correctly? #40

Open · themiyan opened this issue 8 years ago

themiyan commented 8 years ago

I noticed in the code that, in the final stages of the rectification, the standard deviations of the dithered positions are added linearly (lines 212-224 in Rectify.py):

    S = output.shape
    img = solution["eps_img"]
    std = solution["sd_img"]
    tms = solution["itime_img"]

    for i_solution in xrange(1,len(all_solutions)):
            info("Combining solution %i" %i_solution)
            solution = all_solutions[i_solution][i_slit]
            img += solution["eps_img"]
            std += solution["sd_img"]
            tms += solution["itime_img"]

I guess this should be added in quadrature, else we will be overestimating the error? Is this really the case or am I interpreting the code incorrectly?
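
For illustration, a minimal sketch of how much the two conventions differ for independent noise (hypothetical arrays, not the DRP code):

    import numpy as np

    # Three hypothetical per-dither sigma images, each 5 electrons per pixel.
    sd_imgs = np.full((3, 10, 10), 5.0)

    sd_linear = sd_imgs.sum(axis=0)                    # 15.0 per pixel
    sd_quadrature = np.sqrt((sd_imgs**2).sum(axis=0))  # ~8.66 per pixel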

Thanks,

Themiya.

nickkonidaris commented 8 years ago

This looks like a problem; I think it's a holdover from the days where I reported variance rather than SD. Somehow I thought I got this right, but it looks as if the bug has persisted. Sorry.

csteidel commented 8 years ago

Let’s not be hasty about this conclusion, because at the time we were fixing this issue (~2 years ago?), I looked at things carefully along with Nick and they were being handled correctly. It’s important to look at the full accounting of the calculation, and it will also depend on when the integration times are accounted for, whether frames are averaged or summed, etc. Are we sure that “sd” in this context is not actually “sigma^2”?
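
For concreteness: if "sd" here actually held sigma^2 (a variance image), the plain sum in Rectify.py would be the right accumulator, and only a single square root at the very end would be needed. A minimal sketch with hypothetical values:

    import numpy as np

    var_a = np.array([9.0])   # hypothetical per-dither variance (sigma = 3)
    var_b = np.array([16.0])  # hypothetical per-dither variance (sigma = 4)

    var_total = var_a + var_b         # a linear sum is correct for variances
    sigma_total = np.sqrt(var_total)  # 5.0, same as adding sigmas in quadrature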


themiyan commented 8 years ago

Thanks Nick & Chuck for looking into this: yes, I remember the code used to give the inverse variance back in the day.

To answer @csteidel's questions:

> Are we sure that “sd” in this context is not actually “sigma^2”?

It seems to convert the inverse variance to a standard deviation in a function before passing it back to handle the co-addition (lines 417-421 of Rectify.py):

    IV = np.array(ivar_img)
    bad = np.isclose(IV,0)
    IV[bad] = np.inf
    var_img = np.nanmean(1/np.array(IV), axis=0)
    sd_img = np.sqrt(var_img)
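
A quick worked check of that conversion with hypothetical numbers (assuming ivar_img stacks the individual exposures along axis 0; the zero-ivar masking is omitted here):

    import numpy as np

    ivar_img = [[0.04], [0.04]]           # two exposures, one pixel, ivar = 1/25
    IV = np.array(ivar_img)
    var_img = np.nanmean(1 / IV, axis=0)  # mean per-exposure variance = 25
    sd_img = np.sqrt(var_img)             # 5.0, i.e. sigma rather than sigma^2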

> when the integration times are accounted for

The integration time for the std is accounted for just before the files are written, as one of the final steps:

    header['bunit'] = ('electron/second', 'sigma/itime')
    IO.writefits(std/tms, maskname,
        "{0}_{1}_{2}_sig.fits".format(outname, band, target_name), options,
        overwrite=True, header=header, lossy_compress=False)

> whether frames are averaged or summed

I'm not sure about this, but it seems that in imcombine the stacked exposures for each dither position are treated as the sum of the individual exposures divided by the total itime? itime is just the total sum.
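
If that reading is right, the per-dither combination would look roughly like the sketch below (my interpretation with hypothetical names, not the actual imcombine code):

    import numpy as np

    # Three hypothetical exposures at one dither position (electrons) and itimes.
    frames = np.full((3, 10, 10), 100.0)
    itimes = np.array([120.0, 120.0, 120.0])

    eps_img = frames.sum(axis=0) / itimes.sum()       # summed counts / total itime
    itime_img = np.full(eps_img.shape, itimes.sum())  # itime is just the total sum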

themiyan commented 5 years ago

Hi,

I'm wondering if any progress has been made on this issue? See also Appendix B3 of: https://ui.adsabs.harvard.edu/#abs/2018A&A...618A..85S/abstract