dkirkby opened this issue 8 years ago
The relationship between flux and flux error appears to be strongly time dependent, especially in the g band:
Is this a real effect, or some calibration or algorithm artifact?
To give some context to the plot above, it is based on the SDSS DR12 photo resolve file and shows SKYFLUX[1] vs SKYSIG[1], with colors based on TAI[1] (converted to days since 09-19-1998).
The SDSS photo resolve data model does not specify the units of SKYFLUX and SKYSIG. Can anyone confirm that they are in nanomaggies / arcsec**2?
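For reference, a minimal sketch of how a plot like this can be produced (the file name below is just a placeholder and the TAI-in-seconds convention is an assumption; only the column names come from the data model):

```python
import numpy as np
import matplotlib.pyplot as plt
from astropy.table import Table

# Placeholder filename for the DR12 photo resolve table.
fields = Table.read('photoResolve.fits')

g = 1  # index of the g band in the ugriz arrays
skyflux = fields['SKYFLUX'][:, g]
skysig = fields['SKYSIG'][:, g]
# Assume TAI counts elapsed seconds, so TAI / 86400 is an MJD;
# MJD 51075 corresponds to 1998-09-19.
days = fields['TAI'][:, g] / 86400.0 - 51075.0

plt.scatter(skyflux, skysig, c=days, s=2, cmap='viridis')
plt.xlabel('SKYFLUX[1]')
plt.ylabel('SKYSIG[1]')
plt.colorbar(label='days since 1998-09-19')
plt.show()
```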
Here is a basic comparison of the specsim total g-band sky prediction (including scattered moon) against the SDSS data, which someone else should confirm: I used pyephem for the moon ephemerides and astropy for the alt/az transforms. I only used 10% of the data and only plot observations with the moon above the horizon. The data clearly shows structure that is not simulated, but at least the overall slope of the correlation is close to one.
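A rough sketch of the geometry inputs (the Apache Point coordinates and the helper name are my assumptions, not the exact notebook code):

```python
import ephem
import numpy as np
import astropy.units as u
from astropy.time import Time
from astropy.coordinates import SkyCoord, EarthLocation, AltAz

# Assumed observatory: Apache Point.
apo = EarthLocation(lat=32.78 * u.deg, lon=-105.82 * u.deg, height=2788 * u.m)

def moon_and_pointing(ra_deg, dec_deg, mjd):
    """Return (moon_alt_deg, moon_illum_frac, pointing_altaz) for one observation."""
    t = Time(mjd, format='mjd')
    # Moon ephemerides from pyephem, using its own Observer object.
    obs = ephem.Observer()
    obs.lat, obs.lon, obs.elevation = '32.78', '-105.82', 2788.0
    obs.date = t.datetime
    moon = ephem.Moon(obs)
    # Field pointing transformed to alt/az with astropy.
    pointing = SkyCoord(ra=ra_deg * u.deg, dec=dec_deg * u.deg).transform_to(
        AltAz(obstime=t, location=apo))
    return np.degrees(float(moon.alt)), moon.phase / 100.0, pointing
```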
Triggered by dkirkby/speclite#39, I realized that sky photometry estimated from the specsim.atmosphere model should use filter curves with no atmosphere included. We were accidentally getting this right for SDSS in this issue, but not for the DES comparisons in #55.
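A sketch of what this means in practice with speclite.filters (the no-atmosphere filter name below is hypothetical; only `sdss2010-g` is a real shipped response, and the flat spectra are placeholders for the specsim outputs):

```python
import numpy as np
import astropy.units as u
import speclite.filters

wlen = np.arange(3500.0, 6500.0, 1.0) * u.Angstrom
flux_unit = u.erg / (u.cm**2 * u.s * u.Angstrom)
# Placeholder spectra; the sky comes from specsim.atmosphere and is a surface
# brightness (same flux units, implicitly per arcsec**2).
sky_sb = np.full(wlen.shape, 1e-17) * flux_unit
obj_flux = np.full(wlen.shape, 1e-17) * flux_unit

g_default = speclite.filters.load_filter('sdss2010-g')      # atmosphere included
g_no_atm = speclite.filters.load_filter('sdss2010noatm-g')  # hypothetical no-atmosphere curve

# Sky already carries the atmospheric effects, so convolve it with the
# instrument-only response; object fluxes observed through the atmosphere
# use the default curves.
sky_nmgy = 1e9 * g_no_atm.get_ab_maggies(sky_sb, wlen)
obj_mag = g_default.get_ab_magnitude(obj_flux, wlen)
```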
Does that mean speclite will include filters both with and without the atmosphere? If so, it would be very helpful if the default filters had the mean atmosphere, as they're being used in many places in desisim and desitarget to synthesize fluxes and colors.
I agree the defaults should include an atmosphere and don't plan to change that.
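For what it's worth, synthesizing fluxes and colors with the default (atmosphere-included) responses stays as simple as this sketch (the flat test spectrum is just a placeholder):

```python
import numpy as np
import astropy.units as u
import speclite.filters

# Default responses, mean atmosphere included.
gr = speclite.filters.load_filters('sdss2010-g', 'sdss2010-r')
wlen = np.arange(3500.0, 7500.0, 1.0) * u.Angstrom
flux = np.full(wlen.shape, 1e-17) * u.erg / (u.cm**2 * u.s * u.Angstrom)
mags = gr.get_ab_magnitudes(flux, wlen)
g_minus_r = mags['sdss2010-g'][0] - mags['sdss2010-r'][0]
```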
This is a follow-up to a suggestion from @moustakas on #9. Please post questions and progress here.