davidwhogg opened this issue 12 years ago
```python
import numpy as np


def jackknife(scene, dataStructure, njack=10):
    """
    `scene` is the best-fit scene on *all* the data from The Thresher.
    `dataStructure` is whatever your crazy-ass data structure is.
    """
    Nimages = len(dataStructure)
    assert Nimages % njack == 0  # note brittleness: njack must divide Nimages
    subsampleID = np.tile(np.arange(njack), Nimages // njack)
    assert subsampleID.size == Nimages
    np.random.shuffle(subsampleID)
    loo_scene = {}
    photometry = {}
    # replace this shit with a Pool(njack).map operation
    for id in range(njack):
        loo = dataStructure[subsampleID != id]
        loo_scene[id] = None  # re-optimize the scene using the loo sample
        photometry[id] = None  # re-measure photometry on loo_scene[id]
    rms = None  # jackknife operation on photometry!
    return rms  # the, shizzle
```
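The loop above flags itself for replacement with a `Pool(njack).map` call. A minimal sketch of what that could look like, assuming `dataStructure` is a numpy array and using a hypothetical `reoptimize_and_measure` stand-in (it is *not* the real Thresher re-optimization + photometry step):

```python
from multiprocessing import Pool

import numpy as np


def reoptimize_and_measure(loo_sample):
    # Placeholder worker: the real version would re-optimize the scene on the
    # leave-one-out sample and return the re-measured photometry.
    return np.mean(loo_sample)


def jackknife_map(dataStructure, njack=10):
    Nimages = len(dataStructure)
    assert Nimages % njack == 0
    subsampleID = np.tile(np.arange(njack), Nimages // njack)
    np.random.shuffle(subsampleID)
    loo_samples = [dataStructure[subsampleID != i] for i in range(njack)]
    # farm the njack re-optimizations out to njack worker processes
    with Pool(njack) as pool:
        photometry = pool.map(reoptimize_and_measure, loo_samples)
    return np.array(photometry)
```

For example, `jackknife_map(np.random.randn(100), njack=10)` returns the 10 leave-one-out means, one per worker.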
The S/N analysis we pair-coded today (2012-06-25) is very non-conservative; it assumes independent, Gaussian pixels and so on. A much more conservative option would be a jackknife:
Take the N images (30, 100, 300, whatever), divide them into 10 subsamples, make all 10 leave-one-out subsamples (the complements of the small subsamples), re-optimize the scene on each leave-one-out subsample, and then inflate the rms among the leave-one-out results by the appropriate factor of 10 or 9 or whatever it is to get the jackknife rms.
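For reference, the textbook delete-one-group jackknife is where that inflation factor comes from: the jackknife variance is (njack − 1)/njack times the sum of squared deviations of the leave-one-out estimates from their mean, i.e., the scatter among the leave-one-out estimates inflated by roughly njack − 1 in variance. A minimal numpy sketch, assuming `photometry` holds the per-subsample measurements from the loop above:

```python
import numpy as np


def jackknife_rms(photometry):
    # Delete-one-group jackknife error from the njack leave-one-out estimates.
    # Textbook formula: var_jack = (njack - 1) / njack * sum((theta_i - theta_bar)**2)
    photometry = np.asarray(photometry, dtype=float)
    njack = len(photometry)
    mean = photometry.mean()
    var_jack = (njack - 1.0) / njack * np.sum((photometry - mean) ** 2)
    return np.sqrt(var_jack)
```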
This is a good idea, and very conservative.