Open rmandelb opened 9 years ago
Hironao can tell you the exact exposures, I'm sure (I haven't been using all of them so I don't know what they are), but yes, this functionality exists. Either he or I can run it, or I can show you how on Tiger, if you'd like. We can run just this test by adding -c "sys_tests.names=['StarXStarShear']" to the command line.
I’d like you to show me how on Tiger, if you don’t mind.
I'm sending the fiducial fields offline.
Okay, thanks to some off-repo chatter (involving proprietary data), @rmandelb discovered a bug in the correlation function tests for the HSC module: I had failed to copy-paste some boilerplate code into some of the adapters. I've pushed a commit to master to address this; the code was checked in with the PR for #33, it just didn't appear in all the right places. You should now be able to use correlation function types other than 'ng' again.
Thanks! I’ll pull a new version of the code and rerun that calculation.
A quick question I thought I'd put here for public input--in private communication @rmandelb pointed out that the correlation function plots don't currently put units on the x-axis. We could use the kwarg provided to the getCF() method and save it as a class attribute that can be accessed by the plot() method. However, you can use the same class instance for multiple correlation functions with different units, so if you generated a bunch of CFs and then plotted them all at once you'd only get the unit of the most recent CF. I think the best plan is to save the unit for later use, but allow an optional kwarg to override the default behavior, and just document that that's what we're doing. Does that sound reasonable to everyone?
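Concretely, the plan would look something like this (a toy sketch with made-up attribute and method bodies, not the actual Stile implementation):

```python
class CorrelationFunctionSysTest:
    """Sketch of the save-the-unit-but-allow-override plan."""

    def getCF(self, data, units='arcmin', **kwargs):
        # Remember the unit of the most recent correlation function so
        # plot() can label the x-axis by default.
        self._last_units = units
        # ... the real method would run TreeCorr here ...
        return {'units': units}

    def plot(self, cf, units=None):
        # An explicit kwarg overrides the saved default, which matters if
        # several CFs with different units were made from one instance.
        xlabel_units = units if units is not None else self._last_units
        return 'separation [%s]' % xlabel_units


test = CorrelationFunctionSysTest()
cf = test.getCF(None, units='degrees')
print(test.plot(cf))                  # uses the saved unit: degrees
print(test.plot(cf, units='arcmin'))  # explicit override wins
```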
By the way, the correlation functions are now written out as ASCII tables on branch #44. They're in the same place as the images, but with a .dat extension instead of a .png. The first line is a hash-commented line listing the columns.
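Since the header is a hash-commented line naming the columns, the tables should be readable with plain numpy. A small self-contained sketch, using a stand-in file and made-up column names (the real columns are whatever the header line lists):

```python
import numpy as np

# Create a tiny stand-in file in the same format: a hash-commented
# header naming the columns, then whitespace-separated values.
with open('example.dat', 'w') as f:
    f.write('# R_nom xi_plus xi_minus\n')  # illustrative column names
    f.write('1.0 0.010 0.002\n')
    f.write('2.0 0.008 0.001\n')

# names=True tells genfromtxt to take the column names from the first
# line, stripping the leading '#' comment character automatically.
data = np.genfromtxt('example.dat', names=True)
print(data.dtype.names)   # columns as named in the header line
print(data['xi_plus'])    # access a column by name
```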
Am I going to break anything if I merge this into #39?
Hmm. I don't think so.
I was going to wait for #39 to be merged and then merge this branch into #43, but merging directly into #39 would work too...
I’ve been using #39 for the tests of the star-star correlation function in the real data (so they will be fast) and I need the output tables from that, so I’ll merge it in there.
Just to confirm a few things:
I am doing commands of this form:
/home/rachel.mandelbaum/git/Stile/bin/StileVisit.py $SUPRIME_DATA_DIR --id visit=1214 --rerun ssp1-cosmos-20140609:rachel.mandelbaum/StileTests -c "sys_tests.names=['StarXStarShear']" --clobber-config
on tiger3. Should I assume it's using some default star selection? What is the default? In hsc/base_tasks.py, I see bright_star_sn_cutoff = 50, but it's not clear to me whether that's a cutoff used for PSF tests, or for all star-related calculations.
And just to confirm, this uses the observed star moments, not the PSF model moments, right?
I branched this one from #39 so it should have all the stuff in it as of last night, but you can merge back if you like, that's fine.
It's using a default star selection, yes: data['classification.extendedness']==1. The bright star cutoff is for some tests that want, specifically, bright stars, not all stars.
I am actually not sure about the moments--let me check and get back to you.
Um, pretend I wrote ==0 in that last comment...
Okay, pinging @HironaoMiyatake and @TallJimbo here--the star shapes are using shape.sdss, which is the measured shape, not the PSF model shape, as far as I can tell. However, I don't know if they're PSF-corrected shapes or raw moments, in general or for this rerun in particular. Any help from the pipeline folks? :)
So it's using all things classified as extended? I'm not sure that's what we want here, in fact I'm pretty sure it's not what we want. Is there an easy way for me to add some requirement on the S/N?
shape.sdss shapes are measured, and they are the raw moments. You can't really PSF-correct star shapes using the moments-based methods.
Sorry, yes, that was a typo, it's classification_extendedness==0--guess GitHub didn't send you the comment where I corrected that...
Hmm. It's not exactly easy, in the sense of being doable from the command line, but it's not hard: in stile/sys_tests.py, go to the StarXStarShearSysTest and change the line objects_list = ['star'] to objects_list = ['star bright']. It's line 553, I think. star bright uses a PSF S/N cutoff, unlike star: the cutoff is bright_star_sn_cutoff, set in the config, which you can change from the command line.
No, I did get that update; I just for some reason omitted the "not"--I meant "not classified as extended".
I'm fine with the default bright_star_sn_cutoff, will run it after changing it to use only bright stars and see what happens.
Oh, okay. The star bright one will still use classification_extendedness as well as the S/N--is that all right?
Yes, that's exactly what I want. Compact sources above some S/N instead of all compact sources.
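For the record, a rough sketch of the combined cut being described--the function and column names here are illustrative (the real logic lives in MaskBrightStar in stile/hsc/sys_test_adapters.py):

```python
import numpy as np

def mask_bright_star(extendedness, psf_flux, psf_flux_err, sn_cutoff=50.):
    # Sketch of the combined selection: compact sources
    # (extendedness == 0) that also exceed the S/N threshold,
    # per bright_star_sn_cutoff.
    star_mask = (extendedness == 0)
    sn_mask = (psf_flux / psf_flux_err) > sn_cutoff
    return star_mask & sn_mask

# Toy data: four objects, one extended, one low-S/N.
ext = np.array([0, 0, 1, 0])
flux = np.array([1000., 40., 5000., 600.])
err = np.array([10., 1., 10., 10.])
print(mask_bright_star(ext, flux, err))
```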
It won't run after I change that line:
Traceback (most recent call last):
File "/home/rachel.mandelbaum/git/Stile/bin/StileVisit.py", line 5, in <module>
VisitSingleEpochStileTask.parseAndRun()
File "/tigress/HSC/products-20130212/Linux64/pipe_base/HSC-3.0.0/python/lsst/pipe/base/cmdLineTask.py", line 353, in parseAndRun
resultList = taskRunner.run(parsedCmd)
File "/tigress/HSC/products-20130212/Linux64/pipe_base/HSC-3.0.0/python/lsst/pipe/base/cmdLineTask.py", line 157, in run
resultList = mapFunc(self, self.getTargetList(parsedCmd))
File "/home/rachel.mandelbaum/git/Stile/stile/hsc/base_tasks.py", line 669, in __call__
result = task.run(*args)
File "/home/rachel.mandelbaum/git/Stile/stile/hsc/base_tasks.py", line 758, in run
temp_mask_list = [sys_test.getMasks(catalog, self.config) for catalog in catalogs]
File "/home/rachel.mandelbaum/git/Stile/stile/hsc/sys_test_adapters.py", line 131, in getMasks
return [mask_func(data, config) for mask_func in self.mask_funcs]
File "/home/rachel.mandelbaum/git/Stile/stile/hsc/sys_test_adapters.py", line 47, in MaskBrightStar
star_mask = MaskStar(data)
TypeError: MaskStar() takes exactly 2 arguments (1 given)
Oh, aie. Not sure what branch you're on now, so I'll let you change it, but the line that failed should be star_mask = MaskStar(data, config) instead.
And, ah. I read "all things classified as extended" and thought you wanted to change the "classified as extended" and not the "all", sorry!
I’m doing everything on #39.
Is that line change something that is particular to what I’m trying to do? Or is that a general bug that I should commit as part of that PR?
General bug.
> Okay, pinging @HironaoMiyatake and @TallJimbo here--the star shapes are using shape.sdss which is the measured shape, not the PSF model shape, as far as I can tell. However, I don't know if they're PSF-corrected shapes or raw moments, in general or for this rerun in particular. Any help from the pipeline folks? :)
It's not corrected for PSF in general. shape.hsm is corrected for PSF.
(typos are corrected on a GitHub web page)
Hmm, okay. Should we be switching to using that for galaxies? Not for stars, I guess, as per @rmandelb's comments on PSF-correcting star shapes, which would mean I need to mess with the shape-calculating routines. If you guys think so I'll open another issue.
Melanie, when I do this, I get
/home/rachel.mandelbaum/git/Stile/stile/hsc/sys_test_adapters.py:49: RuntimeWarning: invalid value encountered in greater
config.bright_star_sn_cutoff)
Is this an ignorable warning? Is it because some of the psf flux values or psf flux error values are NaN or something like that?
Also, I have a question about the docstring for MaskBrightStars: it says " Right now this is set to be the upper 10% of stars in a given sample." Is this out of date? It appears to use a hard cutoff on S/N, not a percentile.
I will push a fix for the bug but please confirm if I should fix this docstring too.
Yes, the docstring should be fixed.
I'm not sure about that warning--I'll go try to reproduce it and find out...
FWIW, it's hanging for an awfully long time after that. I would expect this test to take less time than when using all stars, since it's correlating a lot fewer objects, but it's just been sitting there for a long time after emitting the warning...
> Should we be switching to using that for galaxies?
Switching to using what? Sorry, there are too many messages and I'm not sure to what that refers.
Sorry, shape.hsm (the PSF-corrected shape measurements).
Yes, we should use shape.hsm for galaxies.
Yes, we should only ever use shape.hsm for galaxies. It doesn’t make sense to do any of these correlations for galaxies in terms of non-PSF corrected shapes.
But the stars should still be shape.sdss?
Yes. You cannot run this PSF correction routine on stars. It explodes if you do that.
Okay. Will open an issue for this (and refrain from making supernova jokes...)
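So the convention we've converged on, as a one-line sketch (the helper name is made up, just to summarize the rule):

```python
def shape_column(is_star):
    # Stars keep the raw-moment shapes: the PSF-correction routine
    # cannot be run on stars, so shape.hsm is for galaxies only.
    return 'shape.sdss' if is_star else 'shape.hsm'

print(shape_column(True))   # star -> raw moments
print(shape_column(False))  # galaxy -> PSF-corrected
```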
Rachel, yes, that error is due to a nan somewhere in the data, and it gets masked out, so not a problem.
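For anyone else hitting this: the behavior is easy to reproduce with plain numpy. A NaN in the S/N column compares False against the cutoff (and, on the numpy versions in use here, the comparison emits the "invalid value encountered in greater" RuntimeWarning), so the NaN objects are simply dropped:

```python
import numpy as np

# A mock S/N column with a NaN (e.g. from a bad PSF flux error).
sn = np.array([120., np.nan, 30., 80.])

# On older numpy this comparison emits
# "RuntimeWarning: invalid value encountered in greater";
# either way the NaN entry compares False and is masked out.
mask = sn > 50.
print(mask)
```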
This was going more slowly because the bright-star checker hadn't yet been converted over to using the get-the-schema-ahead-of-time method. I just did that, so if you pull the branch it should go faster.
Woo, sorry that I missed that fix!
Okay, thanks!
It works quickly now, thanks!
Going back to this:
> A quick question I thought I'd put here for public input--in private communication @rmandelb pointed out that the correlation function plots don't currently put units on the x-axis. We could use the kwarg provided to the getCF() method and save it as a class attribute that can be accessed by the plot() method. However, you can use the same class instance for multiple correlation functions with different units, so if you generated a bunch of CFs and then plotted them all at once you'd only get the unit of the most recent CF. I think the best plan is to save the unit for later use, but allow an optional kwarg to override the default behavior, and just document that that's what we're doing. Does that sound reasonable to everyone?
I have an even stupider question, and sorry if you already answered this in all the e-mails that have gone back and forth:
Does this mean that the distance calculations might be done wrong since the units are allowed to change?
Well, it would be done wrong if you specified it wrong. But everything's calculated at the time you call the object; it doesn't persist from call to call. So if you call getCF() with something that wants units in degrees, it will calculate the full correlation function; then if you call getCF() again with something that wants units in arcmin, it doesn't remember that the previous call was in degrees, and just calculates everything in arcmin. Each call to getCF() is like a whole new run of TreeCorr.
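In other words, something like this (a toy sketch of the statelessness, not the actual Stile class):

```python
class CFRunner:
    """Each getCF() call is independent, like a fresh TreeCorr run."""

    def getCF(self, sep_units):
        # The configuration is built from scratch on every call;
        # nothing about a previous call's units carries over.
        config = {'sep_units': sep_units}
        self._last_units = sep_units  # kept only for plot labeling
        return config

runner = CFRunner()
a = runner.getCF('degrees')
b = runner.getCF('arcmin')
print(a['sep_units'], b['sep_units'])  # two independent runs
```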
I think this change makes sense.
Hi @HironaoMiyatake and @msimet -
While I've been busy with teaching I've lost track of what functionality is / is not present.
For the systematics requirements calculation, I would like to calculate the autocorrelation function of star shapes as a function of separation.
I would like this for single exposures (NOT coadd), using the entire FOV. Ideally, it would be nice to see this for a few different pointings, which could give us a bunch of measurements (perhaps a few exposures per pointing?).
Do we have what we need for this? I believe we had defined a fiducial set of exposures for PSF tests, and I would like to use that set of exposures even though this isn't a PSF test (since it's just the star shapes, not the model residuals).