Closed — fjaviersanchez closed this issue 6 years ago
I have analysis scripts that need to be updated, including a catalog matcher that could be generally useful; they could be modified and copied here:
https://github.com/cwwalter/DC1_Analysis/tree/master/Scripts
It might be nice to update and combine these so there is a program people can modify to get the output they want, and that also produces a source-matched catalog people can use. Current issues with them:
1) They point to old files that don’t exist anymore
2) Process-Dataframe basically produces a reduced dataframe, but it also contains a lot of code to bit-unpack the analysis flags from the FLAGS variables so that we had access to them. That is no longer necessary, since Jim now unpacks all the flags using DM and they are all available.
3) Prepare-Truth makes an HDF5 file with a star and a galaxy dataframe and deals with the fact that the disks and bulges are entered separately in the dataframe for each galaxy. I combine them, calculating the combined magnitude etc. for later matching. However, it is not done very well: ideally you would keep all of the original component magnitudes and Sérsic indices. What I did is a bit of a hack, and (as I recall) you get just the combined magnitudes and only one of the indices even if there are two components.
4) Match-Catalog uses astropy to match objects to the combined star and galaxy catalog, and I think it works well.
I gave these to Jim before, since one of Risa's students was interested in them. So it is possible some work has already been done on these, but I don't know.
We have several notebooks:
I think the two in the directory https://github.com/LSSTDESC/SSim_DC1/tree/master/Notebooks/Dask are good for just reading the data. We can add more plots to the Validation notebook.
This notebook should also be updated and might be useful for people who want to use the butler: https://github.com/LSSTDESC/SSim_DC1/blob/master/scripts/QA_general/Test_DC1.ipynb
It might be interesting to write a small script that gets a CatSim instance catalog and generates a small imSim image.
Instructions on how to set up a kernel to use the DM stack can be found here: https://github.com/LSSTDESC/Monitor/blob/master/doc/jupyter-dev.md
@humnaawan just set up an ipython remote session so maybe she can comment on how to do that as well.
The data products from the stack are in /global/cscratch1/sd/descdm/DC1/rerun/
and the dask dataframes are in /global/projecta/projectdirs/lsst/groups/SSim/DC1
@fjaviersanchez I was using ipython with port forwarding; the details are nicely described here.
Butler access demo.ipynb works out of the box.
I think the existing notebooks have all been validated with a current version of the LSST DESC Jupyter-dev kernel.
I think this is done thanks to the amazing work of @kadrlica!
The notebooks in the repo are outdated. It would be good to update them to point to the correct paths and to remove unnecessary code (for example, the flags unpacking, which should now be deprecated).