Closed by danielsf, 6 years ago
It looks like those two scripts write to ${SIMS_DATA_DIR}. Would it make sense to have the execution of both of those scripts be part of the build or install scripts for the sims distribution? I'm worried that this is essentially requiring a build step that needs to be remembered and done by hand every time we build or install a new sims stack.
Regarding the scripts writing to `$SIMS_DATA_DIR`: I did not make running them part of the official Project installation of `lsst_sims` because they download about 2 GB of data, and not everyone who installs `lsst_sims` will need these light curve templates. We could certainly make them a part of the DESC installation process without any trouble.
Depending on how we are actually handling the DESC installation, it may not be a problem. `sims_data` moves very slowly, so, if we really are just using `eups distrib install`, I would expect every stack we have installed in the past year to point to the same `$SIMS_DATA_DIR`. That should also be true if we are using `lsstsw` (as long as we don't get into a case where we have to re-clone `lsstsw` and start the installation from zero). In that case, we would only need to run these scripts once for all time. If we end up using Docker images, then yes: the person building the Docker images will have to remember to run these scripts as part of the process.
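If we do want to guard against the "forgotten build step" problem, one option is a small idempotent wrapper that skips the ~2 GB download when the templates are already in `$SIMS_DATA_DIR`. This is only a sketch: the marker filename is hypothetical, not something `sims_catUtils` actually writes.

```shell
# Hypothetical idempotent helper: run the two download scripts only if a
# marker file is not already present in $SIMS_DATA_DIR. The marker name
# (.light_curves_downloaded) is an illustration, not part of sims_catUtils.
ensure_light_curves() {
    local marker="${SIMS_DATA_DIR}/.light_curves_downloaded"
    if [ -f "${marker}" ]; then
        echo "templates already present; skipping download"
    else
        "${SIMS_CATUTILS_DIR}/support_scripts/get_kepler_light_curves.sh"
        "${SIMS_CATUTILS_DIR}/support_scripts/get_mdwarf_flares.sh"
        touch "${marker}"
    fi
}
```

A wrapper like this could be called from any of the installation paths above (eups, `lsstsw`, or a Dockerfile) without re-downloading on subsequent installs that share the same `$SIMS_DATA_DIR`.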
This PR turns on stellar variability in DC2. There are two incidental consequences of this:
1) You will need to run the scripts `get_kepler_light_curves.sh` and `get_mdwarf_flares.sh` in `$SIMS_CATUTILS_DIR/support_scripts` to download the light curves underlying those two models.

2) The actual code that generates the InstanceCatalogs will load these files into memory to speed up running. This will increase the memory footprint of InstanceCatalog generation to between 6.5 GB and 7.5 GB total (this is the sum of the `VIRT` and `RES` columns in `top`). Let me know if that is a problem.
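For anyone who wants to check that footprint on their own InstanceCatalog generation job, here is a hedged sketch using `ps`, whose `vsz`/`rss` columns report the same virtual and resident sizes that `top` labels `VIRT` and `RES` (the PID of the generation process is up to the user to supply):

```shell
# Print the virtual size, resident size, and their sum (in kB) for a PID,
# mirroring the VIRT + RES figure quoted from `top`.
mem_report() {
    ps -o vsz=,rss= -p "$1" | \
        awk '{printf "VIRT=%skB RES=%skB total=%skB\n", $1, $2, $1 + $2}'
}
```

Note that `VIRT` counts all mapped address space, so `VIRT + RES` double-counts resident pages; it is reproduced here only because it is the measure quoted above.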