This PR does a few main things:

- Makes it possible to write out your lightcone after each redshift is computed, making checkpointing easier. The motivation here is that while you can always write out *everything*, so that re-running the lightcone skips the already-written boxes, that can take up a huge amount of disk space. Instead, it would be nice to maintain only the latest two files of each output kind, "rolling" the cache as it were. I wrote a hook to do this (also in this PR), but even with that there was no way of "going back" and getting the interpolated lightcone slices. So I re-arranged a bit of the `run_lightcone` code so that it can identify a lightcone save file and, if one exists, read partial information from it.
- Fixes some issues with initialization of the sigma(m) table.
- Adds some new logging statements.
- Fixes an issue where reading an object from a file without reading in the actual data would not correctly register that the data had been computed and was on disk; instead it assumed the data had never been computed and tried to compute it all again.
- By default, when a low-level function (e.g. `initial_conditions()`) finds a cached box, we now DON'T read in all the data, just the basic metadata. Since the box is backed by a file, access to any of the computed arrays will lazily load the data anyway, so this provides a reasonable speedup when the boxes are big; some of the arrays never need to be loaded at all.
- Adds a new `cache_tools.get_boxes_at_redshift` function which makes it easier to load all the cached boxes matching a given set of input parameters and redshift specification.
- Removes the top-level parameters `log10_Mturnover_ave` etc. from `LightCone`, as they should be computed using the `global_quantities` entry to `run_lightcone`.
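The "rolling cache" idea in the first bullet can be sketched as follows. This is a hypothetical, generic version, not the actual hook added in this PR: it assumes cache filenames begin with the output kind (e.g. `PerturbedField_<hash>.h5`) and judges recency by file modification time.

```python
import os
import re
from collections import defaultdict

def roll_cache(direc, keep=2):
    """Keep only the `keep` most recent cache files of each output kind.

    Generic sketch of a 'rolling' cache: group files in `direc` by the
    leading kind in their filename, sort each group newest-first by
    modification time, and delete everything past the first `keep`.
    """
    by_kind = defaultdict(list)
    for name in os.listdir(direc):
        match = re.match(r"([A-Za-z]+)_.*\.h5$", name)
        if match:
            path = os.path.join(direc, name)
            by_kind[match.group(1)].append((os.path.getmtime(path), path))

    removed = []
    for kind, files in by_kind.items():
        files.sort(reverse=True)  # newest first
        for _, path in files[keep:]:
            os.remove(path)
            removed.append(path)
    return removed
```

Keeping two files per kind (rather than one) is what lets a checkpointed run interpolate between the previous and current redshift slices.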
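The lazy-loading behaviour described above (read only metadata up front, pull each array from disk on first access) can be illustrated with a minimal, generic sketch. This is not the py21cmfast implementation; the `LazyBox` class and its JSON-backed file format are inventions for illustration only.

```python
import json

class LazyBox:
    """Sketch of lazy array loading: only metadata lives in memory after
    a 'read', and each array attribute is fetched from disk the first
    time it is accessed, then cached."""

    def __init__(self, path, array_names):
        self.path = path              # on-disk file (stand-in JSON format)
        self._array_names = set(array_names)
        self._cache = {}              # filled lazily, one array at a time

    def __getattr__(self, name):
        # Invoked only when normal attribute lookup fails, i.e. for
        # array fields that have not been materialised yet.
        if name in self._array_names:
            if name not in self._cache:
                with open(self.path) as f:
                    self._cache[name] = json.load(f)[name]
            return self._cache[name]
        raise AttributeError(name)
```

The speedup described in the PR comes from exactly this pattern: constructing the object touches no array data, so arrays that are never accessed are never read from disk.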