Closed steven-murray closed 5 years ago
Ok, yep, I see the issue here. I'll make sure to set up a check for this. I haven't fully tested older versions of the code yet.
This is what is running for me with default parameters. Should this be the case?
The issue is that M_Min is being set, but then it's computed from Tvir_Min (if mass-dependent zeta is not used). Tvir_Min makes M_Min change as a function of redshift, which I guess can then overstep the interpolation table.
I'll need to think of a cleaner way of treating this. Legacy 21CMMC requires Tvir (as in my previous papers), but it seems to have moved to M_Min, even in the absence of a mass-dependent ionising efficiency. That means both M_Min and Tvir should be viable options, and you'll need a condition to select one over the other.
I believe I have fixed this; at least it works on my end. It has resulted in another parameter passed to "FlagOptions", called M_MIN_in_Mass, which allows the user to use mass or virial temperature to define the minimum mass of ionising sources.
It will need a logic test, as USE_MASS_DEPENDENT_ZETA should overrule M_MIN_in_Mass if the latter is False. That is, if USE_MASS_DEPENDENT_ZETA is used, M_MIN_in_Mass must be True. I could do this in C, but I think it is cleaner to do it in Python.
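A minimal sketch of what that Python-side consistency check could look like (the dict-based FlagOptions container and function name here are illustrative, not the actual API):

```python
import warnings


def validate_flag_options(flag_options):
    """Force M_MIN_in_Mass to True when USE_MASS_DEPENDENT_ZETA is set.

    USE_MASS_DEPENDENT_ZETA requires the minimum source mass to be
    defined directly in mass, so M_MIN_in_Mass=False is inconsistent
    with it and gets overridden (with a warning).
    """
    if flag_options.get("USE_MASS_DEPENDENT_ZETA") and not flag_options.get("M_MIN_in_Mass"):
        warnings.warn(
            "USE_MASS_DEPENDENT_ZETA=True requires M_MIN_in_Mass=True; "
            "overriding M_MIN_in_Mass."
        )
        flag_options["M_MIN_in_Mass"] = True
    return flag_options
```

Doing this in the Python wrapper keeps the C code free of input validation, and the warning records the override in the run output.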
Cool, I'll do this in Python, ASAP.
See c315e1525da677ef958fbd7d794135187239d5e3 for how I've written this logic, and make sure you think it's right!
The logic for this is correct.
However, I wonder if it would be useful to have a warning/error instead. For example, the MCMC will start if USE_MASS_DEPENDENT_ZETA = True is set; but if USE_MASS_DEPENDENT_ZETA should have been False instead, it should warn that USE_MASS_DEPENDENT_ZETA = True while M_MIN_in_Mass = False.
The question is, do we bother guarding against all possible sources of user error?
Two things:
Maybe I just don't understand the workings of this, but I feel like if there is only one physically reasonable way of setting the parameters, then it should be done automatically (as is already done). If it's possible, but unlikely, that the parameters could be set in a different configuration (physically), then it should issue a warning and keep going. We can never guard against all possible sources of user error, but each one that does come up we should protect against in some way.
For me, even setting this does not relieve the segfault that I was getting. After a little more investigation, I've found that it occurs in the spin temperature function, at high redshift (possibly also at low redshift, but I haven't tested that) if the perturb field passed is at the same redshift that you want to evaluate. When this is the case, the zpp value can get higher than the redshift value, the integration mass limit goes below M_MIN, and the segfault occurs. Two kinds of solution naturally arise: (i) don't allow the perturb field to be at the same redshift as the spin temp box (seems like a bad solution to me), or (ii) when doing the integration itself, check whether the mass is smaller than M_MIN, and if so, return zero.
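Option (ii) amounts to clamping the integrand at the table boundary. A hedged Python sketch of the idea (the real integrand lives in the C code; dndm here is just a stand-in for the mass-function lookup):

```python
def safe_dndm(m, m_min, dndm):
    """Evaluate the mass function, returning zero below the interpolation
    table's minimum mass so out-of-range lookups can never occur.

    m     -- mass at which the integrand is evaluated
    m_min -- minimum mass covered by the interpolation table
    dndm  -- callable giving the mass function above m_min
    """
    if m < m_min:
        return 0.0
    return dndm(m)
```

Since the mass function contributes negligibly below the minimum source mass anyway, returning zero there changes nothing physically while making the integral safe.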
My responses:
I agree. If the user manages to set USE_MASS_DEPENDENT_ZETA = True and M_MIN_in_Mass = False, then issue a warning that M_MIN_in_Mass is being set to True, as it cannot be otherwise. I'm also thinking it might be useful, at the start of the MCMC, for it to output a log summarising all the user-defined options.
Ok, I'll look into this one. I don't think I have been running that configuration (as I am always running the configuration that mimics 21cmFAST, for testing purposes).
So the thing is, because of default parameters, if the situation arises that USE_MASS_DEPENDENT_ZETA=True and M_MIN_in_Mass=False, it is most likely because the user simply specified the first one as True and forgot about the second. Emitting a warning here would be okay, but probably not necessary, as the user probably doesn't care -- they weren't thinking about the M_MIN_in_Mass parameter. In this case it would be "best" (I think) if it just silently gets set to the only physically plausible value. Of course, if the user really did try to set them both to these weird values, it would then be best to emit the warning; but trying to capture whether it was really user-set, rather than simply left as default, is probably slightly overkill, in my opinion.

I think you're right about a log summarising the options. At the moment, in principle, the options can be determined by reading in the saved HDF5 file. However, most people probably just want a simple log that tells them all the options they used. I can write that.
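Such a run-start summary could be as simple as flattening each option group into one line per parameter and sending those lines to the logger (function and parameter names here are illustrative, not the actual 21CMMC API):

```python
def format_run_options(**option_structs):
    """Build a human-readable summary of every user-facing option group
    (flag options, astro params, ...) to emit at the start of an MCMC run,
    so the configuration is recorded without reading the HDF5 file back in.
    """
    lines = []
    for name, opts in option_structs.items():
        for key, value in sorted(opts.items()):
            lines.append(f"{name}.{key} = {value!r}")
    return lines


# Example usage: print (or log) the summary before the run starts.
for line in format_run_options(
    flag_options={"USE_MASS_DEPENDENT_ZETA": True, "M_MIN_in_Mass": True},
):
    print(line)
```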
Thanks, yeah, I ran into it when specifically testing the spin_temperature function, which has this behaviour by default (whereas run_lightcone has the option of not doing it this way, which is usually what you run).
Ok, the segmentation fault issue should be fixed now.
Excellent, seems to work well!
I am currently getting segfaults, only for spin temperature, which seem to arise from the integral for dndm using m values below the M_Min set for initialiseSigmaMInterp().
The minimum m for the interpolation table seems to be astro_params->M_TURN, while the minimum mass for integration seems to be FMAX(TtoM(zpp_grid, astro_params->X_RAY_Tvir_MIN, mu_for_Ts), M_MIN_WDM), which I have no idea about. It doesn't seem like there's a check in there to make sure the latter isn't smaller than the former, and I can't tell whether that must be true by construction. I'm sure it's probably some code that I've changed Python-side that is causing this error, but it would be good to dig into it a bit.
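One way to make this mismatch fail loudly rather than segfault would be to compare the integration lower limit against the table minimum before integrating. A hypothetical Python-level sketch (the actual quantities, FMAX(TtoM(...), M_MIN_WDM) and astro_params->M_TURN, live in the C code):

```python
def integration_lower_limit(m_from_tvir, m_min_wdm, table_m_min):
    """Mimic the C-side FMAX(TtoM(...), M_MIN_WDM) lower limit and check
    it against the interpolation table's minimum mass (M_TURN in the C
    code), raising a clear error instead of silently reading off-table.
    """
    m_lower = max(m_from_tvir, m_min_wdm)
    if m_lower < table_m_min:
        raise ValueError(
            f"integration lower limit {m_lower:g} is below the "
            f"interpolation-table minimum {table_m_min:g}"
        )
    return m_lower
```

Even if the condition turns out to hold by construction in correct configurations, an explicit check like this turns a segfault into a diagnosable error message.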