cosmodesi / desilike

BSD 3-Clause "New" or "Revised" License

planck2018_gaussian + neutrino mass doesn't work? #31

Closed pmcdonal closed 7 months ago

pmcdonal commented 8 months ago

I seem to be having a problem here with self.cosmo["m_ncdm"] returning a list, i.e., [0.052306970895158955] ... https://github.com/cosmodesi/desilike/blob/a533f2ae958028f81f31d52b0e05672fb0f31b6b/desilike/likelihoods/cmb/planck2018_gaussian.py#L191 I'm not sure this is very useful for neutrino mass anyway, so it's not important to fix, but please tell me if this seems likely to be my fault rather than a bug...

adematti commented 8 months ago

Ah, good catch! I think you can replace 'mnu': 'm_ncdm' (the list of neutrino masses) with 'mnu': 'm_ncdm_tot' (the scalar sum of the neutrino masses) in https://github.com/cosmodesi/desilike/blob/a533f2ae958028f81f31d52b0e05672fb0f31b6b/desilike/likelihoods/cmb/planck2018_gaussian.py#L9
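To illustrate the distinction, here is a pure-Python sketch (a hypothetical dict standing in for the cosmoprimo cosmology object; the list value is the one from the report above): 'm_ncdm' is a per-species list of neutrino masses, while 'm_ncdm_tot' is the scalar sum that can be compared to a single 'mnu' parameter.

```python
# Hypothetical stand-in for the cosmology object, not cosmoprimo itself:
# 'm_ncdm' is a per-species list even with a single massive neutrino,
# while 'm_ncdm_tot' is the scalar sum the Gaussian likelihood needs.
cosmo = {'m_ncdm': [0.052306970895158955]}   # what the likelihood received
cosmo['m_ncdm_tot'] = sum(cosmo['m_ncdm'])   # the scalar it can actually use

print(cosmo['m_ncdm'])      # a list
print(cosmo['m_ncdm_tot'])  # a plain float
```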

pmcdonal commented 8 months ago

I'm not sure which call reaches that line, where what you say may be true, but for me just changing my specified parameter to m_ncdm_tot worked... although that raises another question: is the approximation here to compute the mean and covariance of the chain and take those to define a Gaussian? That might be reasonable in ~well-constrained cases, but for our neutrino situation it clearly has no hope. I.e., because of the prior mnu > 0, the chain mean is ~0.09, but the Gaussian then treats that as the best fit, which completely misrepresents the situation!

With the chain, though, it seems we could do a maximum-likelihood fit of the Gaussian as a density approximation. I.e., the "mean" would presumably come out negative, but that isn't a problem if this is only intended as a parameterized representation of the chain for positive mass... I might play with this sometime as something generally useful, although first I will probably just try using the full Planck likelihood...
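A minimal sketch of that maximum-likelihood idea, assuming nothing about desilike itself (all names and numbers are illustrative): fit a Gaussian to a mock mnu > 0 chain by maximizing the truncated-normal likelihood, so the fitted "mean" is free to come out negative.

```python
import numpy as np
from scipy import optimize, stats

# Mock "chain" for mnu: a Gaussian with negative mean, truncated by the
# mnu > 0 prior -- mimicking a posterior that piles up against zero.
rng = np.random.default_rng(0)
mu_true, sigma_true = -0.05, 0.1
raw = rng.normal(mu_true, sigma_true, size=100_000)
samples = raw[raw > 0]

def negloglike(params, x):
    """Negative log-likelihood of a Gaussian truncated to x > 0."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parameterize log(sigma) so sigma stays positive
    # log N(x; mu, sigma) minus the log of the truncation normalization P(X > 0)
    return -(stats.norm.logpdf(x, mu, sigma) - stats.norm.logsf(0.0, mu, sigma)).sum()

res = optimize.minimize(negloglike,
                        x0=[samples.mean(), np.log(samples.std())],
                        args=(samples,), method='Nelder-Mead')
mu_fit, sigma_fit = res.x[0], np.exp(res.x[1])
# mu_fit comes out negative: a perfectly good parameterized representation of
# the chain on mnu > 0, unlike the naive (mean, covariance) of the chain samples.
```

Here (mu_fit, sigma_fit) land close to the generating (-0.05, 0.1), whereas the plain chain mean is positive, which is exactly the failure mode described above.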

adematti commented 8 months ago

Yes, definitely, the Gaussian approximation will fail for mnu, and we should do something like you describe... May I ask what you're aiming at? (We are using cobaya for the 1st cosmo paper, to make everyone's life easier. We do plan to have autodiff Planck likelihoods for the 2nd paper, and to run inference within desilike, but it's not there yet --- in 1-2 months or so.)

pmcdonal commented 8 months ago

... I was only running it because it appeared in an example I was given... although it would have been nice if it were sufficient, because I still haven't figured out how to get the full Planck likelihood to work. At NERSC it crashes on self._camb_params.DoLensing = self['lensing'] in cosmoprimo/camb.py, receiving self['lensing'] = 1.0 and trying to use the float as an index. Since I'm writing, a minimal example:

from desilike.likelihoods import cmb as cmb
from desilike.theories import Cosmoprimo
cosmo = Cosmoprimo(fiducial='DESI', engine="camb")
likelihood = cmb.TTHighlPlanck2018PlikLiteLikelihood(cosmo=cosmo)
likelihood()

pmcdonal commented 8 months ago

In general though, it seems it would be nice to be able to fit simple models to chain posteriors, e.g., in order to go beyond Gaussian for BAO distance parameters...

adematti commented 7 months ago

If you want to design a beyond-Gaussian surrogate for the BAO likelihoods, please do, and I'll merge it into desilike! Coming back to your issue:

from desilike.likelihoods import cmb as cmb
from desilike.theories import Cosmoprimo
cosmo = Cosmoprimo(fiducial='DESI', engine="camb")
likelihood = cmb.TTHighlPlanck2018PlikLiteLikelihood(cosmo=cosmo)
likelihood()

It was due to bad programming and should be fixed now (please update desilike and pyclass).
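For context, the kind of one-line defensive cast that addresses the reported crash (a hypothetical sketch, not the actual desilike/cosmoprimo patch): CAMB's DoLensing flag is a boolean, but the value arrives from the parameter vector as the float 1.0.

```python
# Hypothetical sketch, not the actual patch: cast the sampled float to a bool
# before assigning it to a flag like CAMBparams.DoLensing, which cannot take 1.0.
lensing = 1.0                    # what self['lensing'] delivered in the traceback
do_lensing = bool(int(lensing))  # explicit cast; 1.0 -> True, 0.0 -> False
```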