CobayaSampler / cobaya

Code for Bayesian Analysis
http://cobaya.readthedocs.io/en/latest/

sigma8 from CAMB should be zero when WantTransfers=False #201

Closed: cmbant closed this issue 3 years ago

cmbant commented 3 years ago

Looks like some things that should be zero (sigma8 when WantTransfer=False) are coming out as O(1e-310) in the chain files, cf. https://github.com/CobayaSampler/cobaya/issues/200

JesusTorrado commented 3 years ago

Are they being stored in the Collection DataFrame as 0 but saved to txt as 1e-300, or given as 1e-300 by CAMB (and then simply passed through unaltered by Cobaya)?
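
One quick way to rule out the first option, as a sketch (using pandas and numpy to mimic a value round-tripping through a plain-text chain file): an exact 0 written to text comes back as 0, so a 1e-300 in the txt would have to be nonzero in memory already.

import io

import numpy as np
import pandas as pd

# Illustrative check: writing an exact 0.0 to text and reading it back cannot
# manufacture a tiny nonzero number, so a 1e-300 in the chain txt would imply
# the value was already nonzero in memory (i.e. it came from CAMB).
df = pd.DataFrame({"sigma8": [0.0]})
buf = io.StringIO()
df.to_csv(buf, sep=" ", index=False)
buf.seek(0)
print(np.loadtxt(buf, skiprows=1))  # -> 0.0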

cmbant commented 3 years ago

I'm not sure; I've not tried to reproduce it.

JesusTorrado commented 3 years ago

@louisl3grand can you help us create a minimal working example to reproduce the bug, please?

If it's just Cobaya passing a very small number from CAMB to GetDist, I would say this is not a Cobaya issue: on the one hand, CAMB should return 0 where it is meant to return 0, and on the other, GetDist should be more resilient to nearly-zero values. I don't think it makes much sense for Cobaya to redefine machine precision arbitrarily with something like if abs(x) < 1e-100: x = 0, especially for a number that is just passing through.

louisl3grand commented 3 years ago

Hi @JesusTorrado

I was able to reproduce the bug with this input.yaml file.

theory:
  camb:
    extra_args:
      WantTransfer: False
      halofit_version: mead
      bbn_predictor: PArthENoPE_880.2_standard.dat
      lens_potential_accuracy: 1
      num_massive_neutrinos: 1
      nnu: 3.046
      theta_H0_range:
      - 20
      - 100
likelihood:
  planck_2018_lensing.native: null
params:
  logA:
    prior:
      min: 1.61
      max: 3.91
    ref:
      dist: norm
      loc: 3.05
      scale: 0.001
    proposal: 0.001
    latex: \log(10^{10} A_\mathrm{s})
    drop: true
  As:
    value: 'lambda logA: 1e-10*np.exp(logA)'
    latex: A_\mathrm{s}
  ns:
    prior:
      min: 0.8
      max: 1.2
    ref:
      dist: norm
      loc: 0.965
      scale: 0.004
    proposal: 0.002
    latex: n_\mathrm{s}
  theta_MC_100:
    prior:
      min: 0.5
      max: 10
    ref:
      dist: norm
      loc: 1.04109
      scale: 0.0004
    proposal: 0.0002
    latex: 100\theta_\mathrm{MC}
    drop: true
    renames: theta
  cosmomc_theta:
    value: 'lambda theta_MC_100: 1.e-2*theta_MC_100'
    derived: false
  H0:
    latex: H_0
    min: 20
    max: 100
  ombh2:
    prior:
      min: 0.005
      max: 0.1
    ref:
      dist: norm
      loc: 0.0224
      scale: 0.0001
    proposal: 0.0001
    latex: \Omega_\mathrm{b} h^2
  omch2:
    prior:
      min: 0.001
      max: 0.99
    ref:
      dist: norm
      loc: 0.12
      scale: 0.001
    proposal: 0.0005
    latex: \Omega_\mathrm{c} h^2
  omegam:
    latex: \Omega_\mathrm{m}
  omegamh2:
    derived: 'lambda omegam, H0: omegam*(H0/100)**2'
    latex: \Omega_\mathrm{m} h^2
  mnu: 0.06
  omega_de:
    latex: \Omega_\Lambda
  YHe:
    latex: Y_\mathrm{P}
  Y_p:
    latex: Y_P^\mathrm{BBN}
  DHBBN:
    derived: 'lambda DH: 10**5*DH'
    latex: 10^5 \mathrm{D}/\mathrm{H}
  tau:
    prior:
      min: 0.01
      max: 0.8
    ref:
      dist: norm
      loc: 0.055
      scale: 0.006
    proposal: 0.003
    latex: \tau_\mathrm{reio}
  zre:
    latex: z_\mathrm{re}
  sigma8:
    latex: \sigma_8
  s8h5:
    derived: 'lambda sigma8, H0: sigma8*(H0*1e-2)**(-0.5)'
    latex: \sigma_8/h^{0.5}
  s8omegamp5:
    derived: 'lambda sigma8, omegam: sigma8*omegam**0.5'
    latex: \sigma_8 \Omega_\mathrm{m}^{0.5}
  s8omegamp25:
    derived: 'lambda sigma8, omegam: sigma8*omegam**0.25'
    latex: \sigma_8 \Omega_\mathrm{m}^{0.25}
  A:
    derived: 'lambda As: 1e9*As'
    latex: 10^9 A_\mathrm{s}
  clamp:
    derived: 'lambda As, tau: 1e9*As*np.exp(-2*tau)'
    latex: 10^9 A_\mathrm{s} e^{-2\tau}
  age:
    latex: '{\rm{Age}}/\mathrm{Gyr}'
  rdrag:
    latex: r_\mathrm{drag}
sampler:
  mcmc:
    drag: true
    oversample_power: 0.4
    proposal_scale: 1.9
    covmat: auto
    Rminus1_stop: 0.01
    Rminus1_cl_stop: 0.2

And then evaluating the likelihood at one point of parameter space in a notebook:

import numpy as np

from cobaya.yaml import yaml_load
from cobaya.model import get_model

path = 'input.yaml'

with open(path) as f:
    info = yaml_load(f)

model_test = get_model(info)

params = list(model_test.parameterization.sampled_params().keys())

fiducial = {
    'logA': np.log(2.119631e-09 * 1e10),
    'ns': 0.9636852,
    'H0': 67.01904,
    'theta_MC_100': 0.010407224216796775 * 100,
    'ombh2': 0.02216571,
    'omch2': 0.1202944,
    'tau': 0.06018107,
    'A_planck': 1,
}

model_test.logposterior([fiducial[par] for par in params])

Setting WantTransfer: True gives correct values of sigma8, while WantTransfer: False gives wrong values.
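
For reference, the bad value can be inspected directly from the return of logposterior (a sketch, assuming a recent Cobaya where logposterior returns a LogPosterior object whose derived attribute is ordered as in parameterization.derived_params()):

# Sketch (API details as assumed above): map derived-parameter names to the
# values computed at this point and print sigma8.
lp = model_test.logposterior([fiducial[par] for par in params])
derived = dict(zip(model_test.parameterization.derived_params(), lp.derived))
print(derived['sigma8'])  # O(1e-310) with WantTransfer: False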

JesusTorrado commented 3 years ago

Thanks, @louisl3grand

It looks like the number comes directly from a call to a CAMB function (see [2]).

Anyway, I see what's happening now: when sigma8 is requested, Cobaya sets WantTransfer = True (see [1]), but this is later overridden by extra_args, so CAMB is asked for a quantity that it has not computed.
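
Schematically (an illustrative sketch, not the actual camb.py logic):

# Illustrative sketch of the override described above:
camb_args = {}
camb_args["WantTransfer"] = True           # set by cobaya: sigma8 requested
camb_args.update({"WantTransfer": False})  # user extra_args applied last win
# CAMB now runs without computing transfer functions, so the sigma8 slot that
# cobaya reads back was never filled, giving garbage of order 1e-310.
print(camb_args["WantTransfer"])  # -> False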

IMO, we should do two things:

@cmbant do you agree, at least with the Cobaya part?

[1] https://github.com/CobayaSampler/cobaya/blob/a97a6945f63763673abeb28e58ad1f634487314a/cobaya/theories/camb/camb.py#L441
[2] https://github.com/cmbant/CAMB/blob/dec68f121844a0940fadb828c54fc995241095f8/camb/results.py#L847

cmbant commented 3 years ago

In CosmoMC, for fast testing, it can be useful to turn off transfers and just get sigma8=0 in the outputs, but I'd agree this is probably most naturally done in Cobaya by not requesting sigma8 (or related quantities). I'm not sure why CAMB isn't producing zero (single-to-double conversion would give something larger than 1e-310?).
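
(For what it's worth, a quick numpy check supports that: float32 bottoms out around 1.4e-45, so no single-to-double conversion can yield a nonzero double of order 1e-310; that magnitude is a float64 subnormal, more consistent with an uninitialized value.)

import numpy as np

# A float32 -> float64 conversion can never yield a nonzero value below the
# smallest float32 subnormal (~1.4e-45), while 1e-310 lies in the float64
# subnormal range, below the smallest normal double (~2.2e-308).
# (smallest_subnormal requires numpy >= 1.22.)
print(np.finfo(np.float32).tiny)                # ~1.18e-38, smallest normal
print(np.finfo(np.float32).smallest_subnormal)  # ~1.4e-45
print(np.finfo(np.float64).tiny)                # ~2.2e-308
print(np.finfo(np.float64).smallest_subnormal)  # ~4.9e-324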

JesusTorrado commented 3 years ago

In CosmoMC, for fast testing, it can be useful to turn off transfers and just get sigma8=0 in the outputs, but I'd agree this is probably most naturally done in Cobaya by not requesting sigma8 (or related quantities).

Do you mean e.g. assigning 0 in [1] if extra_args["WantTransfer"] is False? Would that work for you?

[1] https://github.com/CobayaSampler/cobaya/blob/a97a6945f63763673abeb28e58ad1f634487314a/cobaya/theories/camb/camb.py#L564
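
(As a sketch, the suggestion would amount to something like the following in the wrapper's derived-parameter extraction; the function name and the fallback logic here are illustrative assumptions, not the actual camb.py code:)

# Illustrative sketch of the proposed workaround (names are assumptions):
def derived_sigma8(results, extra_args):
    if not extra_args.get("WantTransfer", True):
        return 0.0                  # transfers disabled: define sigma8 as 0
    # get_sigma8_0 is the sigma8(z=0) accessor in recent CAMB versions:
    return results.get_sigma8_0()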

cmbant commented 3 years ago

I think CAMB should just return zero anyway; I can try to check why it doesn't.

JesusTorrado commented 3 years ago

I agree: the less post-processing is needed in Cobaya's CAMB interface, the better.

Let's close this one together with the next CAMB minor release.

cmbant commented 3 years ago

I added an assertion check for using sigma8 when WantTransfer is False; hopefully this should raise an error. (The assertion is consistent with some other WantTransfer checks elsewhere.)

https://github.com/cmbant/CAMB/commit/d344004b7f68b1aa6a5a862d79cff20d25ee5c52
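
(Paraphrased, the guard amounts to something like this; a toy stand-in, with the actual check living in the CAMB commit linked above:)

# Paraphrased sketch of the new guard, not the committed code:
class ResultsSketch:
    def __init__(self, want_transfer, sigma8=0.8):
        self.want_transfer = want_transfer
        self._sigma8 = sigma8

    def get_sigma8_0(self):
        # Fail loudly instead of returning an uninitialized value:
        assert self.want_transfer, \
            "sigma8 requested but WantTransfer is False (transfers not computed)"
        return self._sigma8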

JesusTorrado commented 3 years ago

OK. But this way CAMB does not return 0, but fails, right? So the CosmoMC functionality you mentioned is not preserved (though it is of limited use, since it would only make sense in background-cosmology-only cases, e.g. distance ladder, BBN...?).

cmbant commented 3 years ago

Yes, I think it is more consistent for Cobaya this way, since one can easily remove sigma8 from the yaml if it is not wanted.

JesusTorrado commented 3 years ago

Sure. Closing, since this is not critical and will automatically be fixed by the next CAMB release.