LSSTDESC / DESCBiasChallenge


Adding new fisher code #19

Closed Zhiyuan-G closed 2 years ago

Zhiyuan-G commented 3 years ago

This pull request addresses the stability of the Fisher matrix, related to Issue #11. It adds a new Fisher matrix calculation based on a first-derivative expression. The following describes what is changed/added in this branch with respect to the current main branch:

  1. Two scripts are added to the cl_like module:
     - theory_vector.py: This is essentially the same code as cl_like.py, with changes that explicitly pull the theory vector out of the likelihood calculation (it may need optimization, since right now this code has to be updated every time new bias parameters or new bias models are added to cl_like.py).
     - fisher.py: This script calls theory_vector.py and performs the new Fisher matrix calculation.
  2. The new Fisher code is wired into kmax_test.py (see kmax_test_new_fisher.py).
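For context, the first-derivative Fisher matrix described above takes the form F_ij = (∂t/∂θ_i)ᵀ C⁻¹ (∂t/∂θ_j), where t is the theory vector and C the data covariance. A minimal numpy sketch (not the code in fisher.py; `fisher_first_deriv` and its arguments are hypothetical names, and central finite differences are assumed for the Jacobian) could look like:

```python
import numpy as np

def fisher_first_deriv(theory_vec, params, cov, step=1e-3):
    """First-derivative Fisher matrix F_ij = d_i t^T C^{-1} d_j t.

    theory_vec : callable mapping a parameter array to the theory vector
    params     : fiducial parameter values (1D array)
    cov        : data covariance matrix
    """
    params = np.asarray(params, dtype=float)
    n = len(params)
    # Central finite-difference derivative of the theory vector
    # with respect to each parameter
    derivs = []
    for i in range(n):
        dp = np.zeros(n)
        dp[i] = step
        derivs.append((theory_vec(params + dp) -
                       theory_vec(params - dp)) / (2.0 * step))
    jac = np.array(derivs)          # shape (n_params, n_data)
    icov = np.linalg.inv(cov)
    return jac @ icov @ jac.T       # shape (n_params, n_params)
```

Because only first derivatives of the theory vector appear, this avoids the noisy second derivatives of the log-likelihood that the old Fisher code relied on.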
damonge commented 3 years ago

Thanks a lot @Zhiyuan-G . It seems this requires a full copy of the likelihood code onto theory_vec.py, which is not ideal (since then we will need to copy any new code for new bias models twice). Why did you need to do this rather than using the cl_like.py directly?

Zhiyuan-G commented 3 years ago

> Thanks a lot @Zhiyuan-G . It seems this requires a full copy of the likelihood code onto theory_vec.py, which is not ideal (since then we will need to copy any new code for new bias models twice). Why did you need to do this rather than using the cl_like.py directly?

Thank you for the comment @damonge. This is not ideal indeed. The reason is that, for the new Fisher code, I am not sure how to explicitly retrieve the data covariance and the computed theory data vector from the current version of cl_like.py, which is written as an external likelihood class for Cobaya. In the old Fisher code, which calculates the derivative of the log-likelihood for a given set of sampled parameters, the value of the log-likelihood is obtained directly via the Cobaya function model.loglikes. I wonder if there is a similar Cobaya function that can return theory data vectors too?

anicola commented 3 years ago

@Zhiyuan-G you can use something like the code snippet below to retrieve everything from the likelihood:

```python
import yaml
import numpy as np
from cobaya.model import get_model

# Build the Cobaya model from the config file
info = yaml.load(path2config, Loader=yaml.FullLoader)
model = get_model(info)

# Evaluate the likelihood once so the theory products are computed
model.loglike(param_dict)

# Pull ells, indices, theory vector, data vector and errors
# directly from the likelihood object
ell = model.likelihood['cl_like.ClLike'].cl_meta[probe]['l_eff']
inds = model.likelihood['cl_like.ClLike'].cl_meta[probe]['inds']
cl_th = model.likelihood['cl_like.ClLike']._get_theory(**param_dict)
cl_data = model.likelihood['cl_like.ClLike'].data_vec
err = np.sqrt(np.diag(model.likelihood['cl_like.ClLike'].cov))
```
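Once the theory vector is accessible this way, the per-parameter derivatives the Fisher code needs can be taken by perturbing entries of `param_dict`. A hedged sketch (the helper name `theory_derivative` is hypothetical; `get_theory` stands in for a call like `model.likelihood['cl_like.ClLike']._get_theory(**params)`):

```python
import numpy as np

def theory_derivative(get_theory, param_dict, name, step=1e-3):
    """Central finite-difference derivative of the theory vector
    with respect to one sampled parameter.

    get_theory : callable accepting the sampled parameters as kwargs
                 and returning the theory data vector
    param_dict : fiducial parameter values
    name       : parameter to differentiate with respect to
    """
    # Perturb copies of the parameter dictionary up and down
    up = dict(param_dict)
    up[name] = param_dict[name] + step
    dn = dict(param_dict)
    dn[name] = param_dict[name] - step
    return (np.asarray(get_theory(**up)) -
            np.asarray(get_theory(**dn))) / (2.0 * step)
```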
Zhiyuan-G commented 3 years ago

Wow, this is super useful! Thanks a lot. I will make changes to the code based on this and let you know when it is done.

Zhiyuan-G commented 3 years ago

@anicola @damonge I have updated the Fisher code. Now the code directly pulls the theory data vector from cl_like.py. Please let me know if you have any comments.