lesgourg / class_public

Public repository of the Cosmic Linear Anisotropy Solving System (master for the most recent version of the standard code; GW_CLASS to include Cosmic Gravitational Wave Background anisotropies; classnet branch for acceleration with neural networks; ExoCLASS branch for exotic energy injection; class_matter branch for FFTlog)

how to produce same k points for pk/matter power spectra #389

Open ravi398 opened 3 years ago

ravi398 commented 3 years ago

Hello sir, there is a change in the k points when producing the matter power spectrum after changing some CLASS parameters, i.e. some files are created with 624 k points, some with 628 points, etc., and even the points themselves are slightly different. This creates a problem when calculating the difference/residual between two matter power spectra.

Please help me: where in CLASS should I make a change so that I get the same k points every time I generate pk?

Thanking you. Regards, Ravi Kumar Sharma

pstoecker commented 3 years ago

Hey Ravi,

Could you provide the input parameters of CLASS that you used in each case? That would make it easier to find out what the issue might be. Also, which version of CLASS are you using? Is it the most recent version (2.9.4)?

Best, Patrick

ravi398 commented 3 years ago

Thank you very much sir for your response.

For example, if we set N_ur=3.046, N_ncdm=1, and m_ncdm=2, the pk file has 621 points; and if I change to m_ncdm=8, the pk file has 629 points.

Regards, Ravi Kumar Sharma

ThomasTram commented 3 years ago

Hi Ravi

When the cosmology changes, the actual k-values computed also change. This is necessary to keep the accuracy of the computation at the same level. The usual approach when comparing P(k) for two different cosmologies is to interpolate both onto the same k-grid. Some of the observed difference will then come from the interpolation, but that is not usually a problem. (Performing the interpolation in log(P) and log(k) improves this.)
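As an illustration, here is a minimal interpolation sketch (assuming numpy and scipy are available; the two file names are placeholders for your own CLASS outputs) that puts two _pk.dat files onto a common k-grid in log-log and computes the residual:

```python
import numpy as np
from scipy.interpolate import interp1d

# first two columns of a CLASS *_pk.dat file: k and P(k) (units as given in the file header)
k1, pk1 = np.loadtxt("run1_pk.dat", unpack=True)[:2]
k2, pk2 = np.loadtxt("run2_pk.dat", unpack=True)[:2]

# common grid (in log10 k), restricted to the overlap of the two k-ranges
lk = np.linspace(np.log10(max(k1.min(), k2.min())),
                 np.log10(min(k1.max(), k2.max())), 500)

# interpolate in log(k) and log(P), as suggested above
logp1 = interp1d(np.log10(k1), np.log10(pk1), kind="cubic")
logp2 = interp1d(np.log10(k2), np.log10(pk2), kind="cubic")

k_common = 10**lk
residual = 10**logp1(lk) / 10**logp2(lk) - 1.0   # fractional difference on the common grid
```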

If you really need to compare exactly the same k-values, you can use k_output_values to pass a list of k-values that will be added to the precomputed list of k-values. You can then use some precision parameters to more or less remove all the automatic k-values (k_per_decade_for_pk = 0 and k_per_decade_for_bao = 0, off the top of my head...), but note that this is not the recommended way. If you pass more than 30 k-values, the string may overflow, and you would need to increase the maximum length of input lines (_ARGUMENT_LENGTH_MAX_ in parser.h) and recompile CLASS.
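For illustration, a tiny standalone snippet (plain Python; nothing here is part of CLASS itself) that builds the comma-separated string for k_output_values so it can be pasted into the .ini file. The grid below is just an example, and the units should match what your CLASS version expects (see explanatory.ini):

```python
import numpy as np

# 25 illustrative k-values; keep the list short (or round the values) so the
# string stays below the parser limits mentioned above
k_grid = np.logspace(-4, 0, 25)
print("k_output_values = " + ",".join("%.5e" % k for k in k_grid))
```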

Cheers, Thomas

FigoHerranz commented 3 years ago

Hello,

Did you find a way to solve it?

I have a similar problem: I need P(k) for a long list of k-values (something like 1000 of them). I do not care about the specific k-values, but I need them to be exactly the same in the output file "_pk.dat" when I change parameters such as Omega_cdm or n_s (not a big change, just small variations around a previous cosmology). Because of the number of k-values, putting them in k_output_values does not work. Interpolation seems to be the only way; I did not find anything in CLASS I could modify to achieve this, but any idea is welcome.

Thanks!

ravi398 commented 3 years ago

Hi, I did not find any solution; I used interpolation.

Cheers, Ravi Sharma

FigoHerranz commented 3 years ago

Thanks for answering Ravi!

Dear @ThomasTram ,

You said that the change in k-values is necessary for the accuracy of the computation. Given that the changes I will make to the parameters are of order 1%, do you think the accuracy will be affected if I force the code to keep the same k? I guess this can be done by modifying the function int perturb_get_k_list(..., but I do not know whether it would be worthwhile given the loss of accuracy.

Thanks for the help!

ThomasTram commented 3 years ago

Hi @FigoHerranz Of course, for small changes in input parameters, the k-list could be kept the same. The issue here is that the k-list has to change smoothly as a function of the input parameters. Interpolation is the standard way to solve this, but you could also hack the function perturb_get_k_list to always return the same k-values, as you suggest.

You could also use k_output_values, even for 1000 values, by using one or more of the following tricks:

  1. Round the k-values to one or two digits before converting to string.
  2. Increase _LINE_LENGTH_MAX_ and _ARGUMENT_LENGTH_MAX_ in parser.h by e.g. a factor of 100 or 1000 provided that your stack-size allows it. (You can also increase the stack-size of your system if necessary.)
  3. Create a Python function that computes P(k) for a large number of k-values by calling CLASS on smaller chunks of this large k-vector (a rough sketch follows below).
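A rough sketch of trick 3 (everything here is an assumption for illustration: a working ./class binary in the current directory, and a template base.ini that sets output = mPk but does not itself set root or k_output_values, since both are appended per chunk):

```python
import subprocess
import numpy as np

def run_class_in_chunks(k_values, template="base.ini", chunk_size=30):
    """Run ./class once per chunk of k_values, passing each chunk via k_output_values."""
    n_chunks = int(np.ceil(len(k_values) / chunk_size))
    for i, chunk in enumerate(np.array_split(np.asarray(k_values), n_chunks)):
        with open(template) as f:
            base = f.read()
        ini_name = "chunk_%03d.ini" % i
        with open(ini_name, "w") as f:
            f.write(base)
            f.write("\nroot = output/chunk_%03d_\n" % i)  # separate output files per chunk
            f.write("k_output_values = " + ",".join("%.5e" % k for k in chunk) + "\n")
        subprocess.run(["./class", ini_name], check=True)
    # the per-chunk spectra can then be read back from output/chunk_*_pk.dat and merged
```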

Cheers, Thomas

FigoHerranz commented 3 years ago

Hi @ThomasTram ,

Thanks for the tricks!! I have been playing a bit with the code, trying to use the second option you suggested together with those tricks, i.e. passing k_output_values with ~1000 values. To summarize, in the .ini file I set:

k_per_decade_for_pk = 0
k_per_decade_for_bao = 0
k_output_values = ...

in parser.h:

#define _LINE_LENGTH_MAX_ 35000
#define _ARGUMENT_LENGTH_MAX_ 35000

and in perturbations.h:

#define _MAX_NUMBER_OF_K_FILES_ 3000

I have tried several configurations of k-values, starting from 10^-6, 10^-5, or 10^-4..., but CLASS always returns this error:

Error in perturb_init
=> perturb_init(L:390) : error in perturb_solve(ppr, pba, pth, ppt, index_md, index_ic, index_k, pppw[thread]);
=> perturb_solve(L:2249) : condition (k/ppw->pvecback[pba->index_bg_a]/ppw->pvecback[pba->index_bg_H] > ppr->start_large_k_at_tau_h_over_tau_k) is true; your choice of initial time for integrating wavenumbers is inappropriate: it corresponds to a time before that at which the background has been integrated. You should increase 'start_large_k_at_tau_h_over_tau_k' up to at least inf, or decrease 'a_ini_over_a_today_default'

I gave several values to a_ini_over_a_today_default with no positive result, so I am probably missing something. Do you have any idea what it could be?

Thanks a lot for all the help!!!

ThomasTram commented 3 years ago

Hi

The fact that it says "increase 'start_large_k_at_tau_h_over_tau_k' up to at least inf" suggests that CLASS has probably interpreted one of the k-values as 0, which in turn suggests that the string was cut. I think your values should be large enough for 1000 k-values, but try reducing the number anyway and see if the problem persists. Also, did you remember to run make clean (and restart the kernel, if you are in a Jupyter notebook) after you changed the .h file?

Cheers, Thomas

FigoHerranz commented 3 years ago

Hi,

It even happens with only one k-value, like k_output_values=0.1, so it should not be a problem with reading the string. In fact, it only happens when both k_per_decade_for_pk = 0 and k_per_decade_for_bao = 0 are set; if one of them is commented out there is no error, but then of course the k-values used are not the ones I pass. I do not know if this suggests anything.

I am not using Jupyter, and yes, I did run make clean beforehand.

Thanks!

ThomasTram commented 3 years ago

Ah OK, you are using the k_per_decade_for_pk = 0 and k_per_decade_for_bao = 0 settings that I mentioned above. That is sort of a hack, so it is probably why it is not working. Just increase them to 1 or 2 (whichever solves the problem), and when you compute the power spectrum simply use [pk(kk, z) for kk in k_output_values]. The extra k-values will not affect the output, since the internal spline goes through all the points.
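As a short sketch of this with the classy wrapper (assuming classy is installed; the parameter values below are purely illustrative), a fixed k array is evaluated through cosmo.pk, which interpolates the internal spline, so every run returns P(k) on exactly the same grid:

```python
import numpy as np
from classy import Class

params = {"output": "mPk", "P_k_max_h/Mpc": 1.,
          "N_ur": 3.046, "N_ncdm": 1, "m_ncdm": 0.06}

k_fixed = np.logspace(-4, np.log10(0.5), 1000)   # in 1/Mpc, identical for every cosmology

cosmo = Class()
cosmo.set(params)
cosmo.compute()
pk = np.array([cosmo.pk(kk, 0.) for kk in k_fixed])   # P(k, z=0) in Mpc^3
cosmo.struct_cleanup()
cosmo.empty()
```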

FigoHerranz commented 3 years ago

Hi,

Yes, but this is only for the Python wrapper, right? Is it possible the usual way, with ./class __.ini?

Thanks for the help.

mingjing-chen commented 2 years ago

> Hi,
>
> Yes, but this is only for the Python wrapper, right? Is it possible the usual way, with ./class __.ini?
>
> Thanks for the help.

Hi,

Adding this to the .ini file changes the number of k points:

k_per_decade_for_pk = 100
k_per_decade_for_bao = 100

and adding this changes k_min and k_max:

k_min_tau0 = 0.1
P_k_max_h/Mpc = 1.

so you can get a fixed array of k values.

kcroker commented 12 months ago

@ravi398 if this issue has been addressed to your satisfaction, could you please close it?