Closed: Jane550 closed this issue 4 months ago
It turned out that the likelihood function used was stochastic, i.e., it would randomly yield different results even for the same parameters. nautilus should give accurate results in these cases, but it may not be very efficient. That's because the initial exploration phase in nautilus is essentially a global optimization. If the likelihood function is noisy, a global maximum doesn't really exist and nautilus can slow down. The main suggestion in this case would be to increase f_live to something around 0.5. This makes the exploration phase stop sooner.
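For reference, here is a minimal sketch of what that looks like. The toy prior, likelihood, and parameter names are placeholders, not taken from this issue, and in recent nautilus versions f_live is a keyword argument of Sampler.run(); check your installed version if it differs.

```python
# Minimal sketch, assuming the nautilus-sampler package; the toy prior,
# likelihood, and parameter names are placeholders, not from this issue.
import numpy as np
from nautilus import Prior, Sampler

prior = Prior()
prior.add_parameter('a', dist=(-5.0, 5.0))
prior.add_parameter('b', dist=(-5.0, 5.0))

def likelihood(param_dict):
    # Toy Gaussian log-likelihood standing in for the user's noisy one.
    x = np.array([param_dict['a'], param_dict['b']])
    return -0.5 * np.sum(x ** 2)

sampler = Sampler(prior, likelihood, n_live=2000)
# Raising f_live from its small default makes the exploration phase stop
# sooner, which is the workaround suggested above for noisy likelihoods.
sampler.run(f_live=0.5, verbose=True)
points, log_w, log_l = sampler.posterior()
```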
The other issue, with the progress bar, is related to tqdm not working well with log files by default. But the current version of nautilus doesn't rely on tqdm, anyway.
I came across an issue where the exploration phase slows down gradually, from 2 it/s to 10 s/it. The output file shows that the 'filling bound' process is 'done' even though the progress bar is not yet at 100%. For example:
```
Filling Bound 15: 44%|████▎ | 1305/3000 [4:31:38<4:35:37, 9.76s/it]
Filling Bound 15: done N_like: 137160 N_eff: 2261 log Z: -580.748 log V: -8.339 f_live: 0.461
```
The exploration phase becomes slower and slower and seems to never finish; it is still exploring after 48 hours. Strangely, I previously ran similar code and the result was normal, with the posterior returned within a day. Now the only thing I changed is the data (a subsample of the previous data); the likelihood function is the same. I don't understand why this issue happens.
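For future readers hitting the same symptom: a quick way to tell whether a likelihood is stochastic (the cause identified above) is to evaluate it repeatedly at the same point. This is just a sketch with a made-up likelihood; substitute your own function and a point in your prior.

```python
import numpy as np

# Hypothetical stand-in for the actual likelihood; the np.random call makes
# it stochastic, which is the failure mode diagnosed in this thread.
def likelihood(param_dict):
    x = np.array([param_dict['a'], param_dict['b']])
    return -0.5 * np.sum(x ** 2) + np.random.normal(scale=1e-3)

theta = {'a': 0.1, 'b': -0.3}  # any fixed point inside the prior
values = [likelihood(theta) for _ in range(5)]
print(values)  # differing values => the likelihood is stochastic
```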