Hi!
I am running some likelihood analyses, and I noticed that every time the covmat file is created, or whenever the covmat is updated, my contours grow, even though the convergence statistics show that the R-1 values have decreased. Is this normal behaviour? Shouldn't the contours shrink as convergence improves?
I attach an example here. The green contours have been running for some time (~500,000 samples) with a covmat file already created, while the blue contours have about 100,000 samples and no covmat file yet. The R-1 for the green contours has reached ~1e-2, yet they are much larger than the blue contours. Is this expected behaviour? I'm probably missing something...
PS. I'm plotting the contours with GetDist, with ignore_rows=0.3. Could this be a problem, e.g. GetDist not removing non-Markovian points?
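For context, my understanding is that when ignore_rows is below 1 GetDist treats it as a fraction, so 0.3 should discard the first 30% of each chain as burn-in. A toy numpy sketch of just that row-dropping behaviour (illustrative data only, not my actual chains):

```python
import numpy as np

# Toy "chain": 10 rows of (weight, -logLike, param) columns; values are arbitrary.
chain = np.arange(30, dtype=float).reshape(10, 3)

ignore_rows = 0.3                      # same setting I pass to GetDist
burn = int(ignore_rows * len(chain))   # number of leading rows treated as burn-in
kept = chain[burn:]                    # rows that actually enter the density estimate

print(burn, len(kept))  # 3 rows dropped, 7 kept
```

So with 0.3 I expect a flat 30% cut per chain, independent of whether the points before a covmat update were actually Markovian.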
Thank you very much for any advice you could provide!
Lisa