joshspeagle / dynesty

Dynamic Nested Sampling package for computing Bayesian posteriors and evidences
https://dynesty.readthedocs.io/
MIT License

save sampler.results into file #346

Closed wenbinlu closed 2 years ago

wenbinlu commented 2 years ago

I can't find where this is discussed in the documentation. I'm doing a number of independent sampling runs of the same problem (either on different machines or on the same machine at different times). I would like to save the sampling "Results" for each independent run to a file (e.g., in .h5 format). After finishing all the runs, I want to read these files back to load the "Results" into memory, and then merge all the independent samples into one big sample using "merge_runs", which takes a list of "Results" as input. Do I need to write my own save and read script, or is there already a module that handles this?

segasai commented 2 years ago

You need to write it yourself, but as far as I'm aware it's trivial (if you are okay with pickling):

import pickle
import numpy as np
import dynesty
import dynesty.utils as dyutil

nlive = 100

size = 10  # box size

def loglike(x):
    # isotropic Gaussian log-likelihood
    return -0.5 * np.sum(x**2)

def prior_transform(x):
    # map the unit cube onto a uniform prior over [-size, size]
    return (2 * x - 1) * size

def test():
    # run three independent samplings of the same problem
    ndim = 2
    rstate = np.random.default_rng(444)
    resl = []
    for i in range(3):
        sampler = dynesty.DynamicNestedSampler(loglike,
                                               prior_transform,
                                               ndim,
                                               nlive=nlive,
                                               rstate=rstate)
        sampler.run_nested()
        resl.append(sampler.results)

    # save the list of Results objects to disk with pickle
    with open('xx.pkl', 'wb') as fp:
        pickle.dump(resl, fp)

def test1():
    # read the results back from disk
    with open('xx.pkl', 'rb') as fp:
        resl = pickle.load(fp)
    # merge the independent runs into a single Results object
    return dyutil.merge_runs(resl)

test()
test1()
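
For the multi-machine workflow in the question, the same idea works with one pickle file per run: each machine dumps its own sampler.results, and the merging script collects whatever files exist. A minimal sketch under that assumption (the run_*.pkl naming is just illustrative, not a dynesty convention):

import glob
import pickle

import dynesty.utils as dyutil

# on each machine / for each run, save that run's Results object:
# with open(f'run_{i}.pkl', 'wb') as fp:
#     pickle.dump(sampler.results, fp)

# later, on one machine: load every per-run file and merge
resl = []
for fname in sorted(glob.glob('run_*.pkl')):
    with open(fname, 'rb') as fp:
        resl.append(pickle.load(fp))
res_merged = dyutil.merge_runs(resl)

If you do want an .h5 file at the end: as far as I know dynesty doesn't ship an HDF5 serializer for Results, so the pickle round trip above is what feeds "merge_runs", but you can export the arrays you care about from the merged Results with h5py for archival. A sketch, assuming h5py is installed (this is a one-way export, not something "merge_runs" can read back):

import h5py

with h5py.File('merged.h5', 'w') as f:
    f.create_dataset('samples', data=res_merged.samples)  # posterior samples
    f.create_dataset('logwt', data=res_merged.logwt)      # log importance weights
    f.create_dataset('logl', data=res_merged.logl)        # log-likelihoods
    f.create_dataset('logz', data=res_merged.logz)        # cumulative log-evidence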