The following code freezes unexpectedly after a short time while adding a new bound:
import numpy as np
from nautilus import Sampler

def prior(x):
    return x

def likelihood(x):
    return np.random.random()

sampler = Sampler(prior, likelihood, 2)
sampler.run(verbose=True)
The reason is that the likelihood is random. Surprisingly, the neural networks seem smart enough to pick up on this and predict the same likelihood score for every point in the training set, namely the mean likelihood score. The problem is that nautilus accepts new points only if their predicted likelihood score exceeds the predicted likelihood score of the live set. With a constant prediction this never happens, so no new points are accepted and the bound is never created.
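The "predict the mean" behavior is not specific to neural networks: any regressor trained with a squared-error-style objective on targets that are pure noise, uncorrelated with the inputs, will converge toward a near-constant prediction at the target mean. A minimal sketch of this effect, using an ordinary linear least-squares fit in place of nautilus's actual network (the array shapes and seed here are illustrative assumptions):

```python
# Illustration (not nautilus itself): a regressor fit to pure-noise
# targets learns to predict roughly the mean target everywhere.
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((1000, 2))   # points in the 2-D unit cube
y = rng.random(1000)        # "likelihoods": pure noise, independent of x

# Linear least-squares fit with an intercept column.
X = np.column_stack([x, np.ones(len(x))])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef

# The predictions cluster tightly around the mean of y, so no point
# ever looks meaningfully "better" than the rest of the live set.
print(np.std(y))     # spread of the random targets
print(np.std(pred))  # much smaller spread of the fitted predictions
```

Under such a near-constant prediction, the acceptance condition described above can essentially never be satisfied, which matches the observed freeze.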