Hi,
Thank you for your interest. I haven't looked at directPosteriorDistribution in a while, so it may be a little out of date compared to the rest of the code. Regardless, it should work here if you remove the x dependency by filling its value into the distribution directly:
directPosteriorDistribution[{1}, NormalDistribution[a, 1/10], {"LocationParameter"}, {{a, 0, 2}} ]
I never wrote this function with regression-type data in mind, so if you want to do that sort of thing with more data you'll have to calculate the log likelihood function first.
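For instance, a hand-rolled log-likelihood for rule-based data might look like the following sketch (illustrative only, built on standard built-in functions; the names data and logLike are not part of the package):

data = {1 -> 1, 2 -> 2};  (* regression-style data: input x -> output y *)
logLike[a_?NumericQ] := Total[LogLikelihood[NormalDistribution[a First[#], 1/10], {Last[#]}] & /@ data]
logLike[1.]  (* log-likelihood evaluated at a = 1 *)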
Hi, Thanks for your rapid reply!
Computing the log-likelihood is probably not a problem; I might simply write it as:
directPosteriorDistribution[{1}, ProbabilityDistribution[Likelihood[NormalDistribution[0, 1/10], {1 - a x}], {x, 0, 2}], {"LocationParameter"}, {{a, 0, 2}}]
and indeed I match (within the errors) the result from result["LogEvidence"].
Nevertheless, I still do not understand the syntax expected by directPosteriorDistribution. What is supposed to go in the first slot (i.e. where you have {1} in the previous case)? Let's suppose I have not 1 but 2 data points:
objt = defineInferenceProblem[
  "Data" -> {1 -> 1, 2 -> 2},
  "GeneratingDistribution" -> NormalDistribution[a x, 1/10],
  "Parameters" -> {{a, 0, 2}},
  "IndependentVariables" -> {x},
  "PriorDistribution" -> {"LocationParameter"}
]
result = nestedSampling[objt]
How do I use directPosteriorDistribution so that it matches result["LogEvidence"] (within the errors)?
Thanks!
In my example, the {1} is the output datapoint corresponding to the input x -> 1 (which is the value I filled into the generating distribution), but this will not generalise to multiple regression-type datapoints. Like I said, I never wrote directPosteriorDistribution to be used with regression (i.e., rule-based) data; it's just for distribution fitting (like EstimatedDistribution). I also rather doubt how well this approach scales to multiple datapoints anyway, which is why I never developed it much further. I might take another look at it in the future, but unfortunately I don't have time right now.
If you really want to, you should be able to query the object generated by defineInferenceProblem for the likelihood function and prior density; multiply them; and then throw that into NIntegrate yourself.
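For reference, a minimal sketch of such a direct check is below. It writes the likelihood and prior out by hand with built-in functions rather than querying the inference object, and it assumes the "LocationParameter" prior amounts to a flat density on the stated range {a, 0, 2}:

data = {1 -> 1, 2 -> 2};  (* input -> output pairs, x -> y *)
likelihood[a_?NumericQ] := Times @@ (PDF[NormalDistribution[a First[#], 1/10], Last[#]] & /@ data);
prior[a_] := PDF[UniformDistribution[{0, 2}], a];  (* assumed flat prior over the parameter range *)
evidence = NIntegrate[likelihood[a] prior[a], {a, 0, 2}];
Log[evidence]  (* compare against result["LogEvidence"] *)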
Hi, yes, absolutely. NIntegrate (for simple cases) is indeed an option for checking the performance of the nested sampling approach numerically. I will do that. Best
Hi,
Thanks for your nice code.
I was trying to compare the log evidence computed by nestedSampling with a direct computation.
For example:
objt = defineInferenceProblem[
  "Data" -> {1 -> 1},
  "GeneratingDistribution" -> NormalDistribution[a x, 1/10],
  "Parameters" -> {{a, 0, 2}},
  "IndependentVariables" -> {x},
  "PriorDistribution" -> {"LocationParameter"}
]
result = nestedSampling[objt]
I am interested in comparing the value obtained from result["LogEvidence"] against a direct computation. Here it should not be difficult at all, but I've seen you already implemented a command, directPosteriorDistribution, for doing this.
Unfortunately, I didn't manage to get it to run: do you have an example of the syntax for directPosteriorDistribution, please?
Thanks!