idontgetoutmuch opened this issue 2 years ago
I tried

```julia
@model function conjugateNormal(data)
    mu ~ Normal(0.0, 1.0)
    data ~ Normal(mu, 1.0)
    # Do we need to return anything? According to
    # https://turing.ml/v0.22/docs/using-turing/quick-start we don't
    # but the infiniteGMM example returns the number of components
    return mu
end

cn = conjugateNormal(4.0)
rng = MersenneTwister(1729)
ϵ = 0.05
τ = 10
chain = sample(rng, cn, HMC(ϵ, τ), 10)
chain = sample(rng, cn, NPMHSampler(), 10)
```
HMC returns an answer, but NPMH returns an error, and my Julia skills are not good enough to work out what is going wrong:
```
julia> chain = sample(rng, cn, NPMHSampler(), 10)
Sampling 100%|████████████████████████████████████████████| Time: 0:00:00
ERROR: MethodError: no method matching getloglikelihood(::DynamicPPL.Model{typeof(conjugateNormal), (:data,), (), (), Tuple{Float64}, Tuple{}, DynamicPPL.DefaultContext})
Closest candidates are:
  getloglikelihood(::NPMHModel) at ~/nonparametric-mh/NPMH/src/model.jl:38
```
I just spotted

```julia
# NP-MH
pointwisek = PointwiseAuxKernel((n, t) -> n == 1 ? DiscreteUniform(max(0, t[n] - 1), t[n] + 1) : Normal(t[n], 1.0))
npmh = NPMHModel(pointwisek, gmm)
```

but I don't know how I should handle my simple example.
This seems to work:

```julia
@model function conjugateNormal(data)
    # This does nothing, but without it NPMH becomes unhappy
    k ~ Poisson(3.0)
    mu ~ Normal(0.0, 1.0)
    n = length(data)
    for i in 1:n
        data[i] ~ Normal(mu, 1.0)
    end
    # Do we need to return anything? According to
    # https://turing.ml/v0.22/docs/using-turing/quick-start we don't
    # but the infiniteGMM example returns the number of components
    # return mu
end

cn = conjugateNormal(vcat(4.0))
rng = MersenneTwister(1729)
cp = NPMHModel(pointwisek, cn)
chain = sample(rng, cp, NPMHSampler(), 15000)
df = DataFrame(A = [chain[i][2] for i in 5001:15000])
CSV.write("foo-julia.csv", df)
```
I coded up the GMM example in Haskell using Sam Staton's code from https://www.cs.uoregon.edu/research/summerschool/summer19/topics.php#Staton. The test data is generated from https://github.com/cmaarkol/nonparametric-mh/blob/main/infinite_gmm_npmh.jl#L17-L19
```haskell
gmm :: Meas Double Int
gmm = do
  p <- poisson 3.0
  let bigK = 1 + floor p
  mus <- sequence $ replicate bigK $ normal' 0.0 1.0
  ws  <- sequence $ replicate bigK $ normal' 0.0 1.0
  let ws'  = map abs $ map (+ 0.5) ws
      ws'' = map (/ sum ws') ws'
      f    = mixtureOfNormals mus (repeat 0.3) ws''
  mapM_ (score . f) testData
  return bigK
```
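For clarity, `mixtureOfNormals` above is just the mixture density; here is a minimal sketch of what I mean by it (my own definition for illustration, not necessarily identical to the helper in Staton's code):

```haskell
-- Sketch only: a weighted mixture of normal densities.
-- Assumes means, standard deviations and weights are zipped positionally;
-- zipWith3 stops at the shortest list, so `repeat 0.3` above is fine.
normalPdf :: Double -> Double -> Double -> Double
normalPdf mu sigma x =
  exp (negate ((x - mu) ^ 2) / (2 * sigma ^ 2)) / (sigma * sqrt (2 * pi))

mixtureOfNormals :: [Double] -> [Double] -> [Double] -> (Double -> Double)
mixtureOfNormals mus sigmas ws =
  \x -> sum (zipWith3 (\mu sigma w -> w * normalPdf mu sigma x) mus sigmas ws)
```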
This produces (with 100000 samples) a distribution over the number of components which seems pretty good to me.
I will try running the Julia code with more than 1000 samples and report back.
See also: https://github.com/fzaiser/nonparametric-hmc/issues/4
Just in case, I have put the full code for my Haskell example here: https://github.com/idontgetoutmuch/oplss-staton/blob/experiments/src/forTicket.hs
From the paper at https://arxiv.org/pdf/2211.01100.pdf
But when I run the code I get