TuringLang / TuringGLM.jl

Bayesian Generalized Linear models using `@formula` syntax.
https://turinglang.org/TuringGLM.jl/dev
MIT License

[Tutorial] Elliptical Slice Sampling #42

Open yebai opened 2 years ago

yebai commented 2 years ago

We should have a tutorial comparing the ESS sampler against HMC samplers. HMC is very popular, but that is partly due to its availability in Stan. In many interesting cases with Gaussian priors, I think ESS could be a competitive alternative, both computationally and statistically!

cc @devmotion
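
For what it's worth, a rough sketch of the kind of side-by-side comparison such a tutorial could contain (the toy model, data, and sampler settings below are purely illustrative and not tied to TuringGLM):

```julia
using Turing, MCMCChains, LinearAlgebra, Random

# Toy linear regression: a single coefficient vector with a Gaussian prior and a
# known noise scale — the setting in which ESS applies directly.
@model function linreg(X, y)
    σ = 1.0                                    # noise scale treated as known
    β ~ MvNormal(zeros(size(X, 2)), 1.0 * I)   # Gaussian prior, as ESS requires
    y ~ MvNormal(X * β, σ^2 * I)
end

Random.seed!(1)
X = randn(200, 3)
y = X * [1.0, -0.5, 2.0] .+ randn(200)
model = linreg(X, y)

# Same model, two samplers; compare wall-clock time and the effective sample
# size / R-hat columns of the printed summaries.
chain_nuts = sample(model, NUTS(), 2_000)
chain_ess  = sample(model, ESS(), 2_000)
summarystats(chain_nuts)
summarystats(chain_ess)
```

Since ESS has no step size or mass matrix to tune, the comparison mostly comes down to effective samples per second.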

storopoli commented 2 years ago

Sure, can you point me towards a nice tutorial to emulate or documentation on ESS?

devmotion commented 2 years ago

https://github.com/TuringLang/EllipticalSliceSampling.jl and in particular the video in the README are helpful, I hope 😉

storopoli commented 2 years ago

Thanks, will look into it.

devmotion commented 2 years ago

Just ask if anything is unclear or doesn't work 🙂

rikhuijzer commented 2 years ago

I've tried it on the linear regression and logistic regression by adding:

```julia
ess_chn = sample(model, ESS(), 2_000);
```

Both give

```
[ESS] does only support one variable (2 variables specified)

1. error(::String)@error.jl:33
2. var"#initialstep#35"(::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}, ::typeof(DynamicPPL.initialstep), ::Random._GLOBAL_RNG, ::DynamicPPL.Model{TuringGLM.var"#bernoulli_model#31"{Int64, Int64, TuringGLM.CustomPrior}, (:y, :X, :predictors, :μ_X, :σ_X, :prior), (:predictors, :μ_X, :σ_X, :prior), (), Tuple{SentinelArrays.ChainedVector{Int64, Vector{Int64}}, Matrix{Float64}, Int64, Int64, Int64, TuringGLM.CustomPrior}, Tuple{Int64, Int64, Int64, TuringGLM.CustomPrior}, DynamicPPL.DefaultContext}, ::DynamicPPL.Sampler{Turing.Inference.ESS{()}}, ::DynamicPPL.TypedVarInfo{NamedTuple{(:α, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:α, Setfield.IdentityLens}, Int64}, Vector{Distributions.LocationScale{Float64, Distributions.Continuous, Distributions.TDist{Float64}}}, Vector{AbstractPPL.VarName{:α, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{Distributions.Product{Distributions.Continuous, Distributions.TDist{Float64}, FillArrays.Fill{Distributions.TDist{Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64})@ess.jl:38
3. initialstep(::Random._GLOBAL_RNG, ::DynamicPPL.Model{TuringGLM.var"#bernoulli_model#31"{Int64, Int64, TuringGLM.CustomPrior}, (:y, :X, :predictors, :μ_X, :σ_X, :prior), (:predictors, :μ_X, :σ_X, :prior), (), Tuple{SentinelArrays.ChainedVector{Int64, Vector{Int64}}, Matrix{Float64}, Int64, Int64, Int64, TuringGLM.CustomPrior}, Tuple{Int64, Int64, Int64, TuringGLM.CustomPrior}, DynamicPPL.DefaultContext}, ::DynamicPPL.Sampler{Turing.Inference.ESS{()}}, ::DynamicPPL.TypedVarInfo{NamedTuple{(:α, :β), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:α, Setfield.IdentityLens}, Int64}, Vector{Distributions.LocationScale{Float64, Distributions.Continuous, Distributions.TDist{Float64}}}, Vector{AbstractPPL.VarName{:α, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:β, Setfield.IdentityLens}, Int64}, Vector{Distributions.Product{Distributions.Continuous, Distributions.TDist{Float64}, FillArrays.Fill{Distributions.TDist{Float64}, 1, Tuple{Base.OneTo{Int64}}}}}, Vector{AbstractPPL.VarName{:β, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64})@ess.jl:37
4. var"#step#17"(::Nothing, ::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}, ::typeof(AbstractMCMC.step), ::Random._GLOBAL_RNG, ::DynamicPPL.Model{TuringGLM.var"#bernoulli_model#31"{Int64, Int64, TuringGLM.CustomPrior}, (:y, :X, :predictors, :μ_X, :σ_X, :prior), (:predictors, :μ_X, :σ_X, :prior), (), Tuple{SentinelArrays.ChainedVector{Int64, Vector{Int64}}, Matrix{Float64}, Int64, Int64, Int64, TuringGLM.CustomPrior}, Tuple{Int64, Int64, Int64, TuringGLM.CustomPrior}, DynamicPPL.DefaultContext}, ::DynamicPPL.Sampler{Turing.Inference.ESS{()}})@sampler.jl:99
5. step@sampler.jl:74[inlined]
6. macro expansion@sample.jl:124[inlined]
7. macro expansion@ProgressLogging.jl:328[inlined]
8. macro expansion@logging.jl:8[inlined]
```

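For context, the setup that triggers this presumably looks roughly like the following; the data frame and formula are invented, and the `model=Bernoulli` keyword reflects my reading of the TuringGLM docs for the logistic case:

```julia
# Hypothetical reconstruction of the failing setup above — only `turing_model`,
# `ESS`, and `sample` are the point; the data and formula are made up.
using TuringGLM, DataFrames, Turing

data = DataFrame(y = rand(0:1, 100), x1 = randn(100), x2 = randn(100))

# Logistic regression: the generated Turing model has more than one latent
# variable (here the intercept α and the coefficient vector β).
model = turing_model(@formula(y ~ x1 + x2), data; model = Bernoulli)

# ESS() on its own errors because it supports only a single variable:
ess_chn = sample(model, ESS(), 2_000);
```
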
devmotion commented 2 years ago

You have to use a Gibbs sampler if the variables don't have a common Gaussian prior.
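
A minimal sketch of what that looks like, using a toy model rather than the one TuringGLM generates, and the Gibbs constructor syntax from Turing versions current at the time of this thread:

```julia
using Turing

# Toy model: conditional on s, the prior of m is Gaussian, so ESS can update it;
# s has a non-Gaussian prior and is updated by a different component sampler.
@model function gdemo(x)
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
    for i in eachindex(x)
        x[i] ~ Normal(m, sqrt(s))
    end
end

# Within Gibbs: ESS handles `m` (Gaussian prior), MH handles `s`.
chn = sample(gdemo([1.5, 2.0]), Gibbs(ESS(:m), MH(:s)), 2_000)
```

Note that the way component samplers are paired with variables in `Gibbs` has changed in more recent Turing releases, so it's worth checking the current Gibbs docstring for the exact syntax.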