
Alternatives for linear regression example #109

cscherrer opened this issue 4 years ago

Hi @DilumAluthge, we currently have:

m = @model X, s, t begin
    p = size(X, 2) # number of features
    β ~ Normal(0, s) |> iid(p) # coefficients
    σ ~ HalfNormal(t) # dispersion
    η = X * β # linear predictor
    μ = η # `μ = g⁻¹(η) = η`
    y ~ For(eachindex(μ)) do j
        Normal(μ[j], σ) # `Yⱼ ~ Normal(mean=μⱼ, variance=σ²)`
    end
end;
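
As an aside for anyone reading along: calling the model binds its arguments, and the bound model can then be sampled with `rand`. A minimal sketch, with made-up inputs (the `X`, `s=1.0`, `t=1.0` values below are placeholders for illustration, not values from SossMLJ):

using Soss

X = randn(100, 3)              # hypothetical design matrix
joint = m(X=X, s=1.0, t=1.0)   # bind the model arguments
draw = rand(joint)             # one joint draw: a NamedTuple with draw.β, draw.σ, draw.y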

Just want to be sure you know that `For` can now iterate directly over the values of `μ` rather than its indices, so we can write this as:

m = @model X, s, t begin
    p = size(X, 2) # number of features
    β ~ Normal(0, s) |> iid(p) # coefficients
    σ ~ HalfNormal(t) # dispersion
    η = X * β # linear predictor
    μ = η # `μ = g⁻¹(η) = η`
    y ~ For(μ) do μⱼ
        Normal(μⱼ, σ) # `Yⱼ ~ Normal(mean=μⱼ, variance=σ²)`
    end
end;

or even, using broadcasted sampling with `.~`:

m = @model X, s, t begin
    p = size(X, 2) # number of features
    β ~ Normal(0, s) |> iid(p) # coefficients
    σ ~ HalfNormal(t) # dispersion
    η = X * β # linear predictor
    μ = η # `μ = g⁻¹(η) = η`
    y .~ Normal.(μ, σ) # broadcasted: yⱼ ~ Normal(μⱼ, σ)
end;
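
All three spellings should define the same joint distribution. A rough way to sanity-check that (calling the variants `m1`, `m2`, `m3` here purely for the comparison, and assuming Soss's `logdensity(boundmodel, namedtuple)` works as it does at the time of writing):

X = randn(10, 3)                        # made-up design matrix
draw = rand(m1(X=X, s=1.0, t=1.0))      # joint draw from the first form
lp1 = logdensity(m1(X=X, s=1.0, t=1.0), draw)
lp2 = logdensity(m2(X=X, s=1.0, t=1.0), draw)
lp3 = logdensity(m3(X=X, s=1.0, t=1.0), draw)
lp1 ≈ lp2 ≈ lp3                         # expected to hold if the three forms agree
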
cscherrer commented 4 years ago

Oh, even better for the classification example:

m = @model X, pool begin
    n = size(X, 1) # number of observations
    p = size(X, 2) # number of features
    k = length(pool.levels) # number of classes
    β ~ Normal(0.0, 1.0) |> iid(p, k) # coefficients
    η = X * β # linear predictor
    μ = NNlib.softmax(η; dims=2) # μ = g⁻¹(η) = softmax(η)
    y .~ UnivariateFinite(pool.levels, μ; pool=pool) 
end;
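
For anyone trying this out, a sketch of where the `pool` argument might come from. The labels and data below are invented; `pool.levels` is the same field the model body already uses:

using CategoricalArrays
using MLJBase                     # provides UnivariateFinite
import NNlib                      # the model body calls NNlib.softmax

yc = categorical(["setosa", "versicolor", "setosa", "virginica"])
pool = yc.pool                    # CategoricalPool; pool.levels lists the 3 classes
X = randn(length(yc), 2)          # hypothetical features, one row per observation
draw = rand(m(X=X, pool=pool))    # draw.y should hold one class draw per row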