theogf / AugmentedGaussianProcesses.jl

Gaussian Process package based on data augmentation, sparsity and natural gradients
https://theogf.github.io/AugmentedGaussianProcesses.jl/dev/

MOSVGP not working properly #76

Open vincehass opened 3 years ago

vincehass commented 3 years ago

Hi, I am trying to run a GP multi-output regression with inducing points, but the code seems to be broken somewhere. Here is the code:

using AugmentedGaussianProcesses
const AGP = AugmentedGaussianProcesses
using Distributions
using Plots

N = 1000
X = reshape((sort(rand(N)) .- 0.5) * 40.0, N, 1)
σ = 0.01

function latent(x)
    5.0 * sinc.(x)
end
Y = vec(latent(X) + σ * randn(N))

# Run sparse classification with an increasing number of inducing points
Ms = [4, 8, 16, 32, 64]

# Create an empty array of GPs
models = Vector{AbstractGP}(undef, length(Ms) + 1)
kernel = SqExponentialKernel() # + PeriodicKernel()

for (index, num_inducing) in enumerate(Ms)
    @info "Training with $(num_inducing) points"
    m = MOSVGP(X, Y, kernel, GaussianLikelihood, AnalyticSVI(10), 2, num_inducing)
    @time train!(m, 100) # Train the model for 100 iterations
    models[index] = m    # Save the model in the array
end

I am getting this error:

[ Info: Training with 4 points
ERROR: LoadError: MethodError: no method matching MOSVGP(::Array{Float64,2}, ::Array{Float64,1}, ::SqExponentialKernel, ::Type{GaussianLikelihood}, ::AnalyticVI{Float64,1}, ::Int64, ::Int64)
Closest candidates are:
  MOSVGP(::Union{AbstractArray{T,N} where N, AbstractArray{var"#s28",1} where var"#s28"<:(AbstractArray{T,N} where N)}, ::AbstractArray{var"#s26",1} where var"#s26"<:(AbstractArray{T,1} where T), ::Union{Kernel, AbstractArray{var"#s25",1} where var"#s25"<:Kernel}, ::Union{AbstractArray{var"#s24",1} where var"#s24"<:TLikelihood, TLikelihood}, ::TInference, ::Int64, ::Union{Int64, AbstractArray{var"#s22",1} where var"#s22"<:InducingPoints, InducingPoints}; verbose, atfrequency, mean, variance, optimiser, Aoptimiser, Zoptimiser, ArrayType) where {T<:Real, TLikelihood<:Likelihood, TInference<:Inference}

Any help will be appreciated. Thank you.

theogf commented 3 years ago

Hi, thanks for noticing this bug. These examples are unfortunately obsolete and I haven't fixed them yet. Note that (due to my own confusion) this might not be the multi-output model you want: here it is multi-output in the sense of f_k = \sum_i a_{ki} \tilde{f}_i, i.e. each output is a simple linear combination of shared latent functions.
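
For intuition, here is a minimal standalone sketch of that construction (the latent functions and the mixing matrix A below are invented for illustration and are not part of the package):

# Two outputs built as linear combinations of three shared latent functions f̃_i
latents = [x -> sinc.(x), x -> tanh.(x), x -> abs.(x)]
A = [0.7 0.2 0.1;   # a_{1i}: mixing weights for output 1
     0.1 0.3 0.6]   # a_{2i}: mixing weights for output 2
x = range(-5, 5; length = 100)
Ftilde = reduce(hcat, f(x) for f in latents)  # 100 × 3: columns are the latent f̃_i
F = Ftilde * A'                               # 100 × 2: column k is f_k = Σ_i a_{ki} f̃_i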

For this example to work, you could just replace:

function latent(x)
    5.0 * sinc.(x)
end
Y = vec(latent(X) + σ * randn(N))

by

latent1(x) = 5.0 * sinc.(x)
latent2(x) = 2.0 * tanh.(x)
latent3(x) = 0.01 * abs.(x)
Y = [vec(latent(X) + σ * randn(N)) for latent in [latent1, latent2, latent3]]
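
Based on the closest-candidate signature printed in your error, the call would then look roughly like this (an untested sketch: the candidate also seems to expect a likelihood instance rather than the type, and the 3 latent GPs here are an arbitrary choice):

m = MOSVGP(X, Y, kernel, GaussianLikelihood(), AnalyticSVI(10), 3, num_inducing)
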
vincehass commented 3 years ago

Thank you for the response. However, the Y input of MOSVGP does not seem to accept a Matrix or a vector of vectors; it accepts only a 1-D vector, so I am still getting the same error.