STOR-i / GaussianProcesses.jl

A Julia package for Gaussian Processes
https://stor-i.github.io/GaussianProcesses.jl/latest/

Translate sklearn to Julia request #190

Closed troyrock closed 3 years ago

troyrock commented 3 years ago

I've been trying to figure out how to use GaussianProcesses.jl to duplicate some Python code I found. I'm new to kernels and how they are used, so don't hesitate to explain what I'm not understanding. The Python code uses the squared-exponential (RBF) kernel to draw n random but smooth functions:

import numpy
import sklearn.gaussian_process
import matplotlib.pyplot as plt

n = 10

# 1000 evenly spaced inputs on [0, 1], shaped as a column vector
x = numpy.linspace(0, 1, num=1000)[:, None]
# squared-exponential (RBF) kernel with length scale 0.2
K = sklearn.gaussian_process.kernels.RBF(length_scale=0.2)
k = K(x)  # 1000 x 1000 kernel matrix
# jitter the diagonal so the Cholesky factorization succeeds
l = numpy.linalg.cholesky(k + 1e-13 * numpy.eye(1000))
u = numpy.random.randn(1000, n)
r = numpy.dot(l, u).T  # n draws from the GP prior, one per row
for num in range(n):
    plt.plot(r[num])
plt.show()

When I plot the results for n=10 I get plots that look like the following: [Figure_1: ten smooth random functions]

Unfortunately I get stuck early on when I try to produce the kernel (Gram) matrix of x (k = K(x)). Any help is appreciated.
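For context, sklearn's `K(x)` simply evaluates the RBF kernel at every pair of inputs. A minimal numpy sketch of that one step (illustrative only, not the package's own code; the helper name is made up):

```python
import numpy

def rbf_kernel_matrix(x, length_scale=0.2):
    """Pairwise squared-exponential kernel: k(a, b) = exp(-(a - b)^2 / (2 l^2))."""
    x = numpy.asarray(x).reshape(-1, 1)  # column vector of inputs
    sq_dists = (x - x.T) ** 2            # matrix of pairwise squared distances
    return numpy.exp(-sq_dists / (2 * length_scale ** 2))

x = numpy.linspace(0, 1, num=5)
k = rbf_kernel_matrix(x)  # 5 x 5 symmetric matrix with ones on the diagonal
```

The resulting matrix is symmetric positive semi-definite, which is why the Cholesky factorization (plus a tiny diagonal jitter for numerical safety) works in the snippet above.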

troyrock commented 3 years ago

I was able to do this another way. I wanted to post my results:

using AbstractGPs
using KernelFunctions

l = 0.2  # length scale

x = range(0.0, 1.0; length=100)
# GP prior with an SE kernel of length scale l (applied via input scaling)
f = GP(SEKernel() ∘ ScaleTransform(1 / l))
# 10 draws from the prior at x, with a small jitter of 1e-12
r = transpose(rand(f(x, 1e-12), 10))

Or

using AbstractGPs
using KernelFunctions
using LinearAlgebra
using Plots

l = 0.2  # length scale

x = range(0.0, 1.0; length=100)
# 100 x 100 kernel (Gram) matrix for the SE kernel with length scale l
k = kernelmatrix(SEKernel() ∘ ScaleTransform(1 / l), x)
# jitter the diagonal so the Cholesky factorization succeeds
L = cholesky(k + 1e-13 * I).L
u = randn(100, 10)
r = transpose(L * u)  # 10 draws, one per row

p = plot(r[1, :])
for num in 2:10
    plot!(p, r[num, :])
end
p
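The Cholesky trick both versions rely on works because if u ~ N(0, I) and K = L Lᵀ, then L u ~ N(0, K). A quick numpy check of that fact (illustrative only, not from the thread):

```python
import numpy

rng = numpy.random.default_rng(0)

# small RBF kernel matrix on 4 evenly spaced points, length scale 0.2
x = numpy.linspace(0, 1, num=4)[:, None]
k = numpy.exp(-((x - x.T) ** 2) / (2 * 0.2 ** 2))

# Cholesky factor of the jittered kernel matrix
L = numpy.linalg.cholesky(k + 1e-13 * numpy.eye(4))

# many standard-normal draws; each column of L @ u is one sample from N(0, k)
u = rng.standard_normal((4, 200_000))
samples = L @ u

# empirical covariance of the samples approaches k as the draw count grows
emp_cov = samples @ samples.T / u.shape[1]
```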