Current behavior
It is possible to provide malformed gram matrices to svmpredict when using precomputed kernels:
using LIBSVM
# Training data
X = [-2 -1 -1 1 1 2;
     -1 -1 -2 1 2 1]
y = [1, 1, 1, 2, 2, 2]
# Testing data
T = [-1 2 3;
     -1 2 2]
# Precomputed (l, l) = (6, 6) matrix for training (corresponds to a linear kernel)
K = X' * X
model = svmtrain(K, y, kernel=Kernel.Precomputed)
# Precomputed (l, n) = (6, 3) matrix for prediction
KK = X' * T
# Truncate KK to a single row: shape (1, 3) no longer matches (l, n)
KK_malformed = KK[1:1, :]
ỹ, decision_values = svmpredict(model, KK_malformed)
Output:
julia> ỹ
3-element Vector{Int64}:
 2
 2
 2

julia> decision_values
2×3 Matrix{Float64}:
 NaN    NaN    NaN
   0.0    0.0    0.0
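For comparison, the untruncated matrix KK already has the documented (l, n) = (6, 3) shape, so the well-formed call below should return finite decision values and one prediction per test item (exact output omitted):
# KK has one row per training vector and one column per test item
ỹ, decision_values = svmpredict(model, KK)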
Expected behavior
As described in the README, the gram matrix should have dimensions (l, n) when predicting n items with a model trained on l vectors, and svmpredict should raise an error otherwise. Alternatively, it could also accept gram matrices of shape (k, n), where k is the number of support vectors of the model.
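For reference, here is a minimal sketch of the kind of shape check this implies; the helper name validate_gram and the use of DimensionMismatch are my assumptions, not actual LIBSVM.jl internals:
# Hypothetical helper illustrating the proposed check
function validate_gram(K::AbstractMatrix, l::Integer, k::Integer)
    rows = size(K, 1)
    # Accept one row per training vector (l, n) or, alternatively,
    # one row per support vector (k, n); reject any other shape
    if rows != l && rows != k
        throw(DimensionMismatch("gram matrix has $rows rows; expected $l (training vectors) or $k (support vectors)"))
    end
    return K
end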
I've implemented the behavior as described above and added simple test cases. If you consider this a good change, let me know and I can make a pull request.
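For illustration, tests along those lines might look like the following, reusing model and KK from the snippet above (a sketch only; the actual tests in the pull request may differ):
using Test
@testset "precomputed gram matrix shape" begin
    # Well-formed (l, n) matrix: prediction succeeds, one label per test item
    ỹ, _ = svmpredict(model, KK)
    @test length(ỹ) == size(KK, 2)
    # Truncated matrix: the proposed check should raise an error
    @test_throws DimensionMismatch svmpredict(model, KK[1:1, :])
end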