matthewcarbone closed this issue 1 year ago
Each dimension has its own length scale in the current implementation: https://github.com/ziatdinovmax/gpax/blob/a8e46a0684498f07ff32c18a7f7c47e7f14137d6/gpax/models/gp.py#L238C34-L238C34
@ziatdinovmax Yup, my bad. You do have to set `input_dim` explicitly, though, or it defaults to 1 (which perhaps it should).
Is there any way to use GPax in its current state with kernels of the form e.g.,
$$k(\mathbf{x}, \mathbf{x}') = e^{-\lambda_1(x_1 - x_1')^2} e^{-\lambda_2(x_2 - x_2')^2} e^{-\lambda_3(x_3 - x_3')^2}$$?
where each dimension gets its own length scale, allowing for greater flexibility? This is akin to `ard_num_dims` in GPyTorch. It shouldn't be too hard, right? It should just boil down to modifying the kernel code so that `params["k_length"]` broadcasts correctly.
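For reference, here is a minimal sketch (in plain NumPy, not gpax's actual kernel code) of the broadcasting in question. The function name `ard_rbf_kernel` is hypothetical; the point is that passing a length-scale *vector* of shape `(d,)` instead of a scalar gives the ARD kernel above with no other code changes, since division broadcasts per column.

```python
# Hypothetical sketch, not gpax's implementation: an RBF kernel where
# k_length may be a scalar (isotropic) or a (d,) vector (ARD).
import numpy as np

def ard_rbf_kernel(X1, X2, k_length):
    """k(x, x') = prod_i exp(-(x_i - x'_i)^2 / (2 * l_i^2)).

    X1: (n, d) array, X2: (m, d) array,
    k_length: scalar or (d,) vector of length scales.
    A scalar k_length recovers the isotropic kernel, so the same
    code path handles both cases via NumPy broadcasting.
    """
    scaled1 = X1 / k_length  # (n, d) / (d,) divides each column by its l_i
    scaled2 = X2 / k_length
    # (n, 1, d) - (1, m, d) -> (n, m, d): all pairwise differences
    sq_dists = np.sum((scaled1[:, None, :] - scaled2[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * sq_dists)  # (n, m) kernel matrix
```

With `k_length = np.array([l1, l2, l3])` this matches the product-of-exponentials form above (up to the usual reparameterization between $\lambda_i$ and $1/(2 l_i^2)$).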