Open Balandat opened 6 years ago
Currently, incompatibilities in `shape`, `device` or `dtype` between `LazyTensor`s can be very hard to debug, since they manifest at evaluation time deep inside the code, multiple levels deep.
Let's add validation of these to all the `LazyTensor` constructors and/or methods. For instance, in the `MatmulLazyTensor` constructor we'd validate that both args have the same `device` and `dtype`, and that their `shape`s are compatible for matrix multiplication: https://github.com/cornellius-gp/gpytorch/blob/master/gpytorch/lazy/matmul_lazy_tensor.py#L20 Similarly, we'd check this compatibility in the `_matmul` method here: https://github.com/cornellius-gp/gpytorch/blob/master/gpytorch/lazy/matmul_lazy_tensor.py#L42
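As a rough illustration of the kind of check the constructor could run, here is a minimal sketch. The helper name `validate_matmul_args` is hypothetical, not an existing gpytorch function; it only assumes the operands expose `.device`, `.dtype` and `.shape`, which both `torch.Tensor`s and `LazyTensor`s do (the demo uses `SimpleNamespace` stand-ins so it runs without torch installed).

```python
from types import SimpleNamespace


def validate_matmul_args(left, right):
    """Hypothetical fail-fast check (illustrative name, not the actual
    gpytorch API). Raises immediately at construction time instead of
    letting the mismatch surface deep inside lazy evaluation."""
    if left.device != right.device:
        raise RuntimeError(f"device mismatch: {left.device} vs {right.device}")
    if left.dtype != right.dtype:
        raise RuntimeError(f"dtype mismatch: {left.dtype} vs {right.dtype}")
    # For a matrix product, the inner dimensions must agree.
    if left.shape[-1] != right.shape[-2]:
        raise RuntimeError(
            f"incompatible shapes for matmul: "
            f"{tuple(left.shape)} @ {tuple(right.shape)}"
        )


# Stand-ins for tensors; in practice these would be LazyTensors.
a = SimpleNamespace(device="cpu", dtype="float32", shape=(3, 4))
b = SimpleNamespace(device="cpu", dtype="float32", shape=(5, 2))
try:
    validate_matmul_args(a, b)
except RuntimeError as exc:
    # Error is raised up front, with both shapes in the message.
    print("caught:", exc)
```

Calling such a helper at the top of `MatmulLazyTensor.__init__` would turn a confusing downstream failure into an immediate, readable error pointing at the offending operands.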
There are many other places for validation too, including `active_dims` for `RBFKernel()`.
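For the `active_dims` case, a similar fail-fast check is possible. This is a sketch under assumptions: `validate_active_dims` is an invented helper, and `num_input_dims` stands in for whatever the kernel knows about its input dimensionality.

```python
def validate_active_dims(active_dims, num_input_dims):
    """Hypothetical check (not the actual gpytorch API): ensure every
    entry of active_dims indexes a real input dimension, so an
    out-of-range index fails at kernel construction rather than during
    a forward pass."""
    dims = list(active_dims)
    if not dims:
        raise ValueError("active_dims must not be empty")
    bad = [d for d in dims if not (0 <= d < num_input_dims)]
    if bad:
        raise ValueError(
            f"active_dims entries {bad} are out of range for inputs "
            f"with {num_input_dims} dimensions"
        )
    return dims


# Selecting dimensions 0 and 2 of a 3-dimensional input is fine;
# selecting dimension 5 would raise a ValueError immediately.
print(validate_active_dims([0, 2], num_input_dims=3))
```

The same pattern generalizes: each constructor validates the invariants it relies on, so errors surface with a clear message at the call site that introduced them.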