`torch.set_default_dtype()` has a global effect. When it is called internally in modules, it causes surprising behavior when gptorch is used alongside other PyTorch code that relies on a different dtype.
This PR removes those calls and simply uses `util.TensorType` instead, so gptorch no longer has external side effects.
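A minimal sketch of the problem (assuming only a standard PyTorch install; the tensor names are illustrative, not from gptorch): a library calling `torch.set_default_dtype()` internally changes the dtype of every subsequently created tensor in the process, not just its own.

```python
import torch

torch.set_default_dtype(torch.float32)
a = torch.zeros(2)  # float32, as the caller expects

# Simulates a library setting the default dtype internally:
torch.set_default_dtype(torch.float64)
b = torch.zeros(2)  # now float64 everywhere, surprising the caller

# Passing an explicit dtype avoids depending on the mutable global default:
c = torch.zeros(2, dtype=torch.float32)

torch.set_default_dtype(torch.float32)  # restore the usual default
```

Threading the desired dtype through explicitly (as `util.TensorType` does) keeps the choice local to gptorch instead of leaking it into the caller's process-wide state.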