tensorly / torch

TensorLy-Torch: Deep Tensor Learning with TensorLy and PyTorch
http://tensorly.org/torch/
BSD 3-Clause "New" or "Revised" License

FactorizedEmbedding doesn't work with non-contiguous input #31

Closed jemisjoky closed 1 year ago

jemisjoky commented 1 year ago

I'm using version 0.4.0 of tensorly-torch and am running into an issue where non-contiguous inputs to a FactorizedEmbedding layer raise a runtime error. Here's a minimal working example:

import torch
from tltorch.factorized_layers import FactorizedEmbedding

embedding = FactorizedEmbedding(32, 16)  # 32-entry vocabulary, 16-dimensional embeddings
data = torch.randint(0, 32, (2, 3))
embedding(data.T)  # data.T is a non-contiguous view, which triggers the error

Running this yields:

Traceback (most recent call last):
  File "tltorch_bug_mwe.py", line 6, in <module>
    embedding(data.T)
  File "/Users/jemis/opt/miniconda3/envs/tensorized3.8/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/Users/jemis/opt/miniconda3/envs/tensorized3.8/lib/python3.8/site-packages/tltorch/factorized_layers/factorized_embedding.py", line 100, in forward
    flatenned_input = input.view(-1)
RuntimeError: view size is not compatible with input tensor's size and stride (at least one dimension spans across two contiguous subspaces). Use .reshape(...) instead.
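
As a stop-gap on the caller's side, making the index tensor contiguous before the call appears to avoid the error, since view(-1) only fails on non-contiguous inputs (this is just a user-side workaround, not library code):

# data.T shares storage with data but has swapped strides, so it is not contiguous
print(data.T.is_contiguous())  # False
# .contiguous() copies the indices into contiguous memory, so the internal view(-1) succeeds
embedding(data.T.contiguous())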

Replacing input.view(-1) with input.reshape(-1) does indeed resolve this.
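
The difference is easy to see in isolation: view requires the underlying memory to be contiguous, while reshape falls back to a copy when it isn't. A small standalone check (plain PyTorch, not library code, just illustrating why the change works):

import torch

idx = torch.randint(0, 32, (2, 3)).T  # transposed view, not contiguous
# idx.view(-1)                        # raises the same RuntimeError as above
flat = idx.reshape(-1)                # reshape copies when needed, so this succeeds
print(flat.shape)                     # torch.Size([6])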

JeanKossaifi commented 1 year ago

Thanks @jemisjoky for opening such a good issue, with a clear snippet to reproduce the error and a proposed fix! It's fixed in 5f4d0e5.

jemisjoky commented 1 year ago

Thanks @JeanKossaifi!