tensorly / tensorly

TensorLy: Tensor Learning in Python.
http://tensorly.org

Bug with the mask parameter in parafac when using the NumPy backend #196

Closed quanjunc closed 4 years ago

quanjunc commented 4 years ago

Describe the bug

When I try to use the mask parameter of the parafac function to handle missing values, it fails with a ValueError.

Steps or Code to Reproduce

```python
factors = parafac(tensor=mask_img, init='random', rank=100, n_iter_max=1000,
                  random_state=416, non_negative=True, mask=mask)
```
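For context, here is a rough self-contained version of the call above. The tensor shape (500, 353, 3) is inferred from the traceback below, the random data and the way `mask_img`/`mask` are built are stand-ins rather than the original data, and it assumes the TensorLy release this report was filed against (whose parafac still accepts a non_negative argument) with the NumPy backend.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

tl.set_backend('numpy')

# Stand-in data: shape (500, 353, 3) inferred from the traceback.
mask = tl.tensor((np.random.uniform(size=(500, 353, 3)) > 0.3).astype(float))  # 1 = observed, 0 = missing
mask_img = tl.tensor(np.random.random((500, 353, 3))) * mask                   # missing entries zeroed out

# On the NumPy backend of the affected release, this raises the ValueError shown below.
factors = parafac(tensor=mask_img, init='random', rank=100, n_iter_max=1000,
                  random_state=416, non_negative=True, mask=mask)
```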

Expected behavior

The decomposition should run and fit only the observed entries (where mask is 1), as it does with the PyTorch backend.

Actual result

```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
 in 
----> 1 factors = parafac(tensor=mask_img, init='random', rank=100,n_iter_max=1000,random_state=416,non_negative=True,mask=mask)

~/miniconda3/envs/pytorch/lib/python3.8/site-packages/tensorly/decomposition/candecomp_parafac.py in parafac(tensor, rank, n_iter_max, init, svd, normalize_factors, orthogonalise, tol, random_state, verbose, return_errors, non_negative, mask, cvg_criterion)
    189 
    190         if mask is not None:
--> 191             tensor = tensor*mask + tl.kruskal_to_tensor((None, factors), mask=1-mask)
    192 
    193         mttkrp = unfolding_dot_khatri_rao(tensor, (None, factors), mode)

~/miniconda3/envs/pytorch/lib/python3.8/site-packages/tensorly/kruskal_tensor.py in kruskal_to_tensor(kruskal_tensor, mask)
    186                              T.transpose(khatri_rao(factors, skip_matrix=0)))
    187     else:
--> 188         full_tensor = T.sum(khatri_rao([factors[0]*weights]+factors[1:], mask=mask), axis=1)
    189 
    190     return fold(full_tensor, 0, shape)

~/miniconda3/envs/pytorch/lib/python3.8/site-packages/tensorly/tenalg/_khatri_rao.py in khatri_rao(matrices, weights, skip_matrix, reverse, mask)
     96     # Note: we do NOT use .reverse() which would reverse matrices even outside this function
     97 
---> 98     return T.kr(matrices, weights=weights, mask=mask)

~/miniconda3/envs/pytorch/lib/python3.8/site-packages/tensorly/backend/__init__.py in inner(*args, **kwargs)
    158 
    159     def inner(*args, **kwargs):
--> 160         return _get_backend_method(name)(*args, **kwargs)
    161 
    162     # We don't use `functools.wraps` here because some of the dispatched

~/miniconda3/envs/pytorch/lib/python3.8/site-packages/tensorly/backend/numpy_backend.py in kr(self, matrices, weights, mask)
     67         matrices = [m if i else m*self.reshape(weights, (1, -1)) for i, m in enumerate(matrices)]
     68 
---> 69         return np.einsum(operation, *matrices).reshape((-1, n_columns))*mask
     70 
     71     @property

ValueError: operands could not be broadcast together with shapes (529500,100) (500,353,3)
```

It seems there is an array shape mismatch at line 69 of numpy_backend.py. I tried the PyTorch backend instead and it works well.
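To make that broadcast error concrete, here is a small NumPy-only sketch of the shape collision. The dimensions are scaled-down stand-ins for the (500, 353, 3) tensor and rank 100 from the traceback, just to keep the demo light; it is not code from TensorLy itself.

```python
import numpy as np

# Scaled-down stand-ins for the shapes in the traceback.
dims, rank = (5, 4, 3), 2
factors = [np.random.rand(d, rank) for d in dims]
mask = np.ones(dims)

# numpy_backend.kr builds the Khatri-Rao product as a matricized array of
# shape (prod(dims), rank): here (60, 2), (529500, 100) in the report.
kr = np.einsum('ir,jr,kr->ijkr', *factors).reshape(-1, rank)
print(kr.shape)  # (60, 2)

# The mask, however, still has the full tensor shape, so the element-wise
# multiply at line 69 cannot broadcast and raises the reported ValueError.
try:
    kr * mask
except ValueError as e:
    print(e)  # operands could not be broadcast together with shapes ...
```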

aarmey commented 4 years ago

Could you try the version of TensorLy on master? I believe this will be fixed in the next release.

https://stackoverflow.com/questions/15268953/how-to-install-python-package-from-github
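One common way to install the development version directly from GitHub, assuming pip and git are available (this exact command is not from the thread, just the usual approach the linked answer describes):

```bash
pip install git+https://github.com/tensorly/tensorly.git
```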

quanjunc commented 4 years ago

> Could you try the version of TensorLy on master? I believe this will be fixed in the next release.
>
> https://stackoverflow.com/questions/15268953/how-to-install-python-package-from-github

I have tried the version of TensorLy on master and it works well now.

JeanKossaifi commented 4 years ago

Great, closing this. Feel free to reopen if you have other issues.