tensorly / tensorly

TensorLy: Tensor Learning in Python.
http://tensorly.org

numpy.core._exceptions._ArrayMemoryError #526

Closed SCIKings closed 8 months ago

SCIKings commented 9 months ago

Describe the bug

I want to compute a CP decomposition with parafac, but whenever the tensor is large the call fails with the memory error below. How should I solve this problem? I would be very grateful for your help.

Steps or Code to Reproduce

```python
import numpy as np
from tensorly.decomposition import parafac

np.random.seed(0)
tensor = np.random.randn(512, 3, 3, 512)
weights, factors = parafac(tensor, rank=4)
for factor in factors:
    print(factor.shape)
```

Expected behavior

The decomposition completes and the shape of each factor is printed.

Actual result

numpy.core._exceptions._ArrayMemoryError: Unable to allocate 4.50 TiB for an array with shape (786432, 786432) and data type float64

Versions

Windows-10-10.0.19045-SP0
Python 3.9.17 (main, Jul 5 2023, 20:47:11) [MSC v.1916 64 bit (AMD64)]
NumPy 1.24.3
SciPy 1.11.2
TensorLy 0.8.1

JeanKossaifi commented 8 months ago

You can try a random initialization (init='random'), or use randomized_svd for the svd_fun.
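
A minimal sketch of that suggestion, applied to the reproduction code from the issue: the init='random' option is as suggested above, while selecting the randomized SVD by passing the string 'randomized_svd' through parafac's svd keyword is an assumption about how the backend is chosen in this TensorLy version.

```python
import numpy as np
from tensorly.decomposition import parafac

np.random.seed(0)
tensor = np.random.randn(512, 3, 3, 512)

# Random initialization sidesteps the SVD-based init, which tries to
# allocate a very large matrix for this tensor shape (the MemoryError above).
weights, factors = parafac(tensor, rank=4, init='random')

# Alternative sketch (assumption: the randomized SVD is selected via the
# `svd` keyword string in this version) that keeps the SVD-based init:
# weights, factors = parafac(tensor, rank=4, init='svd', svd='randomized_svd')

for factor in factors:
    print(factor.shape)  # expected: (512, 4), (3, 4), (3, 4), (512, 4)
```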

SCIKings commented 8 months ago

Thank you very much for your reply. The problem has been solved. Thank you for your positive contributions to tensor computation, and I look forward to more of your interesting work.