Closed Prabhat1808 closed 5 years ago
How much memory do you have on your machine? The computed matrix is expected to be large and requires a lot of memory; the output file is approximately 20 GB. Also, I would recommend using Python 3.6.
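For a rough sense of why the output is that large: a dense square float64 distance matrix over N items takes N * N * 8 bytes. A minimal back-of-envelope check (the value of N here is illustrative, not taken from the repo):

```python
# Assumption: output is a dense square float64 distance matrix over n_items.
n_items = 50_000
size_bytes = n_items * n_items * 8        # float64 = 8 bytes per entry
print(f"{size_bytes / 1e9:.0f} GB")       # on the order of 20 GB
```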
I have 64 GB of memory available, but I still get the following error:
Traceback (most recent call last):
File "compute_distmat.py", line 37, in
P.S. I added some new code (which I've commented out), hence the mismatch in line numbers.
Is it 32-bit or 64-bit Python? I would recommend using 64-bit to handle such a huge array.
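One quick way to check the interpreter's bitness from within Python itself is to look at the pointer size:

```python
import struct

# Size of a C pointer ("P") in bits: 64 on a 64-bit build, 32 on a 32-bit one.
print(struct.calcsize("P") * 8)
```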
It is 64-bit. The error is resolved, though: I changed the Python version from 3.6.0 to 3.6.8 and it worked. Thanks!
I have a similar problem, and I have Python 3.6.9 (on a 64-bit system). I have 25 GB of RAM and 107 GB of disk space. Does the post above refer to 64 GB of RAM? Any idea how to fix this?
2020-06-04 00:45:38.006929: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 Using TensorFlow backend. tcmalloc: large alloc 20000800768 bytes == 0xca0a000 @ 0x7f1b981671e7 0x7f1b95d0d5e1 0x7f1b95d71c78 0x7f1b95d71e24 0x7f1b95d6874c 0x7f1b95fa528c 0x7f1b95e04c10 0x7f1b95e0955a 0x566d63 0x59fc4e 0x7f1b95d5d4ed 0x50a2bf 0x50bfb4 0x507d64 0x509a90 0x50a48d 0x50bfb4 0x507d64 0x50ae13 0x634c82 0x634d37 0x6384ef 0x639091 0x4b0d00 0x7f1b97d64b97 0x5b250a
The size of embedding_matrix is 30050001, due to which there is a memory error while computing -2 * np.dot(embedding_matrix.T, embedding_matrix). How do I fix this?
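If the full product cannot fit in 25 GB of RAM, one common workaround is to compute it a stripe at a time and stream the stripes into a disk-backed array. The sketch below is not the repo's actual code: the shapes, block size, and the file name "distmat_part.dat" are illustrative assumptions (for the real data the number of items would be far larger), but the blocking pattern is the same:

```python
import numpy as np

# Compute G = -2 * X.T @ X one horizontal stripe at a time, writing each
# stripe into a memory-mapped file so the full n x n result never has to
# sit in RAM at once. Toy sizes for demonstration.
d, n = 300, 2_000                 # embedding dim x number of items (illustrative)
block = 512                       # rows of G produced per iteration
X = np.random.rand(d, n).astype(np.float32)   # float32 halves the footprint vs float64

G = np.memmap("distmat_part.dat", dtype=np.float32, mode="w+", shape=(n, n))
for start in range(0, n, block):
    stop = min(start + block, n)
    # (stop - start, d) @ (d, n) -> one stripe of the n x n result
    G[start:stop] = -2.0 * (X[:, start:stop].T @ X)
G.flush()
```

Lowering `block` trades peak RAM for more passes over X; switching to float32 also halves the ~20 GB output, at the cost of precision.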