Aurelien-Pelissier closed this issue 5 months ago.
Just as a sanity check: does the unpickling fail if you run pickle.load(f) in a separate Python session from pickle.dump(embedding, f)? That is, first do the pickle dump, then exit Python, start it again, and then do the pickle load?
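Something like these two hypothetical scripts (dump_tsne.py and load_tsne.py are made-up names, just to sketch the two-session test):

# dump_tsne.py: first session, fit and pickle
import pickle
from openTSNE import TSNE
from sklearn import datasets

x = datasets.load_iris()["data"]
embedding = TSNE().fit(x)
with open("iris.pkl", "wb") as f:
    pickle.dump(embedding, f)

# load_tsne.py: fresh Python process, unpickle and transform
import pickle
from sklearn import datasets

x = datasets.load_iris()["data"]
with open("iris.pkl", "rb") as f:
    embedding = pickle.load(f)
print(embedding.transform(x).shape)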
I tried the above on Windows 10 and it works without issue:
import pickle

import openTSNE
from openTSNE import TSNE
from sklearn import datasets

print(openTSNE.__version__)

iris = datasets.load_iris()
x, y = iris["data"], iris["target"]

embedding = TSNE().fit(x)

with open('iris.pkl', 'wb') as f:
    pickle.dump(embedding, f)

with open('iris.pkl', 'rb') as f:
    embedding = pickle.load(f)

y = embedding.transform(x)
print('y.shape:', y.shape)
And it shows:
1.0.1
y.shape: (150, 2)
My environment is as follows:
Python 3.11.5 | packaged by Anaconda, Inc. | (main, Sep 11 2023, 13:26:23) [MSC v.1916 64 bit (AMD64)] on win32
joblib==1.3.2
numpy==1.26.2
openTSNE==1.0.1
scikit-learn==1.3.2
scipy==1.11.4
threadpoolctl==3.2.0
Maybe it has been fixed, or it is caused by other factors?
I have the same issue: it seems to happen when the dataset is large.
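For reference, a minimal sketch of the kind of setup where I see it; the dataset size and file name are made up, and I'm assuming that at this size openTSNE picks the approximate (Annoy-based) neighbor search, whose index is what gets pickled:

import pickle
import numpy as np
from openTSNE import TSNE

# Hypothetical larger dataset; at this size the Annoy index should be the
# code path that round-trips through the temp file discussed below.
x = np.random.RandomState(0).normal(size=(5000, 50))
embedding = TSNE().fit(x)

with open("large.pkl", "wb") as f:
    pickle.dump(embedding, f)
with open("large.pkl", "rb") as f:
    embedding = pickle.load(f)  # this load is where the temp-file error appears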
I think it is due to __setstate__ in nearest_neighbors.py, where AnnoyIndex needs to load from a file 'tmp.ann', but the with block then deletes that file from the temp folder:
with tempfile.TemporaryDirectory() as dirname:
    with open(path.join(dirname, "tmp.ann"), "wb") as f:
        f.write(base64.b64decode(b64_index))
    self.index.load(path.join(dirname, "tmp.ann"))
I wonder if self.index.load can load from memory, or disable memory mapping, so that the file can be freed/closed after loading: https://github.com/spotify/annoy/issues/629
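To illustrate why the memory mapping keeps the file busy, here is a standalone sketch that reproduces the failure mode with plain mmap, without openTSNE or Annoy (my own construction):

import mmap
import os
import tempfile

# Memory-map a file inside a TemporaryDirectory (AnnoyIndex.load also mmaps
# its file) and keep the mapping open while the with block tries to clean up.
m = None
try:
    with tempfile.TemporaryDirectory() as dirname:
        path = os.path.join(dirname, "tmp.ann")
        with open(path, "wb") as f:
            f.write(b"\0" * 4096)
        with open(path, "rb") as f:
            m = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
except OSError as e:
    # Windows: PermissionError because the mapped file cannot be deleted.
    # NFS: "Directory not empty" because of a leftover .nfsXXXX file.
    print("cleanup failed:", e)
finally:
    if m is not None:
        m.close()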
For now, a temporary fix is to keep the file in the temp folder so AnnoyIndex can keep using it. Replace those four lines with:
with tempfile.NamedTemporaryFile(suffix=".ann", delete=False) as tmp_file:
    with open(tmp_file.name, "wb") as f:
        f.write(base64.b64decode(b64_index))
    self.index.load(tmp_file.name)
Note the delete=False. Can you try this, @Aurelien-Pelissier?
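One caveat with delete=False is that the .ann files pile up in the temp folder. A possible refinement, sketched as a hypothetical helper (_load_annoy_index is a made-up name, and I have not tested it inside openTSNE), is a best-effort cleanup at interpreter exit:

import atexit
import base64
import os
import tempfile

def _load_annoy_index(index, b64_index):
    # Hypothetical helper mirroring the __setstate__ body above: write the
    # decoded index to a kept temp file, load it, then register a cleanup.
    with tempfile.NamedTemporaryFile(suffix=".ann", delete=False) as tmp_file:
        tmp_file.write(base64.b64decode(b64_index))
    index.load(tmp_file.name)

    def _cleanup(path=tmp_file.name):
        try:
            os.unlink(path)
        except OSError:
            pass  # still memory-mapped (e.g. on Windows); leave it to the OS

    atexit.register(_cleanup)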
I'm experiencing a similar issue when working on Ubuntu, with openTSNE's files on an NFS file system.
What we see is that nearest_neighbors is trying to delete a file located on the NFS share, a file that is still in use by the process itself.
The error we see is (opentsne==1.0.0):
File "/usr/local/lib/python3.10/dist-packages/openTSNE/nearest_neighbors.py", line 358, in __setstate__
with tempfile.TemporaryDirectory() as dirname:
2024-04-16T13:27:42.383059144Z File "/usr/lib/python3.10/tempfile.py", line 1008, in __exit__
2024-04-16T13:27:42.383061357Z self.cleanup()
2024-04-16T13:27:42.383062548Z File "/usr/lib/python3.10/tempfile.py", line 1012, in cleanup
2024-04-16T13:27:42.383063695Z self._rmtree(self.name, ignore_errors=self._ignore_cleanup_errors)
2024-04-16T13:27:42.383064831Z File "/usr/lib/python3.10/tempfile.py", line 994, in _rmtree
2024-04-16T13:27:42.383066005Z _rmtree(name, onerror=onerror)
2024-04-16T13:27:42.383067058Z File "/usr/lib/python3.10/shutil.py", line 731, in rmtree
onerror(os.rmdir, path, sys.exc_info())
2024-04-16T13:27:42.383069274Z File "/usr/lib/python3.10/shutil.py", line 729, in rmtree
2024-04-16T13:27:42.383070790Z os.rmdir(path)
2024-04-16T13:27:42.383072030Z OSError: [Errno 39] Directory not empty: '/tmp/tmp9zxbbwwq'
My assumption is that the code "works" without issues on standard Unix/Linux systems because of the "delete on last close" practice: a Unix/Linux convention where an application that still has a file open issues a delete (unlink) on it anyway. On a native Linux file system (as opposed to NFS and, of course, Windows), this makes the file invisible to other processes, even though it still exists and is still open.
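For what it's worth, that behaviour is easy to see in a few lines (a standalone demonstration, meant to be run on Linux; on Windows the unlink itself fails, which is exactly the difference at issue):

import os
import tempfile

# On a native Linux filesystem, unlinking an open file succeeds and the data
# stays readable until the last close; on NFS the client instead renames it
# to a hidden .nfsXXXX file, which is what later breaks the rmdir.
fd, path = tempfile.mkstemp()
os.write(fd, b"hello")
os.unlink(path)                # the name is gone...
os.lseek(fd, 0, os.SEEK_SET)
print(os.read(fd, 5))          # ...but the open handle still reads b'hello'
os.close(fd)                   # last close: the storage is reclaimed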
Perhaps I'm wrong, but this makes sense to me, especially after reading this comment in issue #210.
Moved discussion to #210.
I'm reopening a previously closed issue (https://github.com/pavlin-policar/openTSNE/issues/210). In version 1.0.0 the problem still occurs. I tried running my code as administrator and the issue persists. There seems to be a problem where the .ann file is used by two processes simultaneously.
Expected behaviour
On Windows, when trying to save the TSNEEmbedding object or the affinities, I tried pickle.dump(embeddings, open(os.path.join(self.models_path, "tsne_global_embeddings.sav"), "wb")), and also tried saving the data as an array with numpy.save("file.npy", affinities) to reconstruct the object later. Both work just fine under the Linux distributions I tried, but loading them back on Windows breaks with the same error as the save methods, in both scenarios.
Actual behaviour
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\tobia\\AppData\\Local\\Temp\\tmpvvo0p0t8\\tmp.ann'
Steps to reproduce the behavior
opentsne >= 0.6.2