markovmodel / PyEMMA

🚂 Python API for Emma's Markov Model Algorithms 🚂
http://pyemma.org
GNU Lesser General Public License v3.0

OSError: "Unable to open file (file signature not found)" h5py/_objects.pyx:55 related error #1584

Open · suniliisc opened this issue 1 year ago

suniliisc commented 1 year ago

I am experiencing the following error while saving objects such as the cluster and the MSM estimator.

I am working through the tutorial 04-msm-analysis at http://www.emma-project.org/latest/tutorials/notebooks/04-msm-analysis.html.

I have attached the output of conda list: https://github.com/markovmodel/PyEMMA/files/10069918/conda_list.csv
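For quick reference, the versions most relevant to this code path can be printed with a small snippet like the one below (the attached conda list remains the authoritative record):

import h5py
import pyemma

# Versions of the packages involved in the failing save/HDF5 code path.
print('PyEMMA:', pyemma.__version__)
print('h5py:  ', h5py.version.version)
print('HDF5:  ', h5py.version.hdf5_version)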

Executed command: cluster.save('nb4.pyemma', model_name='doublewell_cluster', overwrite=True)
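For context, the save call comes at the end of roughly this sequence from the notebook. The sketch below uses the double-well data and k-means clustering as in the tutorials; the value of k and the exact loading steps are assumptions, not the notebook's verbatim code:

import numpy as np
import mdshare
import pyemma

# Fetch the 2D double-well test data used throughout the tutorials (assumed here).
file = mdshare.fetch('hmm-doublewell-2d-100k.npz', working_directory='data')
with np.load(file) as fh:
    data = fh['trajectory']

# Discretize with k-means (k=50 is an assumption, not the notebook's exact setting).
cluster = pyemma.coordinates.cluster_kmeans(data, k=50, max_iter=50)

# This is the call that raises the OSError shown below.
cluster.save('nb4.pyemma', model_name='doublewell_cluster', overwrite=True)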

The page at http://www.emma-project.org/latest/tutorials/notebooks/04-msm-analysis.html also shows a similar error. The full traceback I get is:

---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
Input In [63], in <cell line: 1>()
----> 1 cluster.save('nb4.pyemma', model_name='doublewell_cluster', overwrite=True)

File ~/anaconda3/envs/MSM/lib/python3.8/site-packages/pyemma/_base/serialization/serialization.py:238, in SerializableMixIn.save(self, file_name, model_name, overwrite, save_streaming_chain)
    212 def save(self, file_name, model_name='default', overwrite=False, save_streaming_chain=False):
    213     r""" saves the current state of this object to given file and name.
    214
    215     Parameters
   (...)
    236     >>> np.testing.assert_equal(m.P, inst_restored.P) # doctest: +SKIP
    237     """
--> 238     from pyemma._base.serialization.h5file import H5File
    239     try:
    240         with H5File(file_name=file_name, mode='a') as f:

File ~/anaconda3/envs/MSM/lib/python3.8/site-packages/pyemma/_base/serialization/h5file.py:22, in <module>
     20 import numpy as np
     21 from io import BytesIO
---> 22 from .pickle_extensions import HDF5PersistentPickler
     24 __author__ = 'marscher'
     26 logger = logging.getLogger(__name__)

File ~/anaconda3/envs/MSM/lib/python3.8/site-packages/pyemma/_base/serialization/pickle_extensions.py:68, in <module>
     64     pass
     67 # we cache this during runtime
---> 68 _DEFAULT_BLOSC_OPTIONS = _check_blosc_avail()
     71 class HDF5PersistentPickler(Pickler):
     72     # stores numpy arrays during pickling in given hdf5 group.
     73     def __init__(self, group, file):

File ~/anaconda3/envs/MSM/lib/python3.8/site-packages/pyemma/_base/serialization/pickle_extensions.py:46, in _check_blosc_avail()
     44 fid, name = tempfile.mkstemp()
     45 try:
---> 46     with h5py.File(name) as h5f:
     47         try:
     48             h5f.create_dataset('test', shape=(1,1), **blosc_opts)

File ~/anaconda3/envs/MSM/lib/python3.8/site-packages/h5py/_hl/files.py:533, in File.__init__(self, name, mode, driver, libver, userblock_size, swmr, rdcc_nslots, rdcc_nbytes, rdcc_w0, track_order, fs_strategy, fs_persist, fs_threshold, fs_page_size, page_buf_size, min_meta_keep, min_raw_keep, locking, alignment_threshold, alignment_interval, **kwds)
    525 fapl = make_fapl(driver, libver, rdcc_nslots, rdcc_nbytes, rdcc_w0,
    526                  locking, page_buf_size, min_meta_keep, min_raw_keep,
    527                  alignment_threshold=alignment_threshold,
    528                  alignment_interval=alignment_interval,
    529                  **kwds)
    530 fcpl = make_fcpl(track_order=track_order, fs_strategy=fs_strategy,
    531                  fs_persist=fs_persist, fs_threshold=fs_threshold,
    532                  fs_page_size=fs_page_size)
--> 533 fid = make_fid(name, mode, userblock_size, fapl, fcpl, swmr=swmr)
    535 if isinstance(libver, tuple):
    536     self._libver = libver

File ~/anaconda3/envs/MSM/lib/python3.8/site-packages/h5py/_hl/files.py:226, in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
    224     if swmr and swmr_support:
    225         flags |= h5f.ACC_SWMR_READ
--> 226     fid = h5f.open(name, flags, fapl=fapl)
    227 elif mode == 'r+':
    228     fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)

File h5py/_objects.pyx:54, in h5py._objects.with_phil.wrapper()

File h5py/_objects.pyx:55, in h5py._objects.with_phil.wrapper()

File h5py/h5f.pyx:106, in h5py.h5f.open()

OSError: Unable to open file (file signature not found)
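A note on what the traceback seems to show: the last PyEMMA frame is _check_blosc_avail(), which opens a file just created by tempfile.mkstemp() via h5py.File(name) without an explicit mode, and the final h5py frame is the read-only branch of make_fid(). Since h5py 3.x defaults to mode='r', opening the empty temp file fails the HDF5 signature check, which would produce exactly this error. The following is a minimal sketch that reproduces the message outside PyEMMA under that assumption (only h5py and the standard library are used):

import os
import tempfile

import h5py

# tempfile.mkstemp() creates an empty file, which is not a valid HDF5 container.
fd, name = tempfile.mkstemp()
os.close(fd)

try:
    # With h5py >= 3 the default mode is 'r' (read-only), so h5py looks for an
    # HDF5 signature in the empty file and raises the same OSError as above.
    with h5py.File(name) as h5f:
        pass
except OSError as exc:
    print(exc)  # e.g. "Unable to open file (file signature not found)"
finally:
    os.unlink(name)

If that is indeed the trigger here, pinning h5py to an older 2.x release in the environment (or using a PyEMMA build that passes an explicit mode to h5py.File) may work around it.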


stale[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.