pliang279 / HighMMT

[TMLR 2022] High-Modality Multimodal Transformer
MIT License
107 stars 7 forks

python private_test_scripts/perceivers/roboticstasks.py model.pt #4

Closed Gust-simon closed 1 month ago

Gust-simon commented 1 month ago

Hello, excuse me, when I run `python private_test_scripts/perceivers/roboticstasks.py model.pt` it reports the following error:

```
Output will be model.pt
Traceback (most recent call last):
  File "private_test_scripts/perceivers/roboticstasks.py", line 27, in <module>
    trains3, valid3, test3 = PushTask.get_dataloader(16, batch_size=18, drop_last=True, test_multimodal_only=True, test_noises=[0])
  File "/tmp/pycharm_project_906/HighMMT-main/datasets/gentle_push/data_loader.py", line 85, in get_dataloader
    train_trajectories = cls.get_train_trajectories(dataset_args)
  File "/tmp/pycharm_project_906/HighMMT-main/datasets/gentle_push/data_loader.py", line 135, in get_train_trajectories
    return _load_trajectories("gentle_push_1000.hdf5", dataset_args)
  File "/tmp/pycharm_project_906/HighMMT-main/datasets/gentle_push/data_loader.py", line 248, in _load_trajectories
    with fannypack.data.TrajectoriesFile(
  File "/root/.local/lib/python3.8/site-packages/fannypack/data/_trajectories_file.py", line 77, in __init__
    with self._h5py_file() as f:
  File "/root/.local/lib/python3.8/site-packages/fannypack/data/_trajectories_file.py", line 354, in _h5py_file
    return h5py.File(self._path, mode=mode, libver="latest")
  File "/root/anaconda3/envs/jtsaw/lib/python3.8/site-packages/h5py/_hl/files.py", line 562, in __init__
    fid = make_fid(name, mode, userblock_size, fapl, fcpl, swmr=swmr)
  File "/root/anaconda3/envs/jtsaw/lib/python3.8/site-packages/h5py/_hl/files.py", line 235, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 102, in h5py.h5f.open
OSError: Unable to synchronously open file (file signature not found)
```
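For context on the final `OSError`: h5py raises "file signature not found" when the file does not start with the HDF5 magic bytes, which usually means a truncated or corrupted download (or an HTML error page saved in place of the real file). A minimal stdlib-only sketch to check this yourself (the path is just an example):

```python
# HDF5 superblock signature: every valid HDF5 file begins with these
# 8 bytes (the signature may also appear at offsets 512, 1024, ...,
# but offset 0 is the common case and the one h5py expects here).
HDF5_SIGNATURE = b"\x89HDF\r\n\x1a\n"

def looks_like_hdf5(path):
    """Return True if the file begins with the HDF5 magic bytes."""
    with open(path, "rb") as f:
        return f.read(8) == HDF5_SIGNATURE

# Example (hypothetical path):
# print(looks_like_hdf5("gentle_push_1000.hdf5"))
```

If this returns `False`, the file on disk is not valid HDF5, regardless of how it was downloaded.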

lvyiwei1 commented 1 month ago

This error usually happens when your downloaded file is incomplete or corrupted. Have you checked if your file's hash matches the original file's?

Gust-simon commented 1 month ago

I'm not sure whether it matches, because I downloaded the HDF5 file directly from Google Cloud Drive through the server. Then I tried to open the HDF5 file and got the screenshot below. Is it possible that the HDF5 file in Google Cloud Drive is damaged and cannot be accessed? (screenshot attached)

lvyiwei1 commented 1 month ago

The SHA512 I got for the file "gentle_push_1000.hdf5" is this:

Algorithm       Hash                                                                   
---------       ----                                                                   
SHA512          07F1C5D8D7C633143D6AB82E155A1908F8C7383E55FEFECBDD617B8E73573240E83...

The whole file is 501,646,117 bytes

Can you check if these match with your downloaded one?
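To compare against the values above, a small sketch that streams the file and reports both the SHA512 digest and the byte count (the path is illustrative):

```python
import hashlib

def sha512_and_size(path, chunk_size=1 << 20):
    """Stream the file in chunks so a ~500 MB download
    doesn't need to fit in memory at once."""
    h = hashlib.sha512()
    size = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
            size += len(chunk)
    # hexdigest() is lowercase; PowerShell's Get-FileHash prints
    # uppercase. Compare case-insensitively, e.g. via .upper().
    return h.hexdigest(), size

# Example (hypothetical path):
# digest, size = sha512_and_size("gentle_push_1000.hdf5")
# print(digest.upper(), size)
```

Hex digests differing only in case are the same hash, so a lowercase result matching an uppercase one still indicates an identical file.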

Gust-simon commented 1 month ago

Yes. The overall file byte count and SHA512 are the same, but my SHA512 is in lowercase.

lvyiwei1 commented 1 month ago

Hmm, that's really strange. I can load the file fine with fannypack (screenshot attached). Can you try the same from the Python console and see if you can directly load the HDF5 with fannypack?
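For reference, a minimal round-trip sketch using `h5py` directly (which is what fannypack's `TrajectoriesFile` calls under the hood, per the traceback); the file name and dataset are made up for illustration:

```python
import h5py

# Write a tiny HDF5 file, then reopen it in read mode the same way
# fannypack does internally. If the real gentle_push_1000.hdf5 is
# intact, h5py.File("gentle_push_1000.hdf5", "r") should succeed;
# "file signature not found" means the bytes on disk are not valid HDF5.
with h5py.File("demo.hdf5", "w") as f:
    f.create_dataset("x", data=[1, 2, 3])

with h5py.File("demo.hdf5", "r") as f:
    print(list(f.keys()))
    print(f["x"][:].tolist())
```

If opening the real file raises the same `OSError` here, the problem is the file itself rather than anything in the HighMMT code path.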

Gust-simon commented 1 month ago

Thank you very much. I found that the problem was with the version of the HDF5 file I downloaded.