dougsm / ggcnn

Generative Grasping CNN from "Closing the Loop for Robotic Grasping: A Real-time, Generative Grasp Synthesis Approach" (RSS 2018)
BSD 3-Clause "New" or "Revised" License

Error opening epoch_29_model.hdf5 file #2

Closed · Tony-TF closed 5 years ago

Tony-TF commented 5 years ago

Hi, I have a question about this file: <HDF5 file "epoch_29_model.hdf5" (mode r)>

    Traceback (most recent call last):
      File "1.py", line 6, in <module>
        img_ids = np.array(f['test/img_id'])
      File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
      File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
      File "/usr/local/lib/python2.7/dist-packages/h5py/_hl/group.py", line 262, in __getitem__
        oid = h5o.open(self.id, self._e(name), lapl=self._lapl)
      File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
      File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
      File "h5py/h5o.pyx", line 190, in h5py.h5o.open
    KeyError: 'Unable to open object (component not found)'

It happens when I open the epoch_29_model.hdf5 file. Thanks!

dougsm commented 5 years ago

Hi! epoch_29_model.hdf5 is a pre-trained Keras model file, and should be opened with keras.models.load_model(). Judging by img_ids = np.array(f['test/img_id']), it looks like you're trying to open it as the gg-cnn training dataset. If you want a training dataset you'll have to generate your own following the instructions in the README.
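Something along these lines should load it (a minimal sketch, assuming the Keras version the repo targets; the path is just an example):

```python
# Minimal sketch: epoch_29_model.hdf5 is a serialized Keras model, so it is
# loaded with Keras rather than read as a plain HDF5 dataset.
from keras.models import load_model

model = load_model('epoch_29_model.hdf5')  # adjust the path to where you saved it
model.summary()
```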

Cheers

Tony-TF commented 5 years ago

When I run evaluate.py I get KeyError: 'Unable to open object (component not found)'. What should I do? I think the dataset_rotated_width_zoom_171219_1516.hdf5 file may be empty, so I tried opening the epoch_29_model.hdf5 file to check. I look forward to your reply. Thanks!

dougsm commented 5 years ago

Hi, I think you are getting that error because, as I said above, the file you are opening is not the right one. You will need to create the dataset file with generate_dataset.py, following the instructions in the 'Training' section of README.md; the resulting file should be roughly 25 GB. Hope that helps.
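Once that file exists, reading it the way your script did should work. A minimal sketch, assuming the generated file keeps the name from your earlier message (substitute whatever generate_dataset.py actually produced on your machine):

```python
# Minimal sketch: open the *generated dataset* file, not the model file.
import h5py
import numpy as np

with h5py.File('dataset_rotated_width_zoom_171219_1516.hdf5', 'r') as f:
    img_ids = np.array(f['test/img_id'])  # the key from your traceback
    print(img_ids.shape)
```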

Tony-TF commented 5 years ago

Hi, I have a new problem while generating the data: it runs out of memory. I only have 8 GB of RAM. Do you know how to deal with this?

dougsm commented 5 years ago

Hi, You might be able to modify the script so that it writes the generated data to the hdf5 file incrementally rather than trying to store it all in memory at once. Hope that helps.
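As a rough illustration of the idea (not the repo's actual generate_dataset.py code), h5py lets you create resizable datasets and append one sample at a time, so only the current image needs to live in RAM:

```python
# Rough sketch of incremental HDF5 writing with h5py; the dataset name, image
# shape and the generate_images() helper are placeholders for illustration only.
import h5py
import numpy as np

IMG_SHAPE = (300, 300)  # placeholder image size

def generate_images():
    # Placeholder generator: yield one processed image at a time instead of
    # building one huge in-memory array.
    for _ in range(10):
        yield np.random.rand(*IMG_SHAPE).astype('float32')

with h5py.File('dataset_incremental.hdf5', 'w') as f:
    dset = f.create_dataset('depth', shape=(0,) + IMG_SHAPE,
                            maxshape=(None,) + IMG_SHAPE,
                            chunks=(1,) + IMG_SHAPE, dtype='float32')
    for img in generate_images():
        dset.resize(dset.shape[0] + 1, axis=0)  # grow the dataset by one sample
        dset[-1] = img                          # written to disk, not held in RAM
```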