uchicago-cs / deepdish

Flexible HDF5 saving/loading and other data science tools from the University of Chicago
http://deepdish.io
BSD 3-Clause "New" or "Revised" License

Got an error when I tried to load the data from MATLAB #8

Closed: ypxie closed this issue 8 years ago

ypxie commented 8 years ago

In MATLAB:

h5disp('/home/test.h5')
HDF5 test.h5
Group '/'
    Attributes:
        'TITLE':  ''
        'CLASS':  'GROUP'
        'VERSION':  '1.0'
        'PYTABLES_FORMAT_VERSION':  '2.1'
        'DEEPDISH_IO_VERSION':  8
    Dataset 'data'
        Size:  32x32x3x100
        MaxSize:  32x32x3x100
        Datatype:  H5T_IEEE_F64LE (double)
        ChunkSize:  32x32x3x2
        Filters:  unrecognized filter (blosc)
        Attributes:
            'CLASS':  'CARRAY'
            'VERSION':  '1.1'
            'TITLE':  ''
    Dataset 'label'
        Size:  100
        MaxSize:  100
        Datatype:  H5T_IEEE_F64LE (double)
        ChunkSize:  []
        Filters:  none
        FillValue:  0.000000
        Attributes:
            'CLASS':  'ARRAY'
            'VERSION':  '2.4'
            'TITLE':  ''
            'FLAVOR':  'numpy'

thisdata = h5read('/home/test.h5', '/data');
Error using h5readc
The HDF5 library encountered an error and produced the following stack trace information:

H5PL__find         can't open directory
H5PL_load          search in paths failed
H5Z_pipeline       required filter 'blosc' is not registered
H5D__chunk_lock    data pipeline read failed
H5D__chunk_read    unable to read raw data chunk
H5D__read          can't read data
H5Dread            can't read data

Error in h5read (line 58)
[data,var_class] = h5readc(Filename,Dataset,start,count,stride);
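
For reference, the same file still loads without problems from Python, so the data itself seems intact; only MATLAB is missing the filter. A minimal check, assuming the file was saved as a dict with 'data' and 'label' entries (which matches the h5disp listing above):

import deepdish as dd

# The blosc-compressed file opens fine from Python, since PyTables
# (which deepdish uses under the hood) has the blosc filter registered.
contents = dd.io.load('/home/test.h5')
print(contents['data'].shape)   # the 'data' array listed by h5disp
print(contents['label'].shape)  # the length-100 'label' vector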

gustavla commented 8 years ago

MATLAB does not support 'blosc', the default compression in deepdish (and PyTables). Because of its narrow support, I plan to change the default in the next version of deepdish. Until then, you can change it manually:

dd.io.save('test.h5', data, compression='zlib')  # widely supported compression
dd.io.save('test.h5', data, compression=None)    # no compression

It's unfortunate, since 'blosc' is far faster than 'zlib' and compresses about as well in most situations.
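
If you already have files written with the blosc default, one workaround is to round-trip them through deepdish with a different compression setting. A rough sketch (the output paths here are just placeholders):

import deepdish as dd

# Round-trip an existing blosc-compressed file through deepdish,
# writing it back with a compression that MATLAB's HDF5 can read.
contents = dd.io.load('/home/test.h5')
dd.io.save('/home/test_zlib.h5', contents, compression='zlib')

# Or drop compression entirely:
dd.io.save('/home/test_raw.h5', contents, compression=None)

After that, h5read('/home/test_zlib.h5', '/data') should work in MATLAB without the unregistered-filter error.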

ypxie commented 8 years ago

Thanks for your reply! That works!