uchicago-cs / deepdish

Flexible HDF5 saving/loading and other data science tools from the University of Chicago
http://deepdish.io
BSD 3-Clause "New" or "Revised" License

Overflow Error When Attempting to Save Large Amounts of Data #41

Open 0Maximus0 opened 4 years ago

0Maximus0 commented 4 years ago

I have been using deepdish to save dictionaries containing large amounts of data, and I ran into the following error when attempting to save a particularly large file. I have tried saving both with and without compression; the error occurs either way. Can you help me out with it, please?

File "C:/Users/xxxxxxxx/Documents/Python_Scripts/Data_Scripts/Finalized_Data_Review_Presentations/data_save_cctest.py", line 513, in <module>
    dd.io.save('%s/Data/%s%s_cc_data.h5' % (directory, m_list[m], list_type), cc_data, ('blosc', 9))
  File "C:\Users\xxxxxxxx\AppData\Local\Continuum\anaconda2\lib\site-packages\deepdish\io\hdf5io.py", line 596, in save
    filters=filters, idtable=idtable)
  File "C:\Users\xxxxxxxx\AppData\Local\Continuum\anaconda2\lib\site-packages\deepdish\io\hdf5io.py", line 304, in _save_level
    _save_pickled(handler, group, level, name=name)
  File "C:\Users\xxxxxxxx\AppData\Local\Continuum\anaconda2\lib\site-packages\deepdish\io\hdf5io.py", line 172, in _save_pickled
    node.append(level)
  File "C:\Users\xxxxxxxx\AppData\Local\Continuum\anaconda2\lib\site-packages\tables\vlarray.py", line 547, in append
    self._append(nparr, nobjects)
  File "tables/hdf5extension.pyx", line 2032, in tables.hdf5extension.VLArray._append
OverflowError: Python int too large to convert to C long
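The traceback shows the failure happens in `_save_pickled`, i.e. deepdish fell back to pickling some part of the dictionary into a single PyTables `VLArray` row. A plausible explanation (an assumption, not confirmed by the maintainers here) is that on Windows a C `long` is 32 bits even in 64-bit Python, so once the pickled byte string grows past roughly 2 GB (2**31 - 1 bytes), its length can no longer be converted for the HDF5 C layer and `OverflowError` is raised. A common workaround is to avoid producing any single huge pickled buffer, e.g. by saving each top-level key to its own file or group. The sketch below (with made-up toy data; `cc_data` and the key names are illustrative, not from the report) just demonstrates the per-key splitting idea and the size check:

```python
import pickle

# Toy stand-in for the reporter's large dictionary (hypothetical data).
cc_data = {'run_a': list(range(1000)), 'run_b': list(range(1000))}

# Pickle each top-level value separately instead of the whole dict at once.
# Each buffer's length must stay below 2**31 - 1 bytes to be safe on
# platforms where C long is 32-bit (e.g. Windows).
chunks = {key: pickle.dumps(value, protocol=2)
          for key, value in cc_data.items()}

# In this toy example every chunk is tiny, so the check passes.
print(all(len(buf) < 2 ** 31 - 1 for buf in chunks.values()))  # → True
```

With deepdish itself, the analogous move would be calling `dd.io.save` once per key (e.g. `dd.io.save(path_for(key), {key: value}, ('blosc', 9))` in a loop, where `path_for` is a hypothetical helper), so that no single call has to pickle the entire dictionary.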