pysat / pysatCDF

Python reader for NASA CDF file format
BSD 3-Clause "New" or "Revised" License

ZLIB_ERROR #14

Closed: mattkjames7 closed this issue 5 years ago

mattkjames7 commented 5 years ago

Hi rstoneback, I'm having a peculiar problem using pysatCDF to load a specific dataset on a specific machine. I'm trying to load data from the PWE instrument, part of the Arase mission (link to the files below). I managed to get the CDF files to load on my own desktop, but I get a "ZLIB_ERROR" (see below) every time I try on the university server:

In [1]: import pysatCDF

In [2]: cdf = pysatCDF.CDF('erg_pwe_hfa_l2_spec_high_20170323_v01_01.cdf')
---------------------------------------------------------------------------
IOError                                   Traceback (most recent call last)
<ipython-input-2-83e7fffe3f12> in <module>()
----> 1 cdf = pysatCDF.CDF('erg_pwe_hfa_l2_spec_high_20170323_v01_01.cdf')

/home/m/mkj13/.local/lib/python2.7/site-packages/pysatCDF/_cdf.pyc in __init__(self, fname)
     79             self._read_all_z_attribute_data()
     80         else:
---> 81             raise IOError(fortran_cdf.statusreporter(status))
     82 
     83     def __enter__(self):

IOError: ZLIB_ERROR: Error during ZLIB decompression.

The data I am using are from: https://ergsc.isee.nagoya-u.ac.jp/data/ergsc/satellite/erg/pwe/hfa/l2/spec/high/

I have not had this problem with any other datasets yet. I have also tried using different versions of Python, but I get exactly the same problem. I don't suppose you have any idea how to fix this?

Thanks, Matt.

mattkjames7 commented 5 years ago

After some messing around, I realised that the ZLIB_ERROR was probably caused by the extracted file (~1 GB) exceeding the quota limit in my home directory; changing $CDF_TMP to point to a directory without a quota stopped this error. Unfortunately, I now get a segmentation fault during the loading process on both my home desktop and the university server. Could this be a problem with the version of CDF compiled into pysatCDF? I'm not sure how to tell what version of CDF was used to create a file, but these may well have been made with CDF version 3.7.0.
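
For reference, the workaround looks roughly like this (a minimal sketch; /scratch/cdf_tmp is only an example path, and exporting CDF_TMP in the shell before starting Python works just as well):

import os

# Point the CDF library's temporary/scratch space at a filesystem without a
# quota *before* opening the file. Any writable directory with enough free
# space will do; the path below is hypothetical.
os.environ['CDF_TMP'] = '/scratch/cdf_tmp'

import pysatCDF

# Retry the load that previously failed with ZLIB_ERROR.
cdf = pysatCDF.CDF('erg_pwe_hfa_l2_spec_high_20170323_v01_01.cdf')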

rstoneback commented 5 years ago

Hi Matt,

Thanks for the report. I will add incorporating the latest CDF library into pysatCDF to the to-do list.

If you want to get a jump on it, you can try adding the latest CDF code to your local version of pysatCDF, updating the path name in setup.py, and invoking python setup.py install.
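
As a quick sanity check after rebuilding (a sketch only; it assumes the .data dictionary attribute described in the pysatCDF README and reuses the file from this thread):

import pysatCDF

# Open the file that previously triggered the error and print a few of its
# variable names, just to confirm the rebuilt extension loads cleanly.
fname = 'erg_pwe_hfa_l2_spec_high_20170323_v01_01.cdf'
with pysatCDF.CDF(fname) as cdf:
    print(sorted(cdf.data.keys())[:5])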

Cheers, Russell


mattkjames7 commented 5 years ago

Hi Russell,

Thanks for the reply. I tried updating the version of CDF. That half worked on my desktop: I could load one file, but if I tried to load another afterwards it complained about an invalid pointer. Unfortunately, it made no difference on the university computer. I have since tried opening CDFs from other components of the PWE instrument and they have worked fine, so I think the problem may lie with the files themselves.

Thanks, Matt.