Closed: potter2 closed this issue 9 years ago
I believe I've encountered this issue in previous versions when loading a variable that has a large number of time steps.
From: potter2
Date: Thursday, July 24, 2014 3:28 PM
Subject: [uvcdat] load and close with large data files doesn't work (#511)
If I select a large file (672,17,192,288), the load and close freezes UVCDAT. Nothing is loaded and I have to force quit UVCDAT.
@potter2 still true?
haven’t tried this with 2.0
ok closing for now, please reopen if still true
I believe I encountered the same issue, except that it didn't freeze my UVCDAT, probably because I have a rather large 64 GB of physical RAM. The data (ERA-I, 0.75x0.75) has shape (7304,3,214,480). On disk it is around 5.0 GB, but upon loading it uses about 20 GB of RAM. I noticed another issue, #1030, about problems with large files. In my case I wasn't doing any plotting, just reading in the data.
Also, after loading, the data seem to be broken: the value is 57499.927934 everywhere, at every time slice. If I read a smaller subset, e.g. var=fin('z',time=slice(0,1000)), it is OK. The maximum number of time slices I can read in one go is 6188; one more and it breaks.
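For concreteness, a minimal sketch of the reads described above (the filename is hypothetical; the variable name 'z', the shapes, and the 6188-slice threshold are from my setup). The ~20 GB in memory would be roughly consistent with the variable being expanded to 64-bit floats on read: 7304 x 3 x 214 x 480 x 8 bytes is about 18 GB, versus ~5 GB packed on disk.

```python
# Minimal sketch of the reads described above (assumed filename; the
# variable name 'z' and the shapes are from the ERA-I file in this comment).
import cdms2

fin = cdms2.open('era_interim_z.nc')  # hypothetical path to the ~5 GB file

# Full read: uses ~20 GB of RAM and every value comes back as 57499.927934.
var_full = fin('z')                      # shape (7304, 3, 214, 480)
print(var_full.shape, var_full.min(), var_full.max())

# Subset read: works as expected.
var_sub = fin('z', time=slice(0, 1000))  # shape (1000, 3, 214, 480)
print(var_sub.shape, var_sub.min(), var_sub.max())

# Up to 6188 time slices in a single call is fine; 6189 breaks as above.
var_edge = fin('z', time=slice(0, 6188))

fin.close()
```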
The UVCDAT version is 2.8.0, installed via conda. Below is the list of packages in the conda environment after installing UVCDAT, numpy, scipy, matplotlib, and basemap:
asciidata=2.10=np19py27_0
backports=1.0=py27_0
basemap=1.0.7=np19py27_0
binaryio=2.10=np19py27_0
cairo=1.12.18=6
cdat_info=2.8=py27_2
cdms2=2.8.1=np19py27_0
cdtime=2.10=np19py27_0
cdutil=2.8=py27_2
cffi=1.9.1=py27_0
clapack=3.2.1=2
cryptography=1.7.1=py27_0
cssgrid=2.10=np19py27_0
curl=7.45.0=0
cycler=0.10.0=py27_0
dbus=1.10.10=0
decorator=4.0.11=py27_0
distarray=2.8=py27_2
dsgrid=2.10=np19py27_0
dv3d=2.6=UVCDAT
enum34=1.1.6=py27_0
esmf=ESMF_6_3_0rp1_ESMP_01=np19py27_2
expat=2.1.0=0
eztemplate=2.8=py27_2
ffmpeg=2.7.0=UVCDAT
fontconfig=2.11.1=6
freetype=2.5.5=1
functools32=3.2.3.2=py27_0
g2clib=1.4.0b=2
genutil=2.10=np19py27_0
geos=3.3.3=0
get_terminal_size=1.0.0=py27_0
glib=2.50.2=1
gst-plugins-base=1.8.0=0
gstreamer=1.8.0=0
hdf5=1.8.15.1=3
hdf5tools=2.6=UVCDAT
icu=54.1=0
idna=2.2=py27_0
ipaddress=1.0.18=py27_0
ipython=5.3.0=py27_0
ipython_genutils=0.2.0=py27_0
jasper=1.900.1=2
jbig=2.1=0
jpeg=9b=0
lapack=3.4.2=UVCDAT
libcdms=2.4.1=UVCDAT
libcf=1.0.beta11=UVCDAT
libdrs_f=1.0.1=2
libffi=3.2.1=1
libgcc=5.2.0=0
libgfortran=3.0.0=1
libiconv=1.14=0
libnetcdf=4.3.3.1=3
libpng=1.6.17=0
libtiff=4.0.6=3
libxcb=1.12=1
libxml2=2.9.4=0
lmoments=2.10=np19py27_0
matplotlib=1.4.3=np19py27_2
mkl=2017.0.1=0
natgrid=2.10=np19py27_0
ncurses=5.9=10
netcdf4=1.1.9=np19py27_0
nose=1.3.7=py27_1
numpy=1.9.2=py27_2
openblas=0.2.14=4
openssl=1.0.2k=2
ort=2.10=np19py27_0
ossuuid=1.6.2=2
output_viewer=1.2.2=py27_0
path.py=10.3.1=py27_0
pathlib2=2.2.1=py27_0
pcre=8.39=1
pexpect=4.2.1=py27_0
pickleshare=0.7.4=py27_0
pip=9.0.1=py27_1
pixman=0.32.6=0
proj4=4.9.2=UVCDAT
prompt_toolkit=1.0.14=py27_0
ptyprocess=0.5.1=py27_0
py2cairo=1.10.0=py27_2
pyasn1=0.2.3=py27_0
pycairo=1.10.0=py27_0
pycparser=2.17=py27_0
pygments=2.2.0=py27_0
pyopenssl=16.2.0=py27_0
pyparsing=2.0.3=py27_0
pyqt=4.11.4=py27_4
python=2.7.13=0
python-dateutil=2.6.0=py27_0
pytz=2017.2=py27_0
qt=4.8.7=3
readline=6.2=2
regridpack=2.10=np19py27_0
requests=2.14.2=py27_0
scandir=1.5=py27_0
scikit-learn=0.16.1=np19py27_0
scipy=0.16.0=np19py27_1
setuptools=27.2.0=py27_0
shgrid=2.10=np19py27_0
simplegeneric=0.8.1=py27_1
sip=4.18=py27_0
six=1.10.0=py27_0
sqlite=3.13.0=0
subprocess32=3.2.7=py27_0
thermo=2.10=py27_0
tk=8.5.18=0
traitlets=4.3.2=py27_0
trends=2.10=np19py27_0
udunits2=2.2.17=2
unidata=2.8=np19py27_2
uvcdat=2.8.0=0
uvcmetrics=2.2.1=np19py27_0
vcs=2.6=UVCDAT
vcsaddons=2.10=py27_0
vistrails=master=UVCDAT
vtk=7.1.0.2.6=uvcdat_master
wcwidth=0.1.7=py27_0
wheel=0.29.0=py27_0
wk=2.10=py27_0
x264=20151006.2245=UVCDAT
xmgrace=2.10=py27_0
xz=5.2.2=1
yasm=1.2.0=UVCDAT
zlib=1.2.8=3
zonalmeans=2.10=np19py27_0