Closed: douyuqi closed this issue 2 years ago.
Hi, could you post the print output of dset? Also, how much memory does your machine have? My guess is that you are running out of memory; see whether adjusting the chunk size avoids this error.
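For reference, a minimal sketch of that chunking suggestion, assuming the dataset is opened with xgrads' open_CtlDataset (the file paths below are placeholders, not from this issue):

```python
from xgrads import open_CtlDataset

# open_CtlDataset returns a lazily loaded, dask-backed xarray.Dataset
dset = open_CtlDataset('model.ctl')   # placeholder path

# limit each dask chunk to a single time step so that only a small slice of
# the binary file is read per task, then write the NetCDF file chunk by chunk
dset = dset.chunk({'time': 1})
dset.to_netcdf('model.nc')            # placeholder output path
```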
I've put the print output in an attached text file; the first part is the dset info and the rest is the error. My machine has 16 GB of RAM and nothing else was running, so memory should not be the problem. Thanks for the reply!
I don't see any attachment. Could you paste it directly here?
```
WARNING: expected binary file size: 94003200, actual size: 54835200

<xarray.Dataset>
Dimensions:      (time: 12, y: 320, x: 360)
Coordinates:
  * time         (time) datetime64[ns] 2020-01-01 2020-02-01 ... 2020-12-01
  * y            (y) float64 0.0 1.5e+04 3e+04 ... 4.755e+06 4.77e+06 4.785e+06
  * x            (x) float64 0.0 1.5e+04 3e+04 ... 5.355e+06 5.37e+06 5.385e+06
Data variables: (12/17)
    emi_index    (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
    demi_index   (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
    emis_index   (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
    dep_index    (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
    diff_index   (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
    trans_index  (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
    ...           ...
    cpt_demi     (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
    cpt_emis     (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
    cpt_in       (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
    cpt_dep      (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
    cpt_diff     (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
    cpt_out      (time, y, x) >f4 dask.array<chunksize=(1, 320, 360), meta=np.ndarray>
Attributes:
    comment:  pm 1/1
    storage:  99
    title:    CUACE_emi_index
    undef:    -9999.0
    pdef:     lcc
```
```
Traceback (most recent call last):
  File "E:\Project\河南\河南气象局\河南气象\code\BiteGrADS-f15f4cf01283c97a73f03dd7e67722a66366ecc0\GrADS.py", line 10, in <module>
    dset.to_netcdf('E:\EMI_2021_monthly.nc')
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\xarray\core\dataset.py", line 1900, in to_netcdf
    return to_netcdf(
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\xarray\backends\api.py", line 1086, in to_netcdf
    writes = writer.sync(compute=compute)
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\xarray\backends\common.py", line 167, in sync
    delayed_store = da.store(
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\array\core.py", line 1040, in store
    compute_as_if_collection(Array, store_dsk, store_keys, **kwargs)
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\base.py", line 313, in compute_as_if_collection
    return schedule(dsk2, keys, **kwargs)
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\threaded.py", line 79, in get
    results = get_async(
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\local.py", line 517, in get_async
    raise_exception(exc, tb)
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\local.py", line 325, in reraise
    raise exc
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\local.py", line 223, in execute_task
    result = _execute_task(task, data)
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\core.py", line 121, in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\core.py", line 121, in <genexpr>
    return func(*(_execute_task(a, cache) for a in args))
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\core.py", line 121, in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\core.py", line 121, in <genexpr>
    return func(*(_execute_task(a, cache) for a in args))
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\core.py", line 121, in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\core.py", line 121, in <genexpr>
    return func(*(_execute_task(a, cache) for a in args))
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\dask\core.py", line 121, in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\xgrads\io.py", line 448, in read_var
    data = __read_continuous(file, pos, shape, dtype,
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\xgrads\io.py", line 492, in __read_continuous
    data = np.memmap(f, dtype=dtype, mode='r', offset=offset,
  File "C:\Users\douyuqi\AppData\Local\Programs\Python\Python39\lib\site-packages\numpy\core\memmap.py", line 267, in __new__
    mm = mmap.mmap(fid.fileno(), bytes, access=acc, offset=start)
OSError: [WinError 8] Not enough memory resources are available to process this command.
```
I noticed the warning: expected binary file size: 94003200, actual size: 54835200. Based on the grid, the data file should be 320 × 360 × 12 × 17 variables × 4 bytes = 94,003,200 bytes (about 94 MB), but the actual file is only about 54 MB, so the data file is probably incomplete?
If the error still occurs after you fix that, you can send me a copy of the data and I will debug it.
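A quick sanity check of that size arithmetic, using the dimensions and variable count from the dset print above (the binary file path in the comment is a placeholder, not from this issue):

```python
import os

nt, ny, nx = 12, 320, 360   # dimensions from the dset print above
nvars      = 17             # "Data variables: (12/17)" means 17 variables in total
itemsize   = 4              # >f4 is a 4-byte big-endian float

expected = nt * ny * nx * nvars * itemsize
print(expected)             # 94003200, matching the "expected binary file size" warning

# compare against the real binary file (placeholder path):
# actual = os.path.getsize('path/to/EMI_2021_monthly.grd')
```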
Hi, I am using Python 3.9 and I have a problem using xgrads to convert a GrADS dataset to a NetCDF dataset with the function to_netcdf().
The error information is:
OSError: [WinError 8] Not enough memory resources are available to process this command.
Any ideas?
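For context, the failing call is roughly the following sketch; the .ctl path is a simplified placeholder, and I assume the dataset is opened with xgrads' open_CtlDataset (the to_netcdf call and output path are taken from the traceback above):

```python
from xgrads import open_CtlDataset

# open the GrADS control file lazily with xgrads (placeholder path)
dset = open_CtlDataset('EMI_2021_monthly.ctl')

print(dset)                                # the dataset itself prints fine
dset.to_netcdf(r'E:\EMI_2021_monthly.nc')  # fails here with OSError: [WinError 8]
```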