stcorp / harp

Data harmonization toolset for scientific earth observation data
http://stcorp.github.io/harp/doc/html/index.html
BSD 3-Clause "New" or "Revised" License

CLibraryError #301

Closed: pm21ms212 closed this issue 7 months ago

pm21ms212 commented 7 months ago

```
CLibraryError: [HDF5] H5F__super_read(): truncated file: eof = 45882317, sblock->base_addr = 0, stored_eof = 76428232 (major="File accessibility", minor="File has been truncated") (D:\bld\hdf5_1692562111796\work\src\H5Fsuper.c:603)
```
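The numbers in the message carry the diagnosis: the file on disk is 45882317 bytes, while its HDF5 superblock records an expected size of 76428232 bytes, so at least one input product is incomplete, most likely a partial download. A minimal sketch, assuming h5py is installed and the S5P*.nc files sit in the working directory, that flags any granule whose HDF5 structure cannot even be opened:

```python
# Minimal sketch (assumption: the S5P*.nc files are in the current directory).
# A truncated product already fails when HDF5 reads the superblock at open time.
import glob

import h5py

bad_files = []
for path in sorted(glob.glob("S5P*.nc")):
    try:
        with h5py.File(path, "r"):
            pass  # file opens cleanly; superblock and metadata are readable
    except OSError as err:
        bad_files.append(path)
        print(f"corrupt or incomplete: {path} ({err})")

print(f"{len(bad_files)} problematic file(s)")
```

Any file reported by this check would need to be downloaded again before retrying the merge.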

I've run this code multiple times to merge Sentinel-5P data with HARP for 2022 and 2023, and it worked fine for those years. But when I run it for the 2018, 2019, 2020, and 2021 data, it keeps giving me this error, even if I merge over a smaller range in the bin_spatial operation. The complete code is below:

```python
import os

import harp
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.ticker as mticker
import cartopy.crs as ccrs
from cartopy.mpl.gridliner import LONGITUDE_FORMATTER, LATITUDE_FORMATTER
from cmcrameri import cm
import eofetch

os.chdir("E:/1_t")

# Per-product operations: filter on validity, keep the CH4 variables,
# and regrid onto a regular 0.1 degree grid.
operations = ";".join([
    "CH4_column_volume_mixing_ratio_dry_air_validity>50",
    "keep(latitude_bounds,longitude_bounds,datetime_start,datetime_length,CH4_column_volume_mixing_ratio_dry_air)",
    "derive(datetime_stop {time} [days since 2000-01-01])",
    "derive(datetime_start [days since 2000-01-01])",
    "exclude(datetime_length)",
    "bin_spatial(101, 41, 0.1, 161, -5, 0.1)",  # adjusted for France (lat 41..51, lon -5..11)
    "derive(latitude {latitude})",
    "derive(longitude {longitude})",
])

# Reduce operations applied while merging the individual orbits.
reduce_operations = "squash(time, (latitude, longitude, latitude_bounds, longitude_bounds));bin()"

filenames = "S5P*.nc"
merged = harp.import_product(filenames, operations, reduce_operations=reduce_operations)

print(merged)

harp.export_product(merged, 's5p-ch4_L3_1_2021.nc')
```
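As a follow-up check, a sketch under the assumption that the granules are still in the working directory: importing each file individually through harp.import_product exercises the same HDF5 read path that raised the CLibraryError, so the loop below points at the exact product that needs to be re-downloaded (catching harp.Error, which I believe is the base class the C library errors derive from in the Python bindings):

```python
# Sketch: try to import each S5P granule on its own (no operations) to find
# the truncated product; re-download the file(s) that fail, then rerun the merge.
import glob

import harp

failed = []
for path in sorted(glob.glob("S5P*.nc")):
    try:
        harp.import_product(path)
    except harp.Error as err:
        failed.append(path)
        print(f"{path}: {err}")

print(f"{len(failed)} file(s) failed to import")
```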