Closed SamothKoc closed 5 months ago
What happens if you change:
scn = Scene(filenames = [file_nam],:sreader="modis_l1b")
to:
scn = Scene([file_nam],reader="modis_l1b")
Also note that you have a rogue :s
in the above command. If this still doesn't work, can you check that you have the conda HDF libraries installed (pyhdf
from memory).
I downloaded the MODIS files with wget from LAADS... so the first option, which I already checked (corrupted file), shouldn't be the problem...
Then I get: TypeError: __init__() got an unexpected keyword argument 'sreader'
@SamothKoc That's still a typo. It should be reader, not sreader.
Finally it works :D the key problem was a missing library ... the debugger is great
Edit: Is there an (easy) way with satpy to combine multiple MODIS datasets? I want to plot the complete track during one day over my region.
Edit 2: I tried this:
def extract_date(file_name):
    match = re.search(r"\.A(\d{7})\.", file_name)
    return match.group(1) if match else None

def group_files_by_date(file_paths):
    """
    Group a list of file paths by the date in the file names.

    :param file_paths: A list of file paths.
    :return: A dictionary whose keys are dates and whose values are lists of file paths.
    """
    grouped_files = defaultdict(list)
    for file_path in file_paths:
        date = extract_date(file_path)
        if date:
            grouped_files[date].append(file_path)
    return grouped_files
print('Start Plotting,... whatever.')
file_pattern = os.path.join(modis_folder, '*.hdf')
file_names = glob.glob(file_pattern)
grouped_files = group_files_by_date(file_names)
for date, files in grouped_files.items():
    print(f"Processing files for date: {date}")
    scn = Scene(filenames=files, reader='modis_l1b')
    composite_id = ['natural_color']
    scn.load(composite_id, upper_right_corner="NE")
and then:
area_id = "mercator"
description = "Mercator projection"
proj_id = "mercator"
proj_dict = {"proj": "merc", "lon_0": 0}  # longitude of the satellite
width = 2000   # number of pixels in x direction
height = 2000  # number of pixels in y direction
area_extent = (-5570248.477339261, -5567248.074173444, 5567248.074173444, 5570248.477339261)
area_def = AreaDefinition(area_id, description, proj_id, proj_dict, width, height, area_extent)
# Resample the data onto the defined target area
scn_resample_nc = scn.resample(area_def)
image = get_enhanced_image(scn_resample_nc["natural_color"]).data.data.compute().transpose(1, 2, 0)
crs = scn_resample_nc["natural_color"].attrs["area"].to_cartopy_crs()
# Initiate a subplot and axes with the CRS information previously defined
fig = plt.figure(figsize=(16, 10))
ax = fig.add_subplot(1, 1, 1, projection=crs)
ax.set_facecolor('white')
# Add coastline features to the plot
ax.coastlines(resolution="10m", color="white")
# Define a grid to be added to the plot
gl = ax.gridlines(draw_labels=True, linestyle='--', xlocs=range(-20,20,5), ylocs=range(0,50,5))
gl.top_labels=False
gl.right_labels=False
gl.xformatter=LONGITUDE_FORMATTER
gl.yformatter=LATITUDE_FORMATTER
gl.xlabel_style={'size':14}
gl.ylabel_style={'size':14}
countries = cfeature.NaturalEarthFeature(
category='cultural',
name='admin_0_countries',
scale='50m',
facecolor='none',
edgecolor='white'
)
ax.add_feature(countries)
ax.imshow(image, transform=crs, extent=crs.bounds, origin="upper")
time_tit = scn['natural_color'].attrs["start_time"].strftime("%Y-%m-%d %H:%M")
plt.title("natural_color by MODIS at " + time_tit, fontsize=20, pad=20.0)
datetime_str_modified = time_tit.replace(" ", "_").replace(":", "_")
print('Saving time: ', datetime_str_modified)
plt.savefig(save_folder + 'MODIS_truecolor_plot' + datetime_str_modified + '.png', dpi=300, bbox_inches='tight', transparent=True)
and get a complete grey image :D
Passing multiple granules to the Scene object should merge them together just fine. Doing them per-orbit (or half-orbit or even smaller) would be a good idea just for processing times and avoiding resampling artifacts.
Does your code produce an expected image for a single set of files (a single granule)? An all grey image sounds like your input data does not overlap/cover your area that you are resampling to.
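One way to split a day's worth of granules into contiguous passes, so each Scene only gets adjacent granules, is to sort by the acquisition time encoded in the MODIS filename (`.AYYYYDDD.HHMM.`) and start a new group whenever the gap between granules exceeds the nominal 5-minute granule length. This is just a sketch, not satpy API; the helper names, the filename pattern, and the 5-minute threshold are my assumptions:

```python
import re
from datetime import datetime, timedelta

# e.g. ".A2017131.1325." -> day-of-year 131 of 2017, 13:25 UTC (assumed pattern)
GRANULE_RE = re.compile(r"\.A(\d{7})\.(\d{4})\.")

def granule_time(file_name):
    """Parse the acquisition time from a MODIS L1B filename."""
    match = GRANULE_RE.search(file_name)
    if not match:
        raise ValueError(f"no MODIS timestamp in {file_name!r}")
    return datetime.strptime(match.group(1) + match.group(2), "%Y%j%H%M")

def group_contiguous(file_names, max_gap=timedelta(minutes=5)):
    """Group files into lists of time-contiguous granules."""
    ordered = sorted(file_names, key=granule_time)
    groups = []
    for name in ordered:
        if groups and granule_time(name) - granule_time(groups[-1][-1]) <= max_gap:
            groups[-1].append(name)  # continues the current pass
        else:
            groups.append([name])    # gap too large: start a new pass
    return groups
```

Each resulting group could then be fed to its own `Scene(filenames=group, reader="modis_l1b")` and resampled/plotted separately.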
Hi, it works with a single image :D as I said ...
x_size = 3089
y_size = 3320
area_extent = (W, S, E, N)  # (lower-left x, lower-left y, upper-right x, upper-right y)
projection = '+proj=laea +lat_0=37.5 +lon_0=-122.0 +ellps=WGS84'
description = "area"
proj_id = 'laea_37.5_-122.0'
areadef = get_area_def(area_id, description, proj_id, projection,x_size, y_size, area_extent)
but if I have a list of HDF files... that isn't the way, because I'd have to do it manually
If you have multiple time steps, you should process them sequentially.
and if I only know the file name... how can I get the coordinates to crop to? Currently I have a big ? in my brain :D
There is a coord2area_def.py, but even then I need the coordinates beforehand.
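If the region's corner coordinates are known in lon/lat, the `area_extent` in projection metres can be computed by hand. For a spherical Mercator projection like the `{"proj": "merc", "lon_0": 0}` used above, the forward mapping is x = R·λ and y = R·ln(tan(π/4 + φ/2)). This is a minimal sketch under that assumption; note it uses a sphere of radius 6378137 m, whereas PROJ's `merc` defaults to the WGS84 ellipsoid, so the values will differ slightly from what pyresample/PROJ computes:

```python
import math

R = 6378137.0  # sphere radius in metres (assumption; PROJ defaults to the WGS84 ellipsoid)

def merc_xy(lon_deg, lat_deg):
    """Spherical Mercator forward projection with lon_0 = 0."""
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

def extent_from_corners(lon_min, lat_min, lon_max, lat_max):
    """Build an area_extent tuple (llx, lly, urx, ury) in projection metres."""
    llx, lly = merc_xy(lon_min, lat_min)
    urx, ury = merc_xy(lon_max, lat_max)
    return (llx, lly, urx, ury)
```

Alternatively, if your pyresample version provides it, `pyresample.create_area_def` can take the extent directly in degrees (via its `units` argument), which avoids the manual conversion entirely.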
I'm a little lost now. What Martin said is true and is what I was trying to say. If you have multiple orbits (multiple separate passes of the satellite) then those should really be processed separately (separate Scene objects). If they are contiguous/connected granules then that is probably fine for a couple MODIS granules to pass all those files together...although now that I say that I'm not sure I've actually ever done that.
but if i have a list of hdf... that isn't the way. because i have to do it manually
and if i only know the name... so i can crop the coordinate?
Have to do what manually? Know the "name" of what?
Done. :D It was easier than expected.
area_id = "Points_test"
description = "Points and surroundings in Mercator projection"
proj_id = "Points_test"
proj_dict = {"proj": "merc", "lat_ts": 0, 'lon_0': 0}  # adjust the projection parameters
width = 800
height = 800
llx = -50E5  # western x coordinate
lly = -0E5   # southern y coordinate
urx = 18E5   # eastern x coordinate
ury = 47E5   # northern y coordinate
area_extent = (llx, lly, urx, ury)
area_def_westafrika = geometry.AreaDefinition(area_id, description, proj_id, proj_dict, width, height, area_extent)
mscn = MultiScene(scenes)
mscn.load(['natural_color'],resolution=1000)
mscn_res = mscn.resample(area_def_westafrika)
blended_scene = mscn_res.blend()
blended_scene.show("natural_color")
Glad you got something working.
Just for reference, Dave, I have often processed multiple contiguous modis granules, up to half an orbit, and it works fine.
Describe the bug
Hi, I want to plot MODIS for one overpass during one day and get the following error:
Scene(filenames = [file_nam], reader="modis_l1b")
Don't know how to open the following files: {'/home/MODIS_DATA/MOD021KM.A2017131.1325.061.2017314123114.hdf'}
Actual results

In [57]: Scene(filenames = [file_nam], reader="modis_l1b")
Don't know how to open the following files: {'/home/MODIS_DATA/MOD021KM.A2017131.1325.061.2017314123114.hdf'}

ValueError                                Traceback (most recent call last)
Cell In[57], line 1
----> 1 Scene(filenames = [file_nam], reader="modis_l1b")

File /shared/apps/gcc-11.4.0/miniconda/3.11/envs/satpy_env/lib/python3.8/site-packages/satpy/scene.py:133, in Scene.__init__(self, filenames, reader, filter_parameters, reader_kwargs)
    130 if filenames:
    131     filenames = convert_remote_files_to_fsspec(filenames, storage_options)
--> 133 self._readers = self._create_reader_instances(filenames=filenames,
    134                                               reader=reader,
    135                                               reader_kwargs=cleaned_reader_kwargs)
    136 self._datasets = DatasetDict()
    137 self._wishlist = set()

File /shared/apps/gcc-11.4.0/miniconda/3.11/envs/satpy_env/lib/python3.8/site-packages/satpy/scene.py:154, in Scene._create_reader_instances(self, filenames, reader, reader_kwargs)
    149 def _create_reader_instances(self,
    150                              filenames=None,
    151                              reader=None,
    152                              reader_kwargs=None):
    153     """Find readers and return their instances."""
--> 154     return load_readers(filenames=filenames,
    155                         reader=reader,
    156                         reader_kwargs=reader_kwargs)

File /shared/apps/gcc-11.4.0/miniconda/3.11/envs/satpy_env/lib/python3.8/site-packages/satpy/readers/__init__.py:584, in load_readers(filenames, reader, reader_kwargs)
    581     break
    583 _check_remaining_files(remaining_filenames)
--> 584 _check_reader_instances(reader_instances)
    585 return reader_instances

File /shared/apps/gcc-11.4.0/miniconda/3.11/envs/satpy_env/lib/python3.8/site-packages/satpy/readers/__init__.py:623, in _check_reader_instances(reader_instances)
    621 def _check_reader_instances(reader_instances):
    622     if not reader_instances:
--> 623         raise ValueError("No supported files found")
    624     if not any(list(r.available_dataset_ids) for r in reader_instances.values()):
    625         raise ValueError("No dataset could be loaded. Either missing "
    626                          "requirements (such as Epilog, Prolog) or none of the "
    627                          "provided files match the filter parameters.")

ValueError: No supported files found
In [58]: filenames
NameError                                 Traceback (most recent call last)
Cell In[58], line 1
----> 1 filenames

NameError: name 'filenames' is not defined
In [59]: Scene(filenames = [file_nam],reader="modis_l1b")
Don't know how to open the following files: {'/home/MODIS_DATA/MOD021KM.A2017131.1325.061.2017314123114.hdf'}

ValueError                                Traceback (most recent call last)
Cell In[59], line 1
----> 1 Scene(filenames = [file_nam],reader="modis_l1b")

File /shared/apps/gcc-11.4.0/miniconda/3.11/envs/satpy_env/lib/python3.8/site-packages/satpy/scene.py:133, in Scene.__init__(self, filenames, reader, filter_parameters, reader_kwargs)
    130 if filenames:
    131     filenames = convert_remote_files_to_fsspec(filenames, storage_options)
--> 133 self._readers = self._create_reader_instances(filenames=filenames,
    134                                               reader=reader,
    135                                               reader_kwargs=cleaned_reader_kwargs)
    136 self._datasets = DatasetDict()
    137 self._wishlist = set()

File /shared/apps/gcc-11.4.0/miniconda/3.11/envs/satpy_env/lib/python3.8/site-packages/satpy/scene.py:154, in Scene._create_reader_instances(self, filenames, reader, reader_kwargs)
    149 def _create_reader_instances(self,
    150                              filenames=None,
    151                              reader=None,
    152                              reader_kwargs=None):
    153     """Find readers and return their instances."""
--> 154     return load_readers(filenames=filenames,
    155                         reader=reader,
    156                         reader_kwargs=reader_kwargs)

File /shared/apps/gcc-11.4.0/miniconda/3.11/envs/satpy_env/lib/python3.8/site-packages/satpy/readers/__init__.py:584, in load_readers(filenames, reader, reader_kwargs)
    581     break
    583 _check_remaining_files(remaining_filenames)
--> 584 _check_reader_instances(reader_instances)
    585 return reader_instances

File /shared/apps/gcc-11.4.0/miniconda/3.11/envs/satpy_env/lib/python3.8/site-packages/satpy/readers/__init__.py:623, in _check_reader_instances(reader_instances)
    621 def _check_reader_instances(reader_instances):
    622     if not reader_instances:
--> 623         raise ValueError("No supported files found")
    624     if not any(list(r.available_dataset_ids) for r in reader_instances.values()):
    625         raise ValueError("No dataset could be loaded. Either missing "
    626                          "requirements (such as Epilog, Prolog) or none of the "
    627                          "provided files match the filter parameters.")