Closed. zhanglan009 closed this issue 5 months ago.
It's briefly explained here: https://py-eddy-tracker.readthedocs.io/en/latest/grid_identification.html#shell-bash-command
Dear AntSimi, thank you very much for your response. However, I am still a bit confused about how to set the indexes. The examples provided seem to set parameters for a single day or layer, e.g. 20190223 with time=0 and date=datetime(2019, 2, 23), and I have searched through your other examples but could not find information on setting indexes for multiple days. If my data spans 20000101 to 20201231, a 20-year period, how can I set the indexes, run the identification, and store the results in one NetCDF file? Could you provide an example? Thank you very much. Additionally, does time=0, 20, 30 correspond to the number of days since the start minus one? Lan
Could you share a ncdump of your dataset?
Yes! 20 years of data is too large, so I uploaded 15 days of data (I also emailed it to you, since the upload size here is still limited). The data is downloaded from https://data.marine.copernicus.eu/product/SEALEVEL_GLO_PHY_L4_NRT_008_046/.
Re: [AntSimi/py-eddy-tracker] batch identification of eddies over multiple days from a single netCDF file (Issue #229)

Cloud attachment (via NetEase Mail Master): cmems_obs-sl_glo_phy-ssh_my_allsat-l4-duacs-0.25deg_P1M-m_sla_179.88W-179.88E_89.88S-89.88N_2000-01-01-2023-05-01.nc (1111.4 MB, expires 2024-03-15 17:15)
I just want the header of the file, with ncdump -h.
OK!
netcdf cmems_obs-sl_glo_phy-ssh_nrt_allsat-l4-duacs-0.25deg_P1D_1709197734343 {
dimensions:
latitude = 720 ;
longitude = 1440 ;
time = 16 ;
variables:
float adt(time, latitude, longitude) ;
string adt:units = "m" ;
adt:_FillValue = NaN ;
string adt:standard_name = "sea_surface_height_above_geoid" ;
string adt:long_name = "Absolute dynamic topography" ;
float err_sla(time, latitude, longitude) ;
string err_sla:units = "m" ;
err_sla:_FillValue = NaN ;
string err_sla:standard_name = "sea_surface_height_above_sea_level standard_error" ;
string err_sla:long_name = "Formal mapping error" ;
float err_ugosa(time, latitude, longitude) ;
string err_ugosa:units = "m/s" ;
err_ugosa:_FillValue = NaN ;
string err_ugosa:standard_name = "surface_geostrophic_eastward_sea_water_velocity_assuming_sea_level_for_geoid standard_error" ;
string err_ugosa:long_name = "Formal mapping error on zonal geostrophic velocity anomalies" ;
float err_vgosa(time, latitude, longitude) ;
string err_vgosa:units = "m/s" ;
err_vgosa:_FillValue = NaN ;
string err_vgosa:standard_name = "surface_geostrophic_northward_sea_water_velocity_assuming_sea_level_for_geoid standard_error" ;
string err_vgosa:long_name = "Formal mapping error on meridional geostrophic velocity anomalies" ;
float flag_ice(time, latitude, longitude) ;
string flag_ice:units = "" ;
flag_ice:_FillValue = NaN ;
string flag_ice:standard_name = "status_flag" ;
string flag_ice:long_name = "Ice Flag for a 15% criterion of ice concentration" ;
float latitude(latitude) ;
string latitude:standard_name = "latitude" ;
string latitude:long_name = "Latitude" ;
string latitude:units = "degrees_north" ;
string latitude:unit_long = "Degrees North" ;
string latitude:axis = "Y" ;
latitude:valid_min = -89.875f ;
latitude:valid_max = 89.875f ;
float longitude(longitude) ;
string longitude:standard_name = "longitude" ;
string longitude:long_name = "Longitude" ;
string longitude:units = "degrees_east" ;
string longitude:unit_long = "Degrees East" ;
string longitude:axis = "X" ;
float sla(time, latitude, longitude) ;
string sla:units = "m" ;
sla:_FillValue = NaN ;
string sla:standard_name = "sea_surface_height_above_sea_level" ;
string sla:long_name = "Sea level anomaly" ;
double time(time) ;
string time:standard_name = "time" ;
string time:long_name = "Time" ;
string time:units = "seconds since 1970-01-01 00:00:00" ;
string time:calendar = "gregorian" ;
string time:axis = "T" ;
float ugos(time, latitude, longitude) ;
string ugos:units = "m/s" ;
ugos:_FillValue = NaN ;
string ugos:standard_name = "surface_geostrophic_eastward_sea_water_velocity" ;
string ugos:long_name = "Absolute geostrophic velocity: zonal component" ;
float ugosa(time, latitude, longitude) ;
string ugosa:units = "m/s" ;
ugosa:_FillValue = NaN ;
string ugosa:standard_name = "surface_geostrophic_eastward_sea_water_velocity_assuming_sea_level_for_geoid" ;
string ugosa:long_name = "Geostrophic velocity anomalies: zonal component" ;
float vgos(time, latitude, longitude) ;
string vgos:units = "m/s" ;
vgos:_FillValue = NaN ;
string vgos:standard_name = "surface_geostrophic_northward_sea_water_velocity" ;
string vgos:long_name = "Absolute geostrophic velocity: meridian component" ;
float vgosa(time, latitude, longitude) ;
string vgosa:units = "m/s" ;
vgosa:_FillValue = NaN ;
string vgosa:standard_name = "surface_geostrophic_northward_sea_water_velocity_assuming_sea_level_for_geoid" ;
string vgosa:long_name = "Geostrophic velocity anomalies: meridian component" ;
// global attributes:
string :Conventions = "CF-1.11" ;
string :title = "NRT merged all satellites Global Ocean Gridded SSALTO/DUACS Sea Surface Height L4 product and derived variables" ;
string :institution = "CLS, CNES" ;
string :source = "Altimetry measurements" ;
string :history = "2023-11-24 00:53:07Z: Creation" ;
string :contact = "servicedesk.cmems@mercator-ocean.eu" ;
string :references = "http://marine.copernicus.eu" ;
string :comment = "Sea Surface Height measured by Altimetry and derived variables" ;
string :subset\:source = "ARCO data downloaded from the Marine Data Store using the MyOcean Data Portal" ;
string :subset\:productId = "SEALEVEL_GLO_PHY_L4_NRT_008_046" ;
string :subset\:datasetId = "cmems_obs-sl_glo_phy-ssh_nrt_allsat-l4-duacs-0.25deg_P1D_202311" ;
string :subset\:date = "2024-02-29T09:08:54.343Z" ;
}
You must do an identification for each day of interest, so you could use --indexs time=i to do an identification on each time step of your netCDF, where i is an int.
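To illustrate the advice above, here is a minimal sketch of a Python loop that runs one identification per time step and writes one Anticyclonic and one Cyclonic file per day. It assumes the py-eddy-tracker API shown in the grid_identification tutorial (RegularGridDataset, bessel_high_filter, eddy_identification, to_netcdf); the filter wavelength and step values are the tutorial's example values, not recommendations, and the variable names (adt, ugos, vgos) come from the ncdump header above. The helpers seconds_to_date and output_names are hypothetical names introduced here for clarity.

```python
from datetime import datetime, timedelta

# Matches the file's time:units = "seconds since 1970-01-01 00:00:00"
EPOCH = datetime(1970, 1, 1)

def seconds_to_date(seconds):
    """Convert a value of the file's time axis to a datetime."""
    return EPOCH + timedelta(seconds=float(seconds))

def output_names(date):
    """Per-day output file names for the two polarities."""
    tag = date.strftime("%Y%m%d")
    return "Anticyclonic_%s.nc" % tag, "Cyclonic_%s.nc" % tag

def identify_all(grid_name):
    # Imports kept local so the pure helpers above work without the library.
    from netCDF4 import Dataset
    from py_eddy_tracker.dataset.grid import RegularGridDataset

    with Dataset(grid_name) as h:
        times = h.variables["time"][:]
    for i, t in enumerate(times):
        date = seconds_to_date(t)
        # Select one time step with indexs=dict(time=i), as suggested above.
        g = RegularGridDataset(grid_name, "longitude", "latitude",
                               indexs=dict(time=i))
        g.bessel_high_filter("adt", 500)  # wavelength in km (tutorial value)
        a, c = g.eddy_identification("adt", "ugos", "vgos", date, 0.002)
        name_a, name_c = output_names(date)
        with Dataset(name_a, "w") as h:
            a.to_netcdf(h)
        with Dataset(name_c, "w") as h:
            c.to_netcdf(h)
```

The per-day files can then be fed to the tracking step; whether you keep one file per day or merge them afterwards depends on your workflow.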
Thank you very much!! I'll be back when I try it.
Thank you very much for your response. After adding the parameter indexs time=i, it seems that the three-dimensional data, including time, is read. However, when executing functions like g.display and g.bessel_high_filter, I get an IndexError: index exceeds dimension bounds. How can I get these functions to work?
# %%
# Load input grid; ADT is used to detect eddies
grid_name = "cmems_obs-sl_glo_phy-ssh_nrt_allsat-l4-duacs-0.25deg_P1D_1709197734343.nc"
g = RegularGridDataset(
    grid_name, "longitude", "latitude", nan_masking=True,
    # manual area subset
    indexs=dict(
        longitude=slice(1141, 1264),
        latitude=slice(337, 504),
        time=360,
    ),
)
ax = start_axes("SLA (m)")
m = g.display(ax, "sla", vmin=-0.15, vmax=0.15, cmap="RdBu_r")
update_axes(ax, m)
plt.show()
---------------------------------------------------------------------------
IndexError Traceback (most recent call last)
Cell In[7], line 12
4 g = RegularGridDataset(grid_name, "longitude","latitude", nan_masking=True,
5 #manual area subset
6 indexs=dict(longitude=slice(1141,1264),
7 latitude=slice(337,504),
8 time=360
9 ),)
11 ax = start_axes("SLA (m)")
---> 12 m = g.display(ax, "sla", vmin=-0.15, vmax=0.15, cmap="RdBu_r")
13 update_axes(ax, m)
14 plt.show()
File ~/miniconda3/envs/pyed/lib/python3.10/site-packages/py_eddy_tracker/dataset/grid.py:1912, in RegularGridDataset.display(self, ax, name, factor, ref, **kwargs)
1910 if "cmap" not in kwargs:
1911 kwargs["cmap"] = "coolwarm"
-> 1912 data = self.grid(name) if isinstance(name, str) else name
1913 if ref is None:
1914 x = self.x_bounds
File ~/miniconda3/envs/pyed/lib/python3.10/site-packages/py_eddy_tracker/dataset/grid.py:547, in GridDataset.grid(self, varname, indexs)
537 dims = h.variables[varname].dimensions
538 sl = [
539 indexs.get(
540 dim,
(...)
545 for dim in dims
546 ]
--> 547 self.vars[varname] = h.variables[varname][sl]
548 if len(self.x_dim) == 1:
549 i_x = where(array(dims) == self.x_dim)[0][0]
File src/netCDF4/_netCDF4.pyx:4406, in netCDF4._netCDF4.Variable.__getitem__()
File src/netCDF4/_netCDF4.pyx:5348, in netCDF4._netCDF4.Variable._get()
IndexError: index exceeds dimension bounds
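Note that the header above shows time = 16, so indexs=dict(time=360) points past the end of this 16-step file, which is exactly the "index exceeds dimension bounds" netCDF4 raises. A small sanity check before building indexs avoids this; check_time_index and n_time_steps below are hypothetical helpers, not part of py-eddy-tracker.

```python
def check_time_index(i, n_time):
    """Raise early, with a clear message, if i is outside [0, n_time)."""
    if not 0 <= i < n_time:
        raise IndexError(
            "time index %d exceeds dimension bounds [0, %d)" % (i, n_time)
        )
    return i

def n_time_steps(grid_name):
    """Length of the file's time dimension."""
    from netCDF4 import Dataset  # lazy import: only needed when reading a file
    with Dataset(grid_name) as h:
        return len(h.dimensions["time"])
```

For the 16-step file above, valid values are time=0 through time=15.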
I have another problem: when I use the ORCAGrid function from your issue #6 (https://github.com/AntSimi/py-eddy-tracker/issues/6) and import irregular grid data with a longitude range of [157, 324], after executing a, c = h.eddy_identification(*args, **kwargs), attempting to plot with a.display(ax, label='Anticyclonic', color='green', **kwargs_display) does not work. I only get a blank map between [-180, 180], and anything beyond 180 cannot be plotted. If I set ax.set_xlim(220, 260), the eddies are plotted, but the coastline does not show. How should I set the index, or modify the a.display function or another part? Thank you very much for your help!
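The symptom above is a longitude-convention mismatch: the grid spans [157, 324] (a 0–360 convention), while the basemap and coastline cover [-180, 180]. One hedged workaround, independent of py-eddy-tracker, is to wrap longitudes into a single convention before plotting; lon_to_180 and lon_to_360 are illustrative helper names introduced here.

```python
def lon_to_180(lon):
    """Wrap a longitude into [-180, 180), e.g. 324 -> -36."""
    return (lon + 180.0) % 360.0 - 180.0

def lon_to_360(lon):
    """Wrap a longitude into [0, 360), e.g. -36 -> 324."""
    return lon % 360.0
```

With eddy longitudes remapped via lon_to_180, the observations land on the same [-180, 180] axes as the coastline; alternatively, keep the 0–360 convention and draw a coastline that also spans [157, 324].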
Which matplotlib version did you use?
matplotlib version is 3.7.1
I have dealt with this problem. Thank you very much!
Dear Author,
Thank you very much for providing the identification program. However, I have encountered a challenge. While I can identify eddies on a single day, I am unsure about how to efficiently identify eddies over multiple days and store them in a single file for subsequent statistical analysis and tracking. Given that I need to identify eddies for approximately 20 years on a daily basis, running the program daily could be time-consuming.
Is there a program available for batch identification of eddies over multiple days from a single netCDF file (such as cmems_obs-sl_glo_phy-ssh_my_allsat-l4-duacs-0.25deg_P1M-m_sla_179.88W-179.88E_89.88S-89.88N_2000-01-01-2023-05-01), and storing the identified cyclones and anticyclones separately in a new netCDF file? I have reviewed all your FAQs but haven't found a straightforward solution, so I am reaching out to seek your guidance. I appreciate your valuable time.