msw17002 closed this 4 weeks ago
Can you share the code that gives you az and ref_range?
If this came from our examples, those arrays are intentionally bigger than the data, so that you are working with the coordinates of the corners of the quadrilaterals that bound the areas holding the values data[i, j]; this is what truly needs to be fed to pcolormesh.
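A minimal sketch of that corner convention, using toy shapes rather than real radar data:

```python
import numpy as np

# pcolormesh draws one quadrilateral per data value, so for data of shape
# (ny, nx) it wants coordinate arrays describing the (ny + 1, nx + 1) corners.
ny, nx = 3, 4
data = np.arange(ny * nx).reshape(ny, nx)

# Edge coordinates: one more entry than the data along each axis.
x_edges = np.arange(nx + 1)   # shape (5,)
y_edges = np.arange(ny + 1)   # shape (4,)

print(data.shape, x_edges.shape, y_edges.shape)  # (3, 4) (5,) (4,)
```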
I added my sample script to this reply... I copied it from one of the examples (https://unidata.github.io/MetPy/latest/examples/formats/NEXRAD_Level_2_File.html#sphx-glr-examples-formats-nexrad-level-2-file-py). If the radar file doesn't upload to this reply, you can download it via this link: https://noaa-nexrad-level2.s3.amazonaws.com/2021/12/21/KTBW/KTBW20211221_104123_V06.
This is the important part you're asking for:
#===radar file
radar = Level2File(inn)
#---retrieve base fields
cent_lon = radar.sweeps[0][0][1].lon
cent_lat = radar.sweeps[0][0][1].lat
#-Pull data out of the file
sweep = 0
#-First item in ray is header, which has azimuth angle
az = np.array([ray[0].az_angle for ray in radar.sweeps[sweep]])
diff = np.diff(az)
crossed = diff < -180
diff[crossed] += 360.
avg_spacing = diff.mean()
#-Convert mid-point to edge
az = (az[:-1] + az[1:]) / 2
az[crossed] += 180.
#-Concatenate with overall start and end of data we calculate using the average spacing
az = np.concatenate(([az[0] - avg_spacing], az, [az[-1] + avg_spacing]))
az = units.Quantity(az, 'degrees')
#-5th item is a dict mapping a var name (byte string) to a tuple
# of (header, data array)
ref_hdr = radar.sweeps[sweep][0][4][b'REF'][0]
ref_range = (np.arange(ref_hdr.num_gates + 1) - 0.5) * ref_hdr.gate_width + ref_hdr.first_gate
ref_range = units.Quantity(ref_range, 'kilometers')
ref = np.array([ray[4][b'REF'][1] for ray in radar.sweeps[sweep]])
#-Turn into an array, then mask
data = np.ma.array(ref)
data[np.isnan(data)] = np.ma.masked
#-Convert az,range to x,y
xlocs, ylocs = azimuth_range_to_lat_lon(az, ref_range, cent_lon, cent_lat)
I later attempt to determine the base reflectivity at a given location (qlat, qlon):
#-determine smallest distance
row = np.nanargmin(((xlocs.flatten()-qlon)**2+(ylocs.flatten()-qlat)**2)**0.5)
val = data.flatten()[row]
if np.ma.isMaskedArray(val): val = str(np.nan)
else: val = str(int(val))
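As an aside, the same nearest-cell lookup can be sketched with np.unravel_index, which recovers the (ray, gate) pair directly; the arrays below are toy stand-ins for xlocs/ylocs/data, assuming all three share one shape:

```python
import numpy as np

# Toy stand-ins for xlocs/ylocs/data sharing a single (n_rays, n_gates) shape.
xlocs = np.array([[-82.5, -82.4], [-82.5, -82.4]])
ylocs = np.array([[ 27.6,  27.6], [ 27.7,  27.7]])
data = np.ma.array([[10.0, 20.0], [30.0, 40.0]])
qlon, qlat = -82.41, 27.69

# Squared distance is enough for argmin; unravel_index maps the flat
# index back to (ray, gate) so data can be indexed without flattening.
flat = np.argmin((xlocs - qlon) ** 2 + (ylocs - qlat) ** 2)
ray, gate = np.unravel_index(flat, data.shape)
print(ray, gate, data[ray, gate])  # 1 1 40.0
```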
I'd be satisfied with the above if the dimensions of xlocs==ylocs==data.
To accomplish what you want, you would need to change the range calculation (which intentionally adds 1 to the size) to:
ref_range = np.arange(ref_hdr.num_gates) * ref_hdr.gate_width + ref_hdr.first_gate
ref_range = units.Quantity(ref_range, 'kilometers')
Then also avoid modifying the collection of azimuths, keeping only this line:
az = np.array([ray[0].az_angle for ray in radar.sweeps[sweep]])
and eliminate all of the other lines that modify az.
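With those two changes, the coordinate arrays come out the same shape as the data. A quick sketch with made-up gate geometry (the numbers below are stand-ins, not values from this file):

```python
import numpy as np

# Hypothetical sweep geometry standing in for the Level2File header fields.
n_rays, num_gates = 720, 1832
gate_width, first_gate = 0.25, 2.125  # km; made-up values

az = np.linspace(0, 360, n_rays, endpoint=False)            # one azimuth per ray
ref_range = np.arange(num_gates) * gate_width + first_gate  # gate centers

# azimuth_range_to_lat_lon broadcasts these into 2-D arrays, so the
# resulting xlocs/ylocs will have shape (n_rays, num_gates) -- the same
# shape as the reflectivity data, with no +1 in either dimension.
print(az.shape, ref_range.shape)  # (720,) (1832,)
```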
Thank you for your help, dopplershift!
Here's my code for anyone seeking help:
radar = Level2File(inn)
#=================================================================================================================
#---retrieve base fields
cent_lon = radar.sweeps[0][0][1].lon
cent_lat = radar.sweeps[0][0][1].lat
#-Pull data out of the file
sweep = 0
#-First item in ray is header, which has azimuth angle
az = np.array([ray[0].az_angle for ray in radar.sweeps[sweep]])
az = units.Quantity(az, 'degrees')
#-5th item is a dict mapping a var name (byte string) to a tuple
# of (header, data array)
ref_hdr = radar.sweeps[sweep][0][4][b'REF'][0]
ref_range = np.arange(ref_hdr.num_gates) * ref_hdr.gate_width + ref_hdr.first_gate
ref_range = units.Quantity(ref_range, 'kilometers')
#-Convert az,range to x,y
xlocs, ylocs = azimuth_range_to_lat_lon(az, ref_range, cent_lon, cent_lat)
#=================================================================================================================
#-reflectivity field
ref = np.array([ray[4][b'REF'][1] for ray in radar.sweeps[sweep]])
#-Turn into an array, then mask
data = np.ma.array(ref)
data[np.isnan(data)] = np.ma.masked
#---convert into a pandas dataframe
pairs = pd.DataFrame({'Lat': ylocs.flatten(), 'Lon': xlocs.flatten(), 'reflectivity': data.flatten()})
pairs = geopandas.GeoDataFrame(pairs, geometry=geopandas.points_from_xy(pairs.Lon, pairs.Lat), crs="EPSG:4326")
pairs = pairs.clip(quer.to_crs(pairs.crs)).reset_index(drop=True)
pairs.to_csv("testing.csv",index=False)
#-determine value at loi
row = np.nanargmin(((pairs['Lon']-qlon)**2+(pairs['Lat']-qlat)**2)**0.5)
#-spatial statistics
if all(np.isnan(pairs['reflectivity'])):
    val, p50N, maxv, mew, std = "U/A", "U/A", "U/A", "U/A", "U/A"
else:
    #-ncells >=50
    p50N = str(np.sum(pairs['reflectivity'] >= 50))
    #-max value
    maxv = str(np.nanmax(pairs['reflectivity']))
    #-mean
    mew = str(np.round(np.nanmean(pairs['reflectivity']), 1))
    #-std
    std = str(np.round(np.nanstd(pairs['reflectivity']), 1))
    #-value at LOI
    val = pairs['reflectivity'][row]
    if np.isnan(val):
        val = "U/A"
    else:
        val = str(int(val))
#-calculate distance to point
dis = str(np.round(geopy.distance.geodesic((qlat,qlon), (pairs['Lat'][row],pairs['Lon'][row])).miles,3))
#---get time
dt = datetime.datetime.strptime(os.path.basename(inn)[4:-4],"%Y%m%d_%H%M%S")
title = name[0:4]+": "+dt.strftime("%Y-%m-%dT%H:%M:%SZ")+"\n"+"Base Reflectivity"
#===finally generate image
fig = plt.figure(figsize=(8,6))
ax = plt.subplot(111, projection=mapi.crs)
#-zoom to loi
ax.set_extent([W,E,S,N])
#-set scale of overlay
#ax.add_image(mapi, int(scale)) # add OSM with zoom specification
#===plot analysis line
plt.plot(x , y , color='k' , linestyle='dashed', linewidth=1, zorder=5, transform=ccrs.PlateCarree())
plt.plot(x_q, y_q, color='olive', linestyle='solid' , linewidth=1, zorder=5, transform=ccrs.PlateCarree())
#===add table text
table_0 = "Spatial Statistics (r;"+str(perim)+" miles)"
table_m0 = "LOI:\nDis. N-cell:\n"
table_m1 = val+" dBZ\n"+dis+" miles\n"
table_p0 = "\n\nN ≥50 (r;"+str(perim)+"):\nMax (r;"+str(perim)+"):\nµ (r;"+str(perim)+"):\nσ (r;"+str(perim)+"):"
table_p1 = "\n\n"+p50N+"\n"+maxv+" dBZ\n"+mew+" dBZ\n"+std+" dBZ"
tables = [[table_m0, "m", 0],
          [table_m1, "m", 1],
          [table_p0, "olive", 0],
          [table_p1, "olive", 1]]
for table in tables:
    if table[2] == 0:
        ax.text(0.01, 0.99, table[0], c=table[1], va="top", ha="left", zorder=5, transform=ax.transAxes, path_effects=[pe.withStroke(linewidth=1, foreground="w")])
    else:
        ax.text(0.20, 0.99, table[0], c=table[1], va="top", ha="left", zorder=5, transform=ax.transAxes, path_effects=[pe.withStroke(linewidth=1, foreground="w")])
tables = [["Image Extent:\n", "k", 0],
          ["r; 15 miles\n", "k", 1],
          ["\nSpatial Statistics:", "olive", 0],
          ["\nr; " + str(perim) + " miles", "olive", 1]]
for table in tables:
    if table[2] == 0:
        ax.text(0.01, 0.01, table[0], c=table[1], va="bottom", ha="left", zorder=5, transform=ax.transAxes, path_effects=[pe.withStroke(linewidth=1, foreground="w")])
    else:
        ax.text(0.275, 0.01, table[0], c=table[1], va="bottom", ha="left", zorder=5, transform=ax.transAxes, path_effects=[pe.withStroke(linewidth=1, foreground="w")])
#===shaded image, shapefiles
img = ax.pcolormesh(xlocs, ylocs, data, cmap=cmap, norm=norm, transform=ccrs.PlateCarree())  #-keep the mesh for the colorbar
#===plot radar site
lat_0 = cent_lat
lon_0 = cent_lon
ax.text(0.99,0.01,name[0:4],c='red',va="bottom",ha="right",fontweight='bold',zorder=5,transform=ax.transAxes,path_effects=[pe.withStroke(linewidth=1, foreground="w")])
plt.scatter(lon_0,lat_0,s=50,c='r',ec='k',marker='o',linewidths=1,zorder=5,transform=ccrs.PlateCarree())
#===plot query point
ax.text(0.99,0.99,"LOI <"+str(qlat)+","+str(qlon)+">",c='magenta',va="top",ha="right",fontweight='bold',zorder=5,transform=ax.transAxes,path_effects=[pe.withStroke(linewidth=1,foreground="w")])
plt.scatter(float(qlon),float(qlat),s=25,c='magenta',ec='k',marker='o',linewidths=1,zorder=5,transform=ccrs.PlateCarree())
#===plot title
ax.set_title(title)
#-colorbar
make_colorbar(ax,img)
#===gridlines
gl = ax.gridlines(crs=ccrs.PlateCarree(),draw_labels=True,x_inline=False,y_inline=False,linewidth=1, color='gray', alpha=0.5, linestyle='--')
gl.top_labels = False
gl.right_labels = False
gl.xformatter = LONGITUDE_FORMATTER
gl.yformatter = LATITUDE_FORMATTER
gl.ylabel_style = {'size': 10, 'color': 'blue'} #
gl.xlabel_style = {'size': 10, 'color': 'red', 'rotation': 0}
#===adjust layout then save image
plt.tight_layout()
plt.savefig(png,dpi=100,transparent=True)
plt.close(fig)
##===overlay plots
##-background
#img1 = Image.open(path+"satellite.png")
##-overlay
#img2 = Image.open(png)
##-combine then save
#img1.paste(img2, (0,0), mask = img2)
#img1.save(png)
Closing as resolved. Feel free to re-open if that's not the case.
I'm trying to pair NEXRAD coordinates into a pandas (then geopandas) dataframe.
To do this, I:
1) Read a NEXRAD Level II file using MetPy: radar = Level2File(inn)
2) Calculate the lat/lon coordinates using MetPy (at sweep 0): xlocs, ylocs = azimuth_range_to_lat_lon(az, ref_range, cent_lon, cent_lat)
3) Determine base reflectivity (sweep = 0): the calculation is long, but base reflectivity == data
4) Flatten xlocs, ylocs, and data into a pandas dataframe: pairs = pd.DataFrame({'Lat': ylocs.flatten(), 'Lon': xlocs.flatten(), 'reflectivity': data.flatten()})
The problem is, data.flatten() does not have the same dimensionality as ylocs.flatten()/xlocs.flatten(). For my case, the non-flattened dimensions are; ylocs = (721, 1833), xlocs = (721, 1833), and data = (720, 1832).
Bottom line, is there a way to pair ylocs/xlocs onto data's grid so that I can index the coordinates at cell data[i, j]? Since pcolormesh was able to plot it, I'm sure there's a way. I just need to find it.
Thank you!
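For anyone landing here with the same mismatch: one way to pair corner coordinates with the data grid (a sketch, not a MetPy call) is to average the four corners of each cell, which yields a center array matching data's shape:

```python
import numpy as np

# Given corner arrays one larger in each dimension than the data
# (e.g. (721, 1833) corners for (720, 1832) data), averaging the four
# corners of each cell yields per-cell center coordinates.
corners = np.arange(4 * 5, dtype=float).reshape(4, 5)  # toy (4, 5) corner grid

centers = 0.25 * (corners[:-1, :-1] + corners[:-1, 1:]
                  + corners[1:, :-1] + corners[1:, 1:])
print(centers.shape)  # (3, 4)
```

Applied to xlocs and ylocs, this produces (720, 1832) center arrays that flatten to the same length as data.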