ARM-DOE / pyart

The Python-ARM Radar Toolkit. A data model driven interactive toolkit for working with weather radar data.
https://arm-doe.github.io/pyart/

Velocity unfolding #119

Closed: kirknorth closed this issue 9 years ago

kirknorth commented 10 years ago

The images below show some poor velocity unfolding behaviour. @scollis, I'm not sure if this is the same problem you were running into with the X-bands; note that this case is from the C-band.

From my quick investigation, there does appear to be some relation between areas of beam blockage and/or extinction and poor velocity correction. I'm also showing the uncorrected velocity field for reference.

This issue was quite prevalent in the C-SAPR files that I processed for the May 20th, 2011 case: of the 69 files processed between 0500 and 1300 UTC, about 23% had these artifacts.

[Images: 20110520 104700, uncorrected and corrected velocity fields]

jjhelmus commented 10 years ago

@kirknorth I'm not seeing these issues if I dealias using the original fields in the file. Can you share your processing script? Are you using the noise corrected reflectivity as the refl field in the dealiasing? If so, I'm guessing that is the root cause of the problems, as the artefacts match up well with missing gates in that field.

I want to have a good look at the FourDD algorithm in the future to implement some improvements. If I had some good examples of problems to fix at the same time, that would be great! I don't have the time that this requires right now, but hopefully after the AMS meeting in February.

Here are my results; the interpolatedsonde file I'm using has a history attribute of: "created by user troyan on machine engineering at 5-Jan-2012,0:40:49, using $State: zebra-zeblib-4.23-0.el5 $"

import matplotlib
matplotlib.rcParams['axes.titlesize'] = 'small'
import matplotlib.pyplot as plt
import netCDF4
import pyart

SOND_NAME = 'sgpinterpolatedsondeC1.c1.20110510.000000.cdf'
RADAR_NAME = 'sur/20110520/105235.mdv'

# read in the data
radar = pyart.io.read_mdv(RADAR_NAME)

# find and extract sonde data
target = netCDF4.num2date(radar.time['data'][0], radar.time['units'])
interp_sounde = netCDF4.Dataset(SOND_NAME)
t = pyart.correct.find_time_in_interp_sonde(interp_sounde, target)
height, speed, direction = t

# perform dealiasing
dealias_data = pyart.correct.dealias_fourdd(radar, height * 1000.0, speed,
                                            direction, target)
radar.add_field('corrected_velocity', dealias_data)

# create plots of velocity
fig = plt.figure(figsize=(12, 15))
display = pyart.graph.RadarDisplay(radar)
for i, sweep_i in enumerate([0, 1, 3, 6, 10, 14]):
    ax = fig.add_subplot(3,2,i+1)
    display.plot('velocity', sweep_i, vmin=-20, vmax=20, ax=ax,
                 colorbar_label='m/s', fig=fig)
    display.set_limits((-120, 120), (-120, 120))
plt.tight_layout()
fig.savefig('velocity.png')

# plot corrected velocity
fig = plt.figure(figsize=(12, 15))
display = pyart.graph.RadarDisplay(radar)
for i, sweep_i in enumerate([0, 1, 3, 6, 10, 14]):
    ax = fig.add_subplot(3,2,i+1)
    display.plot('corrected_velocity', sweep_i, vmin=-20, vmax=20, ax=ax,
                 colorbar_label='m/s')
    display.set_limits((-120, 120), (-120, 120))
plt.tight_layout()
fig.savefig('corrected_velocity.png')

[Images: velocity.png, corrected_velocity.png]

kirknorth commented 10 years ago

@jjhelmus, let me reprocess this file again. However, the differences between my processing and yours are the following:

  1. I used the ARM merged sounding product sgpmergesonde1maceC1.c1.20110520.000000.cdf with history attribute "created by user dsmgr on machine emerald at 9-Jan-2012,3:01:51, using $State: zebra-zeblib-4.22-0.sol5_10 $". I notice that the sounding data you used was from May 10th, but the radar data is from May 20th.
  2. I had the keep_original keyword set to True. This shouldn't be the problem though.

And no, I did not use the noise corrected reflectivity field during dealiasing. I know this because I do that correction after calling dealias_fourdd, so it should be using the raw reflectivity data.

jjhelmus commented 10 years ago

@kirknorth Why was I using the May 10th sounding... I have no idea, I shouldn't have been. When I use the May 20th interpolatedsonde sgpinterpolatedsondeC1.c1.20110520.000000.cdf I get results similar to yours. I'll try the mergedsonde when the ARM Archive is back online.

I think this shows that something is wrong with the initial guess that Py-ART is providing to the FourDD algorithm, or that the algorithm is very unstable with regard to the initial guess. I'll look into it when I have time to really dig into the code. Thanks for pointing this out, it helps to have a "bad" case when trying to make improvements.

Just for completeness here is the new script:

import matplotlib
matplotlib.rcParams['axes.titlesize'] = 'small'
import matplotlib.pyplot as plt
import netCDF4
import pyart

SOND_NAME = 'sgpinterpolatedsondeC1.c1.20110520.000000.cdf'
RADAR_NAME = 'sur/20110520/105235.mdv'

# read in the data
radar = pyart.io.read_mdv(RADAR_NAME)

# find and extract sonde data
target = netCDF4.num2date(radar.time['data'][0], radar.time['units'])
interp_sounde = netCDF4.Dataset(SOND_NAME)
t = pyart.correct.find_time_in_interp_sonde(interp_sounde, target)
height, speed, direction = t

# perform dealiasing
dealias_data = pyart.correct.dealias_fourdd(radar, height * 1000.0, speed,
                                            direction, target,
                                            keep_original=True)
radar.add_field('corrected_velocity', dealias_data)

# create plots of velocity
fig = plt.figure(figsize=(12, 15))
display = pyart.graph.RadarDisplay(radar)
for i, sweep_i in enumerate([0, 1, 3, 6, 10, 14]):
    ax = fig.add_subplot(3,2,i+1)
    display.plot('velocity', sweep_i, vmin=-20, vmax=20, ax=ax,
                 colorbar_label='m/s', fig=fig)
    display.set_limits((-120, 120), (-120, 120))
plt.tight_layout()
fig.savefig('velocity.png')

# plot corrected velocity
fig = plt.figure(figsize=(12, 15))
display = pyart.graph.RadarDisplay(radar)
for i, sweep_i in enumerate([0, 1, 3, 6, 10, 14]):
    ax = fig.add_subplot(3,2,i+1)
    display.plot('corrected_velocity', sweep_i, vmin=-20, vmax=20, ax=ax,
                 colorbar_label='m/s', mask_outside=False)
    display.set_limits((-120, 120), (-120, 120))
plt.tight_layout()
fig.savefig('corrected_velocity.png')

[Image: corrected_velocity.png]

And if we expand the upper corrected velocity limit to 60 m/s:

    display.plot('corrected_velocity', sweep_i, vmin=-20, vmax=60, ax=ax,

We can see that badly unfolded regions seem to be assigned velocities in the 20-60 m/s range:

[Image: corrected_velocity.png, vmax = 60 m/s]

jjhelmus commented 10 years ago

For what it is worth, I get good dealiasing if I set the speed and direction to zero for all points except the first height used by FourDD:

...
height, speed, direction = t
speed[2:284] = 0
direction[2:284] = 0
...

I think there is something funny going on with the initial wind parameters from the sonde and how Py-ART or FourDD works with them.

scollis commented 10 years ago

Yes.. Issue may be with the sonde

kirknorth commented 10 years ago

@jjhelmus we're certainly not short on examples where the velocity unfolding behaves oddly when using the ARM merged sounding product as an initial guess.

Just for reference, here's another example, this time also showing how these artifacts end up in the gridded product.

[Images: 20110520 104000, radial and gridded corrected velocity]

scollis commented 10 years ago

I meant to bring this up on the call today.. I actually think there may be issues with the merged sounding..

kirknorth commented 10 years ago

@scollis there very well could be issues with the merged sounding product. I can probably take a look at that tomorrow. However, I also checked the original (and I mean the original!) MMCG files you gave me way back when, and they show similar artifacts. And I don't believe you were using the merged sounding product as the initial guess back then...

scollis commented 10 years ago

Yes.. Depends how original we speak..

Latest do use Merged Sonde from ARM.. As a community we need to investigate this..

jjhelmus commented 10 years ago

@kirknorth Just FYI, it would probably be good to expand your vmin/vmax limits when plotting the corrected velocities. The original Doppler limits are +/- 20 m/s, so any folded signals would fall outside this range. Since dealiasing tries to unfold these velocities, values outside these limits are expected. +/- 40 or +/- 60 seems to work well when there is a single or double fold, which seems to be the most common case.

kirknorth commented 10 years ago

@jjhelmus, in the first couple of lines of dealias.py I noticed that there may be some unexpected behaviour from the following lines:

refl_array = radar.fields[refl_field]['data'].reshape(nshape).astype('float32')
refl_array[np.where(refl_array == fill_value)] = rsl_badval
refl_array[np.where(np.isnan(refl_array))] = rsl_badval
refl_volume = _rsl_interface.create_volume(refl_array, 0)

The refl_array will be a NumPy masked array when extracted from the radar object and will stay a masked array, so there might be some unexpected behaviour within the RSL interface since create_volume expects a NumPy ndarray. Furthermore, the second and third lines don't do anything since refl_array is a masked array. As an example, I read in a C-SAPR file where 826,808 gates were set to the fill value; however, after the second and third lines no gates are set to the RSL bad value.

Would the fix be to simply use the data attribute of the masked array? Or to include the line refl_array = np.ma.filled(refl_array, fill_value)?
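
For reference, something along these lines is what I had in mind (just a sketch, reusing the names refl_field, nshape, fill_value, rsl_badval and _rsl_interface from the snippet above):

import numpy as np

# fill the masked gates with fill_value so we have a plain ndarray, then
# replace fill_value and NaN gates with the RSL bad value as intended
refl_data = radar.fields[refl_field]['data']
refl_array = np.ma.filled(refl_data, fill_value).reshape(nshape).astype('float32')
refl_array[refl_array == fill_value] = rsl_badval
refl_array[np.isnan(refl_array)] = rsl_badval
refl_volume = _rsl_interface.create_volume(refl_array, 0)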

kirknorth commented 10 years ago

Okay, so with the new (enhanced) implementation of 4DD, which attempts to use the algorithm to its full potential, that is, by making use of the previous velocity volume, here are some bullet points I have about it:

The following shows corrected velocities from 1040 UTC on May 20th, which I showed in a previous post. This time though I had keep_original = False.

[Image: 20110520 104000, corrected velocity]

kirknorth commented 10 years ago

The SW XSAPR consistently has a missing sector to its NE after dealias_fourdd is performed, whether last_radar is None or a previous volume. This is not addressed in PR #132.

[Image: xsaprppicmaci5 test]

scollis commented 10 years ago

Wonder if it is connected to the area of clutter.. I am working on some preprocessing that may help..

kirknorth commented 10 years ago

The NW XSAPR also consistently has a missing sector to its NE after dealias_fourdd is run...

[Image: 20110511 183615]

scollis commented 10 years ago

Kirk.. I noticed that.. I could play with stuff to slightly reduce it.. but always had that wedge..

I wonder if nuking some clutter in that wedge would fix it… This is an interesting, if irritating, issue..

jjhelmus commented 10 years ago

Using only the interpolated sounding data I'm not getting a missing sector from the NW XSAPR on 2011-05-11. @kirknorth Are you using a previous volume, and did it have a missing sector?

[Image: figure_1]

kirknorth commented 10 years ago

@jjhelmus, I was using both a previously dealiased volume (which has the same missing sector) and the interpolated sounding. And I still see remnants of that missing sector in your plot too :wink:

jjhelmus commented 10 years ago

I think I'll start with trying to explain this plot: the lowest sweep from the NW XSAPR (I6), using only sounding data. With the reflectivity-based preparation and the Bergen and Albers filter turned off (prep=0, filt=0), the algorithm still masks the noon to ~1 o'clock sector.

[Image: figure_1]

Code for my own records and if anyone is playing along at home..

import matplotlib
matplotlib.rcParams['axes.titlesize'] = 'small'
import matplotlib.pyplot as plt
import netCDF4
import pyart

SOND_NAME = 'sgpinterpolatedsondeC1.c1.20110511.000000.cdf'
RADAR_NAME = 'data/XNW110511181804.RAW7CJL'

# read in the data
radar = pyart.io.read_sigmet(RADAR_NAME)
radar = radar.extract_sweeps([0])

# find and extract sonde data
target = netCDF4.num2date(radar.time['data'][0], radar.time['units'])
interp_sounde = netCDF4.Dataset(SOND_NAME)
t = pyart.correct.find_time_in_interp_sonde(interp_sounde, target)
height, speed, direction = t

# perform dealiasing
dealias_data = pyart.correct.dealias_fourdd(
    radar, sounding_heights=height * 1000.0, sounding_wind_speeds=speed,
    sounding_wind_direction=direction, keep_original=False,
    prep=0, filt=0)
radar.add_field('corrected_velocity', dealias_data)

# create plots of velocity
fig = plt.figure(figsize=(10, 4))
ax1 = fig.add_subplot(121)
ax2 = fig.add_subplot(122)
display = pyart.graph.RadarDisplay(radar)
display.plot('velocity', 0, vmin=-20, vmax=20, ax=ax1,
             colorbar_label='m/s', fig=fig)
display.plot('corrected_velocity', 0, vmin=-34, vmax=34, ax=ax2,
             colorbar_label='m/s', mask_outside=False)

plt.show()
#fig.savefig('test.png')

kirknorth commented 10 years ago

Yes definitely start with the simplest case first. During your search, can you please verify and/or check these points:

  1. Does setting NODBZRMRV = 1 remove all gates where reflectivity is missing in your corrected velocity field like it's supposed to?
  2. Try values of CKVAL in the interval [1, 2]
  3. Try values of DELNUM in the interval [0, 5]
  4. Try values of THRESH in the interval [0.1, 0.4]
  5. Try values of COMPTHRESH2 in the interval [0.2, 0.49]

jjhelmus commented 10 years ago

@kirknorth Try changing the MAXRAYS constant in FourDD.h line 20 to something like 500 and then recompile _fourdd_interface.so. From the few cases I've tried, this solved the missing sector problem. It makes some sense for the XSAPRs, which have 360 or 400 rays per sweep, so with a limit of 375 we were indexing the GOOD array past its limits along the last dimension, which would lead to problems. I'll play with this some more tomorrow and see what it does for some of the other issues we've been seeing.

[Image: figure_1]
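
If anyone wants to check whether their volumes run past that limit, something like this works (a rough sketch using the standard Radar sweep index attributes):

import pyart

# count the rays in each sweep; FourDD's arrays are sized with MAXRAYS,
# so any sweep with more rays than that gets indexed out of bounds
radar = pyart.io.read_sigmet('data/XNW110511181804.RAW7CJL')
rays_per_sweep = (radar.sweep_end_ray_index['data'] -
                  radar.sweep_start_ray_index['data'] + 1)
print(rays_per_sweep)  # the XSAPRs typically report 360 to 400 rays per sweep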

kirknorth commented 10 years ago

Nice work @jjhelmus. I hadn't thought to look at the MAXRAYS constant since it is not very well documented in the header file and therefore didn't really know its purpose.

By the way, did you change any of the other constants to produce that latest plot? Comparing it to your previous plot (18:36 UTC), I see that not only is the NE sector filled in, but so are all of the other missing areas. This is promising!

kirknorth commented 10 years ago

I've checked both the SW and NW X-bands and I'm quite certain that increasing MAXRAYS does get rid of our missing sector issues for both these radars. So we can tick that off the list. For the record I changed its original value of 375 to 500.

That being said, it does not help with the spurious velocities issue like the one showcased in the first post of this issue, as shown below. I have seen these artifacts appear when using only the interpolated sounding and when using both a previous volume and the interpolated sounding.

[Image: test]

jjhelmus commented 10 years ago

@kirknorth

The answer to question 1 about the NODBZRMRV parameter:

As the algorithm exists now, NODBZRMRV controls whether velocity gates where the corresponding reflectivity is less than -50.0 dBZ or greater than HIGHDBZ are marked as the missing value. When NODBZRMRV = 1 they are marked, otherwise they are left as is. This only applies when prep is 1. Regardless of the value of NODBZRMRV, gates with reflectivities greater than or equal to -50.0 dBZ and less than LOWDBZ are always marked as missing (when prep is 1).
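
In Python terms the current prep behaviour amounts to something like this (a sketch only, not the actual C code; refl and vel are NumPy arrays and missing, LOWDBZ and HIGHDBZ stand in for the corresponding constants):

def prep_velocity(refl, vel, missing, LOWDBZ, HIGHDBZ, NODBZRMRV):
    # sketch of the prep = 1 behaviour described above
    vel = vel.copy()
    # always removed: gates with weak but non-missing reflectivity
    vel[(refl >= -50.0) & (refl < LOWDBZ)] = missing
    if NODBZRMRV == 1:
        # also removed: "missing" reflectivity (< -50 dBZ) and very strong echoes
        vel[(refl < -50.0) | (refl > HIGHDBZ)] = missing
    return vel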

Obviously this is not what should be happening during volume prep. Once I'm done with my refactor I'll be changing the code so that gates outside a pre-defined low/high reflectivity range are marked, with a separate flag to allow marking gates where the reflectivity is missing. I don't want to change it yet, as the regression tests I'm using for the refactor would fail.

Also, I'm planning on not including the DELNUM parameter, at least in the C code. Removing the first few gates is easier to do in Python, either before calling the dealiasing routine or as an option in the Python function.

kirknorth commented 10 years ago

Okay, this makes some sense: when I set NODBZRMRV = 1 it does not remove gates that I've flagged as bad, i.e. with _FillValue (which then get set to the RSL bad value). Instead, it only appeared to remove gates with low reflectivity values.

Now, removing noisy gates is vital since the solution 4DD finds can become unstable and produce velocity values 3-4x the Nyquist within these regions, and this can infiltrate cloud boundaries. Here's an example of that problem manifesting itself in gridded data:

[Image: 20110425 080520, gridded corrected velocity]

A good despeckling algorithm (like that in PR #133) can remove the isolated pixels near the radar, so I'm not worried about that. However, near the cloud boundary at (x, y) = (20, -10) in the z = 0 and 1.5 km panels you can see an obviously poor correction. This is a direct result of the surrounding noisy region not being flagged as bad and its solution infiltrating the cloud edge during 4DD. This is not a result of the radius of influence during gridding capturing bad pixels within the noise, as the gates in the radial data show this as well.

jjhelmus commented 10 years ago

One option for filtering out bad gates until this is fixed in the algorithm is to do the masking before calling the dealias function. Something like

radar.fields['velocity']['data'][radar.fields['reflectivity']['data'].mask] = np.ma.masked

would mask velocities at "bad" reflectivity gates. You do lose the velocity data there, but if it is needed you can make a copy before masking.
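
Spelled out a bit more, with the copy included (a sketch, assuming radar has already been read in and the reflectivity field carries a mask; the velocity_unmasked field name is just for illustration):

import copy
import numpy as np

# keep an untouched copy of the velocity field in case it is needed later
radar.add_field('velocity_unmasked',
                copy.deepcopy(radar.fields['velocity']),
                replace_existing=True)

# mask velocity gates wherever the corresponding reflectivity gate is masked
bad = np.ma.getmaskarray(radar.fields['reflectivity']['data'])
radar.fields['velocity']['data'] = np.ma.masked_where(
    bad, radar.fields['velocity']['data'])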

kirknorth commented 10 years ago

Hmm, does that even work? I first removed the noise from the reflectivity field and then did

is_bad = radar.fields['corrected_reflectivity']['data'].mask
radar.fields['velocity']['data'].mask = is_bad

and this was the result after 4DD with sounding ICs only...

[Image: test]

You can see the original velocity field has the proper mask applied, but the corrected field in fact still retained the original velocity data, which 4DD then tried to correct. This leads me to believe we're not handling missing values properly.

scollis commented 10 years ago

BTW: to all who are following this thread.. now is the time to send @jjhelmus your bad velocity data for testing.

nguy commented 10 years ago

@kirknorth I'm just looking at this conversation, but I've experienced some bad masking of data points. Not sure if this applies in this case, but I had a failure trying to apply the mask from one field to another as you showed in the code an hour ago. I had to use something like this to make it work: radar.fields['velocity']['data'] = np.ma.masked_where(is_bad, radar.fields['velocity']['data'])

jjhelmus commented 10 years ago

I wonder if this is caused by the FourDD code using an equality test for the floating point missingValue. With roundoff and numerical precision issues, using == with floats is a bad idea.
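
For example (just an illustration with made up numbers, using rsl_badval as the missing value):

import numpy as np

rsl_badval = 131072.0
data = np.array([131072.0, 131071.9, -3.2])

# exact equality misses values that picked up any roundoff error
print(data == rsl_badval)            # [ True False False]

# a tolerant comparison catches them
print(np.isclose(data, rsl_badval))  # [ True  True False]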

nguy commented 10 years ago

There is a note in the NumPy masked array documentation that says a floating-point equality should be tested using np.ma.masked_values (if I remember correctly).

kirknorth commented 10 years ago

@jjhelmus yes, one thought of mine was that there was a loss of precision between the C source code and the floating-point rsl_badval = 131072.0.

jjhelmus commented 10 years ago

@kirknorth You might want to have a play with my fourdd_enh branch. It is the result of refactoring the FourDD code, fixes the reflectivity filtering issue, and allows you to set a bunch of dealiasing parameters at run time to see the effects of changing them. It doesn't seem to solve the high velocities seen in the original May 20, 2011 case, but you might be able to improve some cases.

jjhelmus commented 10 years ago

An IPython notebook (http://nbviewer.ipython.org/github/jjhelmus/dealias_notebooks/blob/master/FourDD_fail_May_20_2011.ipynb) looks at why FourDD fails to dealias the May 20, 2011 CSAPR volume we have been looking at. Basically, the algorithm is too greedy in finding gates which agree with the sounding data: if the gates agree with the sounding they are taken to be corrected, even if this later turns out to be a bad choice.

scollis commented 10 years ago

Great stuff @jjhelmus. Tricky to think how to progress.. Could we identify high gradients, mark contiguous regions, and use that as a first guess for a second pass?

kirknorth commented 10 years ago

Yes @scollis, either as a second pass or as initial conditions. And great stuff @jjhelmus. I have already fetched and checked out your fourdd_enh branch, and I will run some experiments.

jjhelmus commented 10 years ago

@kirknorth I've started to work on a Doppler dealiasing routine which uses multi-dimensional phase unwrapping for the initial (and currently only) unfolding. The technique looks to have promise, code is available in the dealias_phase_unwrap branch.

Using unwrap_unit='sweep' seems to give the best results. Also note that the algorithm should respect any masking applied to the velocity field, so masking out gates with low reflectivity or low normalized coherent power seems to clean up some clutter issues.

Example of dealiasing the top most sweep from a CSAPR volume that gave FourDD issues:

import pyart
import numpy as np

radar = pyart.io.read('105235.mdv')
corr_vel = pyart.correct.dealias_unwrap_phase(radar, unwrap_unit='sweep')
radar.add_field('corrected_velocity', corr_vel)

import matplotlib.pyplot as plt
display = pyart.graph.RadarDisplay(radar)
display.plot_ppi('corrected_velocity', 16, vmin=-40, vmax=40)
display.set_limits((-20, 20), (-20, 20))
plt.show()

[Image: figure_1]

kirknorth commented 10 years ago

@jjhelmus awesome work, I'll fetch your testing branch and put it through the wringer since I've got a lot of SAPR volumes that 4DD struggled with. The May 24th case was quite a difficult case to dealias, e.g.,

[Image: 20110524 222729]

nguy commented 10 years ago

Hi Kirk,

Is this data from the MC3E project? I’ll be beginning some work in Sep with data from that project. Good to know the people familiar with data quirks already! Nick

nguy commented 10 years ago

Oops, ignore, that was for Kirk. Sorry, Nick

kirknorth commented 10 years ago

@nguy yes sir! MC3E ran from ~ April 20th to ~ June 6th, 2011.

nguy commented 10 years ago

@kirknorth That's great! I might be in contact with you in the future depending on which cases I use. Are you just looking at the SAPR?

kirknorth commented 10 years ago

Yes, for now my focus is on the SAPR data, using it for 3D-VAR wind retrievals.

kirknorth commented 10 years ago

@jjhelmus, I tested your new branch on a series of CSAPR data from May 24th, 2011 covering ~ 4 hours. I have three comments so far.

  1. From the testing I did, the new algorithm does not introduce spurious velocities. By spurious I mean like the previous artifacts we were seeing in 4DD, defined by discontinuities in the data that were obviously artifacts (e.g., see first comment post). So :+1:
  2. In its current form, the algorithm suffers from having no first-guess velocity field (e.g., sounding data). Please see this example for reference. Note how from one volume to the next the solution, while physically reasonable for each tilt, is quite discontinuous (e.g., at the 6.4 degree elevation angle).
  3. I attempted to mask bad pixels first before calling dealias_phase_unwrap, but my mask wasn't being recognized. You can see this in the above link, where masked data appeared to be taking on multiple values too. However, I'm not particularly worried about this point, as I may very well be making a trivial mistake. The code I was using was of the form radar.fields['velocity']['data'][not_coherent] = np.ma.masked.

jjhelmus commented 10 years ago

I agree with point 1, and I think the masking can be figured out. I didn't spend much time working out all of the masking, but I will in a more formalized version.

I think a lot of point 2 comes from the fact that the algorithm defaults to unwrapping a single sweep at a time. You can override this using unwrap_unit='volume', but I would not recommend it at the moment. Sweep-by-sweep dealiasing seems to be prone to introducing discontinuities between adjacent sweeps.

Taking into account the 3D structure (or even the 4D nature, if possible) should improve this greatly. There is a 3D unwrapping implementation in the branch, but it seems to have some bugs with specifying which axes have periodic boundaries (it causes seg faults on my machine). In addition, the implementation assumes a regular rectangular spacing, which isn't correct for radar volumes. I have some ideas on how to address this. Nevertheless, the technique looks promising.

jjhelmus commented 9 years ago

Two dealiasing routines have been added to Py-ART since the last update to this issue: dealias_unwrap_phase (http://arm-doe.github.io/pyart-docs-travis/user_reference/generated/pyart.correct.dealias_unwrap_phase.html) and dealias_region_based (http://arm-doe.github.io/pyart-docs-travis/user_reference/generated/pyart.correct.dealias_region_based.html). In testing, both of these seem to give more acceptable unfolded velocities than the existing FourDD algorithm. Neither uses a sounding or other external data for a first guess at the velocity field, which may result in the entire volume being under- or over-folded. Adding a method to compare an unfolded volume against a vertical wind profile and then applying a global number of folds to minimize the difference should connect the results of these algorithms with the reality of the atmosphere. With this addition I think this issue can be closed.
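
As a sketch of what I mean by applying a global number of folds (nothing implemented yet; v_dealiased is the unfolded field, v_reference an estimate of the radial velocity from the sounding or wind profile projected onto the radar geometry, and nyquist the Nyquist velocity in m/s):

import numpy as np

def apply_global_fold(v_dealiased, v_reference, nyquist):
    # mean offset between the unfolded field and the reference, rounded to a
    # whole number of Nyquist intervals (2 * Vn), applied to the entire volume
    diff = np.ma.mean(v_reference - v_dealiased)
    nfolds = int(np.round(diff / (2.0 * nyquist)))
    return v_dealiased + nfolds * 2.0 * nyquist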

scollis commented 9 years ago

Agreed.. We can tie the original issue to issues with 4DD which we discovered had some fundamental flaws.. If there are new issues raised they can be targeted against one of the two algorithms..

In addition, I encourage all to actively test these and let us know about performance issues. A (speed) optimization effort on the segmentation method is planned down the track..

jjhelmus commented 9 years ago

Closing this issue, feel free to reopen if further discussion is needed.