TomLav opened this issue 7 years ago
Hi @TomLav ,
Yes, I have run into this one too. My understanding is (iirc) that the reduce_data procedure takes the edges of the swath's coordinates and computes the min and max lons and lats from there. The problem appears when the area contains a pole, because then the min or max do not correspond to reality, and significant parts of the swath are removed. My idea was to add a check for pole inclusion in the area, in which case the min and max lons and lats would be computed from the entire coordinate arrays instead of just the edges. But I never got around to doing it.
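A minimal sketch of that pole-inclusion idea (a hypothetical helper, not actual pyresample code — `swath_lonlat_bounds` and its `covers_pole` flag are invented for illustration):

```python
import numpy as np

def swath_lonlat_bounds(lons, lats, covers_pole=False):
    """Hypothetical sketch: lon/lat bounds for data reduction.

    Edge-only bounds underestimate the extent when a pole lies inside
    the swath, so fall back to the full coordinate arrays in that case.
    """
    if covers_pole:
        # Pole inside the swath: edges are not enough, scan everything
        return lons.min(), lons.max(), lats.min(), lats.max()
    # Normal case: only the four edges of the (scanline, scanpos) arrays
    edge_lons = np.concatenate([lons[0], lons[-1], lons[:, 0], lons[:, -1]])
    edge_lats = np.concatenate([lats[0], lats[-1], lats[:, 0], lats[:, -1]])
    return edge_lons.min(), edge_lons.max(), edge_lats.min(), edge_lats.max()
```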
I am somewhat doubtful that the edges/corners of the swath are the issue. As far as I know, the shape of the swath array does not matter to pyresample, and I think I see the same behaviour whether I use the (scanline, scanpos) shape or after reshape(-1).
I am looking into this tonight, hopefully.
It is late (and early!). It seems data_reduce works better with two small edits. Will have to confirm tomorrow, particularly that this does not break the tests.
<pc2981|pyresample> =:D git diff data_reduce.py
diff --git a/pyresample/data_reduce.py b/pyresample/data_reduce.py
index 76e488a..a288739 100644
--- a/pyresample/data_reduce.py
+++ b/pyresample/data_reduce.py
@@ -293,10 +293,10 @@ def _get_valid_index(lons_side1, lons_side2, lons_side3, lons_side4,
#-360: area covers south pole
# 0: area covers no poles
# else: area covers both poles
- if round(angle_sum) == 360:
+ if round(angle_sum) == -360:
# Covers NP
valid_index = (lats >= lat_min_buffered)
- elif round(angle_sum) == -360:
+ elif round(angle_sum) == 360:
# Covers SP
valid_index = (lats <= lat_max_buffered)
elif round(angle_sum) == 0:
I edited this recently, I think doing the opposite of what you did. So I think we cannot actually be sure which pole is which: it depends on the data. Could we use the absolute value instead, and then look at which side the lats are leaning towards?
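That suggestion could look roughly like this (a sketch with a hypothetical helper name, not pyresample code — it replaces only the sign-dependent branches):

```python
import numpy as np

def pole_valid_index(angle_sum, lats, lat_min_buffered, lat_max_buffered):
    """Sketch: treat |angle_sum| == 360 as "exactly one pole covered" and
    decide which pole from the latitudes themselves, instead of from the
    winding direction, which depends on how the boundary is traversed."""
    if round(abs(angle_sum)) == 360:
        if np.mean(lats) > 0:
            # lats lean north: the area covers the north pole
            return lats >= lat_min_buffered
        # lats lean south: the area covers the south pole
        return lats <= lat_max_buffered
    # 0 would mean no pole (the normal lon/lat box reduction applies);
    # anything else means both poles, so nothing can be reduced.
    return np.ones(lats.shape, dtype=bool)
```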
Well... that would explain why I felt 1.5.0 was doing worse than before ;) I'll try to reproduce the two behaviors with more synthetic test data.
Indeed:
git show 1c9ac493aea549a354f384059e9aa6ad41558fd8
commit 1c9ac493aea549a354f384059e9aa6ad41558fd8
Author: Martin Raspaud <martin.raspaud@smhi.se>
Date: Thu Mar 16 16:50:57 2017 +0100
Fix data reduction when poles are within area
diff --git a/pyresample/data_reduce.py b/pyresample/data_reduce.py
index 7fda9fa..45465c3 100644
--- a/pyresample/data_reduce.py
+++ b/pyresample/data_reduce.py
@@ -290,14 +290,14 @@ def _get_valid_index(lons_side1, lons_side2, lons_side3, lons_side4,
# From the winding number theorem follows:
# angle_sum possiblilities:
- #-360: area covers north pole
- # 360: area covers south pole
+ # 360: area covers north pole
+ #-360: area covers south pole
# 0: area covers no poles
# else: area covers both poles
- if round(angle_sum) == -360:
+ if round(angle_sum) == 360:
# Covers NP
valid_index = (lats >= lat_min_buffered)
- elif round(angle_sum) == 360:
+ elif round(angle_sum) == -360:
# Covers SP
valid_index = (lats <= lat_max_buffered)
elif round(angle_sum) == 0:
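For context, the angle_sum in these diffs comes from a winding-number test over the area boundary. A minimal reconstruction of such a test (my own sketch, not pyresample's exact code) shows why the sign is fragile:

```python
import numpy as np

def boundary_angle_sum(boundary_lons):
    """Walk the closed boundary of the area and accumulate longitude
    steps, each wrapped into (-180, 180]. A total of +/-360 means the
    boundary winds once around a pole; 0 means no pole is enclosed.
    The sign only reflects the traversal direction of the boundary,
    which is why flipping it "fixes" one dataset and breaks another."""
    lons = np.asarray(boundary_lons, dtype=float)
    steps = np.diff(np.append(lons, lons[0]))    # close the loop
    steps = (steps + 180.0) % 360.0 - 180.0      # wrap each step
    return steps.sum()
```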
So now we should find test data that supports your edit as well. Do you remember what type of swath it was?
A short status update on my investigations:
I believe #98 fixes this, can you confirm @TomLav ?
There are two issues with data_reduce:
one when the pole is in the area, and I believe it is fixed by #98.
one when the pole is not in the area, but the area contains a segment of the dateline (-180). This one is not fixed (and is actually a legacy bug since the very beginning).
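The second (dateline) problem can be illustrated with made-up numbers: an area whose boundary longitudes straddle +/-180 defeats a naive min/max reduction. The wrapping fix at the end is only one possible approach, not what pyresample does:

```python
import numpy as np

# Area boundary longitudes straddling the dateline
area_lons = np.array([175.0, 178.0, -179.0, -176.0])
lon_min, lon_max = area_lons.min(), area_lons.max()  # [-179, 178]: near-global box

swath_lons = np.array([179.5, -178.0, 0.0])
naive_keep = (swath_lons >= lon_min) & (swath_lons <= lon_max)
# 179.5 deg, which lies inside the area, is wrongly dropped,
# while 0 deg, far from the area, is kept.

# One possible fix: wrap all longitudes into a frame centred on the
# area, so that its longitude interval becomes contiguous.
center = 178.0  # hypothetical area centre longitude
def wrap(lons):
    return (lons - center + 180.0) % 360.0 - 180.0

w_min, w_max = wrap(area_lons).min(), wrap(area_lons).max()
keep = (wrap(swath_lons) >= w_min) & (wrap(swath_lons) <= w_max)
# now 179.5 deg is kept and 0 deg is dropped
```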
ok, good to know, thanks for the update. I'm leaving this open then.
Hello,
This is a bug I have been chasing for a while, but I have the impression it is somehow worse now (1.5.0) than before. It happens when using kdtree.resample* with reduce_data=True (which is the default) for polar-orbiting satellites on polar grids. At least the polar grids I am using for my work.
I prepared a script to generate test data and compare the remapping with and without reduce_data. It results in the image below.
To me, the reduce_data feature should only speed up the resampling, not change the result in any way. So this is a serious flaw.
I hope this can rather easily be made into a unit test for reduce_data, since it is self-contained (no external data files need to be loaded).
Where should we go from here? The GitHub tool will not let me attach a .py file. I can send it by mail; I tried to paste it here but it does not look good.