The 20m RTC golden test fails, as shown here: https://github.com/ASFHyP3/hyp3-testing/actions/runs/4896865581/jobs/8744197447
Here are the two jobs from that test:
The differences are very small:
E Values are different.
E
E Dataset variable: band_data
E 146/164,799,882 (0.00%) values are different.
E Reference - secondary:
E max 0.0021422524005174637; min -0.00478113628923893; mean -3.4277951040211146e-11;
E std 5.75270218006575e-07; var 3.3093582372533226e-13
And the browse images look identical:
Checking the two vv.tif files reveals that the maximum absolute difference between the two vv rasters is 0.020935535, at pixel [9131, 5576].
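That kind of pixel-level check can be sketched with NumPy as follows (loading the GeoTIFFs into arrays, e.g. via rasterio, is assumed and elided; the function name is illustrative, not part of hyp3-testing):

```python
import numpy as np

def max_pixel_difference(reference: np.ndarray, secondary: np.ndarray):
    """Return the largest absolute difference and the (row, col) pixel where it occurs."""
    diff = np.abs(reference - secondary)
    row, col = np.unravel_index(np.argmax(diff), diff.shape)
    return float(diff[row, col]), (int(row), int(col))

# Tiny synthetic stand-in for the two vv.tif rasters:
ref = np.zeros((3, 3))
sec = np.zeros((3, 3))
sec[2, 1] = 0.02  # one pixel that differs, like the 0.020935535 outlier above
value, pixel = max_pixel_difference(ref, sec)
print(value, pixel)  # largest difference and its pixel location
```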
RTC tolerances are set here: https://github.com/ASFHyP3/hyp3-testing/blob/develop/tests/conftest.py#L92-L105
For reference, InSAR tolerances are set here: https://github.com/ASFHyP3/hyp3-testing/blob/develop/hyp3_testing/templates/insar_gamma_golden.json.j2#L18C1-L59
For InSAR, we also specify the number of pixels that are allowed to fail, whereas we don't do that for RTC.
We should consider specifying the number of pixels that are allowed to fail for RTC, and/or changing the other tolerance settings so that the 20m RTC golden test passes.
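A minimal sketch of what an "allowed number of failing pixels" check could look like, assuming np.isclose-style rtol/atol semantics (the parameter names here are illustrative, not the actual hyp3-testing settings):

```python
import numpy as np

def within_tolerance(reference, secondary, rtol=1e-05, atol=0.0, max_failed_pixels=0):
    """Pass if at most max_failed_pixels exceed the tolerance.

    A pixel fails when |reference - secondary| > atol + rtol * |secondary|,
    mirroring np.isclose semantics.
    """
    failed = ~np.isclose(reference, secondary, rtol=rtol, atol=atol)
    return int(failed.sum()) <= max_failed_pixels

ref = np.zeros((4, 4))
sec = np.zeros((4, 4))
sec[0, 0] = 0.02  # a single outlier pixel, like the one seen in the vv.tif comparison

strict = within_tolerance(ref, sec)                        # fails: 1 pixel exceeds tolerance
lenient = within_tolerance(ref, sec, max_failed_pixels=1)  # passes: 1 failure is allowed
```

With 146 of 164,799,882 pixels differing, either a small allowed-failure count or a slightly looser atol would let the 20m RTC golden test pass.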